ruby_llm-mcp 0.5.1 → 0.6.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
-   metadata.gz: 871fadee0f0166df56c6e8ef4ce2a6598c7875af9ca05631687cab1966ccb4e9
-   data.tar.gz: 6cbefa984fd7b541005be0f3f05c3970ab511c90afec9d135212dca33127d992
+   metadata.gz: dd3795fdab2bd6822d9f11c5b747f291b91d46148fb1f6e9bbb3644b6c7e98a4
+   data.tar.gz: 75c7105d9c276c74c6eff97d497da5600af21e66add76e01c8c9a5c79e5f4477
  SHA512:
-   metadata.gz: 43b1fcb2d2fd8d6d1ebf7b8a43266443e72be5551ae9866a565bd38f7a90b98832294aadadb562d58a82dae24ff72251fe9eda91aab06a75df061413885c0560
-   data.tar.gz: 9a72ae094803de509b3ec447a43899a001ebd857d7f1085cb31f36a923d189324f7766ed37249866f0fc38e82fb962c11b00737f6821a7e6d723c6a30aa537aa
+   metadata.gz: 57ba084ac16260e889e84a028ab10366093478996f5d6218e2b794c96b927b1e5dc2164711a634b7954d3886eb938668119651a0498b29efde6418c9777e29c3
+   data.tar.gz: ca54b9f680884e3f6e6c7e2251f6416da0c28346763de35d59ba90c56e3b6d1ba19ce76f170c11554353dae81087275b71d0a4a319517107a2b41fdaad65b3af
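The checksum hunk above is simply RubyGems recording new SHA256/SHA512 digests for the rebuilt `metadata.gz` and `data.tar.gz` archives. As a sketch, the digests stored in `checksums.yaml` can be reproduced with Ruby's stdlib `Digest` module (the `data` string below is a stand-in for reading the real archive bytes):

```ruby
require "digest"

# Compute the two digests RubyGems records in checksums.yaml for each
# packaged file. `data` stands in for File.binread("data.tar.gz").
data = "example bytes"
sha256 = Digest::SHA256.hexdigest(data)
sha512 = Digest::SHA512.hexdigest(data)

puts "SHA256: #{sha256}" # 64 hex characters
puts "SHA512: #{sha512}" # 128 hex characters
```

Comparing these against the values shipped in the gem is how a registry (or a cautious user) verifies the package contents were not altered.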
data/README.md CHANGED
@@ -1,18 +1,25 @@
- # RubyLLM::MCP
+ <img src="/docs/assets/images/rubyllm-mcp-logo-text.svg" alt="RubyLLM" height="120" width="250">

- Aiming to make using MCPs with RubyLLM as easy as possible.
+ **Aiming to make using MCPs with RubyLLM and Ruby as easy as possible.**

  This project is a Ruby client for the [Model Context Protocol (MCP)](https://modelcontextprotocol.io/), designed to work seamlessly with [RubyLLM](https://github.com/crmne/ruby_llm). This gem enables Ruby applications to connect to MCP servers and use their tools, resources and prompts as part of LLM conversations.

- **Note:** This project is still under development and the API is subject to change.
+ For a more detailed guide, see the [RubyLLM::MCP docs](https://rubyllm-mcp.com/).

- ## Features
+ Currently offers full support for MCP protocol versions up to `2025-06-18`.
+
+ <div class="badge-container">
+   <a href="https://badge.fury.io/rb/ruby_llm-mcp"><img src="https://badge.fury.io/rb/ruby_llm-mcp.svg" alt="Gem Version" /></a>
+   <a href="https://rubygems.org/gems/ruby_llm-mcp"><img alt="Gem Downloads" src="https://img.shields.io/gem/dt/ruby_llm-mcp"></a>
+ </div>
+
+ ## RubyLLM::MCP Features

  - šŸ”Œ **Multiple Transport Types**: Streamable HTTP, STDIO, and legacy SSE transports
  - šŸ› ļø **Tool Integration**: Automatically converts MCP tools into RubyLLM-compatible tools
  - šŸ“„ **Resource Management**: Access and include MCP resources (files, data) and resource templates in conversations
  - šŸŽÆ **Prompt Integration**: Use predefined MCP prompts with arguments for consistent interactions
- - šŸŽ›ļø **Client Features**: Support for sampling and roots
+ - šŸŽ›ļø **Client Features**: Support for sampling, roots, and elicitation
  - šŸŽØ **Enhanced Chat Interface**: Extended RubyLLM chat methods for seamless MCP integration
  - šŸ”„ **Multiple Client Management**: Create and manage multiple MCP clients simultaneously for different servers and purposes
  - šŸ“š **Simple API**: Easy-to-use interface that integrates seamlessly with RubyLLM
@@ -105,62 +112,16 @@ response = chat.ask("Can you help me search for recent files in my project?")
  puts response
  ```

- ### Human in the Loop
-
- You can use the `on_human_in_the_loop` callback to allow the human to intervene in the tool call. This is useful for tools that require human input or programic input to verify if the tool should be executed.
-
- For tool calls that have access to do important operations, there SHOULD always be a human in the loop with the ability to deny tool invocations.
-
- ```ruby
- client.on_human_in_the_loop do |name, params|
-   name == "add" && params[:a] == 1 && params[:b] == 2
- end
-
- tool = client.tool("add")
- result = tool.execute(a: 1, b: 2)
- puts result # 3
-
- # If the human in the loop returns false, the tool call will be cancelled
- result = tool.execute(a: 2, b: 2)
- puts result # Tool execution error: Tool call was cancelled by the client
-
- tool = client.tool("add")
- result = tool.execute(a: 1, b: 2)
- puts result
- ```
-
- ### Support Complex Parameters
-
- If you want to support complex parameters, like an array of objects it currently requires a patch to RubyLLM itself. This is planned to be temporary until the RubyLLM is updated.
-
- ```ruby
- RubyLLM::MCP.support_complex_parameters!
- ```
-
- ### Streaming Responses with Tool Calls
-
- ```ruby
- chat = RubyLLM.chat(model: "gpt-4")
- chat.with_tools(*client.tools)
-
- chat.ask("Analyze my project structure") do |chunk|
-   if chunk.tool_call?
-     chunk.tool_calls.each do |key, tool_call|
-       puts "\nšŸ”§ Using tool: #{tool_call.name}"
-     end
-   else
-     print chunk.content
-   end
- end
- ```
-
  ### Manual Tool Execution

  You can also execute MCP tools directly:

  ```ruby
+ # Tools Execution
+ tool = client.tool("search_files")
+
  # Execute a specific tool
- result = client.execute_tool(
+ result = tool.execute(
    name: "search_files",
    parameters: {
      query: "*.rb",
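The hunk above replaces the 0.5.x `client.execute_tool(...)` call with fetching a tool object first and calling `execute` on it. A runnable stand-in illustrating the shape of that change (these stub classes only mimic the call shapes shown in the diff; the real classes live in the ruby_llm-mcp gem):

```ruby
# Stub classes that mimic only the call shapes shown in the diff above;
# not the gem's actual implementation.
class StubTool
  def initialize(&handler)
    @handler = handler
  end

  # 0.6.0 style: #execute is called on the tool object itself
  def execute(name:, parameters: {})
    @handler.call(name: name, parameters: parameters)
  end
end

class StubClient
  def initialize
    # Pretend "search_files" simply echoes back the query it was given
    @tools = { "search_files" => StubTool.new { |name:, parameters:| parameters[:query] } }
  end

  def tool(name)
    @tools.fetch(name)
  end
end

client = StubClient.new
tool = client.tool("search_files")           # 0.6.0: look the tool up once...
result = tool.execute(name: "search_files",  # ...then execute on the tool object
                      parameters: { query: "*.rb" })
puts result # => *.rb
```

The design shift is from a client-level RPC helper to a first-class tool object, which also makes per-tool refresh and inspection more natural.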
@@ -256,594 +217,16 @@ chat.with_prompt(greeting_prompt, arguments: { name: "Alice", time: "morning" })
  response = chat.ask("Continue with the greeting")
  ```

- ### Combining Resources, Prompts, and Tools
-
- You can combine all MCP features for powerful conversations:
-
- ```ruby
- client = RubyLLM::MCP.client(
-   name: "development-assistant",
-   transport_type: :sse,
-   config: { url: "http://localhost:9292/mcp/sse" }
- )
-
- chat = RubyLLM.chat(model: "gpt-4")
-
- # Add tools for capabilities
- chat.with_tools(*client.tools)
-
- # Add resources for context
- chat.with_resource(client.resource("project_structure"))
- chat.with_resource(
-   client.resource_template("recent_commits"),
-   arguments: { days: 7 }
- )
-
- # Add prompts for guidance
- chat.with_prompt(
-   client.prompt("code_review_checklist"),
-   arguments: { focus: "security" }
- )
-
- # Now ask for analysis
- response = chat.ask("Please review the recent commits using the checklist and suggest improvements")
- puts response
- ```
-
- ### Argument Completion
-
- Some MCP servers support argument completion for prompts and resource templates:
-
- ```ruby
- # For prompts
- prompt = client.prompt("user_search")
- suggestions = prompt.complete("username", "jo")
- puts "Suggestions: #{suggestions.values}" # ["john", "joanna", "joseph"]
-
- # For resource templates
- template = client.resource_template("user_logs")
- suggestions = template.complete("user_id", "123")
- puts "Total matches: #{suggestions.total}"
- puts "Has more results: #{suggestions.has_more}"
- ```
-
- ### Pagination
-
- MCP servers can support pagination for their lists. The client will automatically paginate the lists to include all items from the list you wanted to pull.
-
- Pagination is supported for tools, resources, prompts, and resource templates.
-
- ### Additional Chat Methods
-
- The gem extends RubyLLM's chat interface with convenient methods for MCP integration:
-
- ```ruby
- chat = RubyLLM.chat(model: "gpt-4")
-
- # Add a single resource
- chat.with_resource(resource)
-
- # Add multiple resources
- chat.with_resources(resource1, resource2, resource3)
-
- # Add a resource template with arguments
- chat.with_resource_template(resource_template, arguments: { key: "value" })
-
- # Add a prompt with arguments
- chat.with_prompt(prompt, arguments: { name: "Alice" })
-
- # Ask using a prompt directly
- response = chat.ask_prompt(prompt, arguments: { name: "Alice" })
- ```
-
- ## Rails Integration
-
- RubyLLM MCP provides seamless Rails integration through a Railtie and generator system.
  ## Development

- ### Setup
+ After checking out the repo, run `bundle` to install dependencies. Then, run `bundle exec rake` to run the tests. Tests currently use `bun` to run test MCP servers. You can also run `bin/console` for an interactive prompt that will allow you to experiment.

- Generate the configuration files:
+ There are also examples you can run to verify the gem is working as expected.

  ```bash
- rails generate ruby_llm:mcp:install
- ```
-
- This creates:
-
- - `config/initializers/ruby_llm_mcp.rb` - Main configuration
- - `config/mcps.yml` - MCP servers configuration
-
- ### MCP Server Configuration
-
- Configure your MCP servers in `config/mcps.yml`:
-
- ```yaml
- mcp_servers:
-   filesystem:
-     transport_type: stdio
-     command: npx
-     args:
-       - "@modelcontextprotocol/server-filesystem"
-       - "<%= Rails.root %>"
-     env: {}
-     with_prefix: true
-
-   api_server:
-     transport_type: sse
-     url: "https://api.example.com/mcp/sse"
-     headers:
-       Authorization: "Bearer <%= ENV['API_TOKEN'] %>"
- ```
-
- ### Automatic Client Management
-
- With `launch_control: :automatic`, Rails will:
-
- - Start all configured MCP clients when the application initializes
- - Gracefully shut down clients when the application exits
- - Handle client lifecycle automatically
-
- However, it's very command to due to the performace of LLM calls that are made in the background.
-
- For this, we recommend using `launch_control: :manual` and use `establish_connection` method to manage the client lifecycle manually inside your background jobs. It will provide you active connections to the MCP servers, and take care of closing them when the job is done.
-
- ```ruby
- RubyLLM::MCP.establish_connection do |clients|
-   chat = RubyLLM.chat(model: "gpt-4")
-   chat.with_tools(*clients.tools)
-
-   response = chat.ask("Hello, world!")
-   puts response
- end
- ```
-
- You can also avoid this completely manually start and stop the clients if you so choose.
-
- If you want to use the clients outside of the block, you can use the `clients` method to get the clients.
-
- ```ruby
- clients = RubyLLM::MCP.establish_connection
- chat = RubyLLM.chat(model: "gpt-4")
- chat.with_tools(*clients.tools)
-
- response = chat.ask("Hello, world!")
- puts response
- ```
-
- However, you will be responsible for closing the connection when you are done with it.
-
- ```ruby
- RubyLLM::MCP.close_connection
- ```
-
- ## Client Lifecycle Management
-
- You can manage the MCP client connection lifecycle:
-
- ```ruby
- client = RubyLLM::MCP.client(name: "my-server", transport_type: :stdio, start: false, config: {...})
-
- # Manually start the connection
- client.start
-
- # Check if connection is alive
- puts client.alive?
-
- # Restart the connection
- client.restart!
-
- # Stop the connection
- client.stop
- ```
-
- ### Ping
-
- You can ping the MCP server to check if it is alive:
-
- ```ruby
- client.ping # => true or false
- ```
-
- ## Refreshing Cached Data
-
- The client caches tools, resources, prompts, and resource templates list calls are cached to reduce round trips back to the MCP server. You can refresh this cache:
-
- ```ruby
- # Refresh all cached tools
- tools = client.tools(refresh: true)
-
- # Refresh a specific tool
- tool = client.tool("search_files", refresh: true)
-
- # Same pattern works for resources, prompts, and resource templates
- resources = client.resources(refresh: true)
- prompts = client.prompts(refresh: true)
- templates = client.resource_templates(refresh: true)
-
- # Or refresh specific items
- resource = client.resource("project_readme", refresh: true)
- prompt = client.prompt("daily_greeting", refresh: true)
- template = client.resource_template("user_logs", refresh: true)
- ```
-
- ## Notifications
-
- MCPs can produce notifications that happen in an async nature outside normal calls to the MCP server.
-
- ### Subscribing to a Resource Update
-
- By default, the client will look for any resource cha to resource updates and refresh the resource content when it changes.
-
- ### Logging Notifications
-
- MCPs can produce logging notifications for long-running tool operations. Logging notifications allow tools to send real-time updates about their execution status.
-
- ```ruby
- client.on_logging do |logging|
-   puts "Logging: #{logging.level} - #{logging.message}"
- end
-
- # Execute a tool that supports logging notifications
- tool = client.tool("long_running_operation")
- result = tool.execute(operation: "data_processing")
-
- # Logging: info - Processing data...
- # Logging: info - Processing data...
- # Logging: warning - Something went wrong but not major...
- ```
-
- Different levels of logging are supported:
-
- ```ruby
- client.on_logging(RubyLLM::MCP::Logging::WARNING) do |logging|
-   puts "Logging: #{logging.level} - #{logging.message}"
- end
-
- # Execute a tool that supports logging notifications
- tool = client.tool("long_running_operation")
- result = tool.execute(operation: "data_processing")
-
- # Logging: warning - Something went wrong but not major...
+ bundle exec ruby examples/tools/local_mcp.rb
  ```

- ### Progress Notifications
-
- MCPs can produce progress notifications for long-running tool operations. Progress notifications allow tools to send real-time updates about their execution status.
-
- **Note:** that we only support progress notifications for tool calls today.
-
- ```ruby
- # Set up progress tracking
- client.on_progress do |progress|
-   puts "Progress: #{progress.progress}% - #{progress.message}"
- end
-
- # Execute a tool that supports progress notifications
- tool = client.tool("long_running_operation")
- result = tool.execute(operation: "data_processing")
-
- # Progress 25% - Processing data...
- # Progress 50% - Processing data...
- # Progress 75% - Processing data...
- # Progress 100% - Processing data...
- puts result
-
- # Result: { status: "success", data: "Processed data" }
- ```
-
- ## Client Features
-
- The RubyLLM::MCP client provides support functionality that can be exposed to MCP servers. These features must be explicitly configured before creating client objects to ensure you're opting into this functionality.
-
- ### Roots
-
- Roots provide MCP servers with access to underlying file system information. The implementation starts with a lightweight approach due to the MCP specification's current limitations on root usage.
-
- When roots are configured, the client will:
-
- - Expose roots as a supported capability to MCP servers
- - Support dynamic addition and removal of roots during the client lifecycle
- - Fire `notifications/roots/list_changed` events when roots are modified
-
- #### Configuration
-
- ```ruby
- RubyLLM::MCP.config do |config|
-   config.roots = ["to/a/path", Rails.root]
- end
-
- client = RubyLLM::MCP::Client.new(...)
- ```
-
- #### Usage
-
- ```ruby
- # Access current root paths
- client.roots.paths
- # => ["to/a/path", #<Pathname:/to/rails/root/path>]
-
- # Add a new root (fires list_changed notification)
- client.roots.add("new/path")
- client.roots.paths
- # => ["to/a/path", #<Pathname:/to/rails/root/path>, "new/path"]
-
- # Remove a root (fires list_changed notification)
- client.roots.remove("to/a/path")
- client.roots.paths
- # => [#<Pathname:/to/rails/root/path>, "new/path"]
- ```
-
- ### Sampling
-
- Sampling allows MCP servers to offload LLM requests to the MCP client rather than making them directly from the server. This enables MCP servers to optionally use LLM connections through the client.
-
- #### Configuration
-
- ```ruby
- RubyLLM::MCP.configure do |config|
-   config.sampling.enabled = true
-   config.sampling.preferred_model = "gpt-4.1"
-
-   # Optional: Use a block for dynamic model selection
-   config.sampling.preferred_model do |model_preferences|
-     model_preferences.hints.first
-   end
-
-   # Optional: Add guards to filter sampling requests
-   config.sampling.guard do |sample|
-     sample.message.include("Hello")
-   end
- end
- ```
-
- #### How It Works
-
- With the above configuration:
-
- - Clients will respond to all incoming sample requests using the specified model (`gpt-4.1`)
- - Sample messages will only be approved if they contain the word "Hello" (when using the guard)
- - The `preferred_model` can be a string or a proc that provides dynamic model selection based on MCP server characteristics
-
- The `preferred_model` proc receives model preferences from the MCP server, allowing you to make intelligent model selection decisions based on the server's requirements for success.
-
- ## Transport Types
-
- ### SSE (Server-Sent Events)
-
- Best for web-based MCP servers or when you need HTTP-based communication:
-
- ```ruby
- client = RubyLLM::MCP.client(
-   name: "web-mcp-server",
-   transport_type: :sse,
-   config: {
-     url: "https://your-mcp-server.com/mcp/sse",
-     headers: { "Authorization" => "Bearer your-token" }
-   }
- )
- ```
-
- ### Streamable HTTP
-
- Best for HTTP-based MCP servers that support streaming responses:
-
- ```ruby
- client = RubyLLM::MCP.client(
-   name: "streaming-mcp-server",
-   transport_type: :streamable,
-   config: {
-     url: "https://your-mcp-server.com/mcp",
-     headers: { "Authorization" => "Bearer your-token" }
-   }
- )
- ```
-
- ### Stdio
-
- Best for local MCP servers or command-line tools:
-
- ```ruby
- client = RubyLLM::MCP.client(
-   name: "local-mcp-server",
-   transport_type: :stdio,
-   config: {
-     command: "python",
-     args: ["-m", "my_mcp_server"],
-     env: { "DEBUG" => "1" }
-   }
- )
- ```
-
- ## Creating Custom Transports
-
- Part of the MCP specification outlines that custom transports can be used for some MCP servers. Out of the box, RubyLLM::MCP supports Streamable HTTP transports, STDIO and the legacy SSE transport.
-
- You can create custom transport implementations to support additional communication protocols or specialized connection methods.
-
- ### Transport Registration
-
- Register your custom transport with the transport factory:
-
- ```ruby
- # Define your custom transport class
- class MyCustomTransport
-   # Implementation details...
- end
-
- # Register it with the factory
- RubyLLM::MCP::Transport.register_transport(:my_custom, MyCustomTransport)
-
- # Now you can use it
- client = RubyLLM::MCP.client(
-   name: "custom-server",
-   transport_type: :my_custom,
-   config: {
-     # Your custom configuration
-   }
- )
- ```
-
- ### Required Interface
-
- All transport implementations must implement the following interface:
-
- ```ruby
- class MyCustomTransport
-   # Initialize the transport
-   def initialize(coordinator:, **config)
-     @coordinator = coordinator # Used for communication between the client and the MCP server
-     @config = config # Transport-specific configuration
-   end
-
-   # Send a request and optionally wait for response
-   # body: the request body
-   # add_id: true will add an id to the request
-   # wait_for_response: true will wait for a response from the MCP server
-   # Returns a RubyLLM::MCP::Result object
-   def request(body, add_id: true, wait_for_response: true)
-     # Implementation: send request and return result
-     data = some_method_to_send_request_and_get_result(body)
-     # Use Result object to make working with the protocol easier
-     result = RubyLLM::MCP::Result.new(data)
-
-     # Call the coordinator to process the result
-     @coordinator.process_result(result)
-     return if result.nil? # Some results are related to notifications and should not be returned to the client, but processed by the coordinator instead
-
-     # Return the result
-     result
-   end
-
-   # Check if transport is alive/connected
-   def alive?
-     # Implementation: return true if connected
-   end
-
-   # Start the transport connection
-   def start
-     # Implementation: establish connection
-   end
-
-   # Close the transport connection
-   def close
-     # Implementation: cleanup and close connection
-   end
-
-   # Set the MCP protocol version, used in some transports to identify the agreed upon protocol version
-   def set_protocol_version(version)
-     @protocol_version = version
-   end
- end
- ```
-
- ### The Result Object
-
- The `RubyLLM::MCP::Result` class wraps MCP responses and provides convenient methods:
-
- ```ruby
- result = transport.request(body)
-
- # Core properties
- result.id         # Request ID
- result.method     # Request method
- result.result     # Result data (hash)
- result.params     # Request parameters
- result.error      # Error data (hash)
- result.session_id # Session ID (if applicable)
-
- # Type checking
- result.success?      # Has result data
- result.error?        # Has error data
- result.notification? # Is a notification
- result.request?      # Is a request
- result.response?     # Is a response
-
- # Specialized methods
- result.tool_success?    # Successful tool execution
- result.execution_error? # Tool execution failed
- result.matching_id?(id) # Matches request ID
- result.next_cursor?     # Has pagination cursor
-
- # Error handling
- result.raise_error! # Raise exception if error
- result.to_error     # Convert to Error object
-
- # Notifications
- result.notification # Get notification object
- ```
-
- ### Error Handling
-
- Custom transports should handle errors appropriately. If request fails, you should raise a `RubyLLM::MCP::Errors::TransportError` exception. If the request times out, you should raise a `RubyLLM::MCP::Errors::TimeoutError` exception. This will ensure that a cancellation notification is sent to the MCP server correctly.
-
- ```ruby
- def request(body, add_id: true, wait_for_response: true)
-   begin
-     # Send request
-     send_request(body)
-   rescue SomeConnectionError => e
-     # Convert to MCP transport error
-     raise RubyLLM::MCP::Errors::TransportError.new(
-       message: "Connection failed: #{e.message}",
-       error: e
-     )
-   rescue Timeout::Error => e
-     # Convert to MCP timeout error
-     raise RubyLLM::MCP::Errors::TimeoutError.new(
-       message: "Request timeout after #{@request_timeout}ms",
-       request_id: body["id"]
-     )
-   end
- end
- ```
-
- ## RubyLLM::MCP and Client Configuration Options
-
- MCP comes with some common configuration options that can be set on the client.
-
- ```ruby
- RubyLLM::MCP.configure do |config|
-   # Set the progress handler
-   config.support_complex_parameters!
-
-   # Set parameters on the built in logger
-   config.log_file = $stdout
-   config.log_level = Logger::ERROR
-
-   # Or add a custom logger
-   config.logger = Logger.new(STDOUT)
- end
- ```
-
- ### MCP Client Options
-
- MCP client options are set on the client itself.
-
- - `name`: A unique identifier for your MCP client
- - `transport_type`: Either `:sse`, `:streamable`, or `:stdio`
- - `start`: Whether to automatically start the connection (default: true)
- - `request_timeout`: Timeout for requests in milliseconds (default: 8000)
- - `config`: Transport-specific configuration
-   - For SSE: `{ url: "http://...", headers: {...} }`
-   - For Streamable: `{ url: "http://...", headers: {...} }`
-   - For stdio: `{ command: "...", args: [...], env: {...} }`
-
- ## Development
-
- After checking out the repo, run `bin/setup` to install dependencies. Then, run `rake spec` to run the tests. You can also run `bin/console` for an interactive prompt that will allow you to experiment.
-
- To install this gem onto your local machine, run `bundle exec rake install`. Run `bundle exec rake` to test specs and run linters. To release a new version, update the version number in `version.rb`, and then run `bundle exec rake release`, which will create a git tag for the version, push git commits and the created tag, and push the `.gem` file to [rubygems.org](https://rubygems.org).
-
- ## Examples
-
- Check out the `examples/` directory for more detailed usage examples:
-
- - `examples/tools/local_mcp.rb` - Complete example with stdio transport
- - `examples/tools/sse_mcp_with_gpt.rb` - Example using SSE transport with GPT
- - `examples/resources/list_resources.rb` - Example of listing and using resources
- - `examples/prompts/streamable_prompt_call.rb` - Example of using prompts with streamable transport
-
  ## Contributing

  We welcome contributions! Bug reports and pull requests are welcome on GitHub at https://github.com/patvice/ruby_llm-mcp.
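One pattern removed from the README in this release (presumably in favor of the linked docs site) is the block form of `RubyLLM::MCP.establish_connection`, which yields live clients and closes them when the block finishes. A minimal pure-Ruby sketch of that ensure-based lifecycle; the `FakeClients` class and top-level method are illustrative stand-ins, not the gem's implementation:

```ruby
# Stand-in for the gem's client collection; only models open/closed state.
class FakeClients
  attr_reader :closed

  def initialize
    @closed = false
  end

  def close
    @closed = true
  end
end

# Sketch of the lifecycle the removed README text describes:
# yield live clients, then always close them when the block finishes.
def establish_connection
  clients = FakeClients.new
  return clients unless block_given? # bare form: caller must close later

  begin
    yield clients
  ensure
    clients.close # block form cleans up automatically
  end
end

open_inside = nil
after = establish_connection { |clients| open_inside = !clients.closed; clients }

puts open_inside  # true - connection is open inside the block
puts after.closed # true - and closed once the block returns
```

The `ensure` clause is what makes the block form safe inside background jobs: the clients are closed even if the block raises.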