fastmcp 0.4.1__py3-none-any.whl → 2.0.0__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
```text
Metadata-Version: 2.4
Name: fastmcp
Version: 2.0.0
Summary: An ergonomic MCP interface
Author: Jeremiah Lowin
License: Apache-2.0
License-File: LICENSE
Requires-Python: >=3.10
Requires-Dist: dotenv>=0.9.9
Requires-Dist: fastapi>=0.115.12
Requires-Dist: mcp<2.0.0,>=1.6.0
Requires-Dist: openapi-pydantic>=0.5.1
Requires-Dist: rich>=13.9.4
Requires-Dist: typer>=0.15.2
Requires-Dist: websockets>=15.0.1
Description-Content-Type: text/markdown
```

<div align="center">

<!-- omit in toc -->
# FastMCP v2 🚀
<strong>The fast, Pythonic way to build MCP servers.</strong>

[![PyPI - Version](https://img.shields.io/pypi/v/fastmcp.svg)](https://pypi.org/project/fastmcp)
[![Tests](https://github.com/jlowin/fastmcp/actions/workflows/run-tests.yml/badge.svg)](https://github.com/jlowin/fastmcp/actions/workflows/run-tests.yml)
[![License](https://img.shields.io/github/license/jlowin/fastmcp.svg)](https://github.com/jlowin/fastmcp/blob/main/LICENSE)

</div>

[Model Context Protocol (MCP)](https://modelcontextprotocol.io) servers are a standardized way to provide context and tools to your LLMs, and FastMCP makes building *and interacting with* them simple and intuitive. Create tools, expose resources, define prompts, and connect components with clean, Pythonic code.

```python
# server.py
from fastmcp import FastMCP

mcp = FastMCP("Demo 🚀")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b

if __name__ == "__main__":
    mcp.run()
```

Run it locally for testing:
```bash
fastmcp dev server.py
```

Install it for use with Claude Desktop:
```bash
fastmcp install server.py
```

FastMCP handles the complex protocol details and server management, letting you focus on building great tools and applications. It's designed to feel natural to Python developers.

## Key Features:

* **Simple Server Creation:** Build MCP servers with minimal boilerplate using intuitive decorators (`@tool`, `@resource`, `@prompt`).
* **Proxy MCP Servers:** Create proxy servers to expose existing MCP servers or clients with modifications, or convert between transport protocols (e.g., expose a Stdio server via SSE for web access).
* **Compose MCP Servers:** Compose complex applications by mounting multiple FastMCP servers together.
* **API Generation:** Automatically create MCP servers from existing **OpenAPI specifications** or **FastAPI applications**.
* **Powerful Clients:** Programmatically interact with *any* MCP server, regardless of how it was built.
* **LLM Sampling:** Request completions from client LLMs directly within your MCP tools.
* **Pythonic Interface:** Designed with familiar Python patterns like decorators and type hints.
* **Context Injection:** Easily access core MCP capabilities like sampling, logging, and progress reporting within your functions.

---

### What's New in v2?

FastMCP 1.0 made it so easy to build MCP servers that it's now part of the [official Model Context Protocol Python SDK](https://github.com/modelcontextprotocol/python-sdk)! For basic use cases, you can use the upstream version by importing `mcp.server.fastmcp.FastMCP` (or installing `fastmcp==1.0`).

Based on how the MCP ecosystem is evolving, FastMCP 2.0 builds on that foundation to introduce a variety of new features (and more experimental ideas). It adds advanced features like proxying and composing MCP servers, as well as automatically generating them from OpenAPI specs or FastAPI objects. FastMCP 2.0 also introduces new client-side functionality like LLM sampling.


---

<!-- omit in toc -->
## Table of Contents

- [Key Features:](#key-features)
- [What's New in v2?](#whats-new-in-v2)
- [Installation](#installation)
- [Quickstart](#quickstart)
- [What is MCP?](#what-is-mcp)
- [Core Concepts](#core-concepts)
  - [The `FastMCP` Server](#the-fastmcp-server)
  - [Tools](#tools)
  - [Resources](#resources)
  - [Prompts](#prompts)
  - [Context](#context)
  - [Images](#images)
  - [MCP Clients](#mcp-clients)
    - [Client Methods](#client-methods)
    - [Transport Options](#transport-options)
    - [LLM Sampling](#llm-sampling)
    - [Roots Access](#roots-access)
- [Advanced Features](#advanced-features)
  - [Proxy Servers](#proxy-servers)
  - [Composing MCP Servers](#composing-mcp-servers)
  - [OpenAPI \& FastAPI Generation](#openapi--fastapi-generation)
- [Running Your Server](#running-your-server)
  - [Development Mode (Recommended for Building \& Testing)](#development-mode-recommended-for-building--testing)
  - [Claude Desktop Integration (For Regular Use)](#claude-desktop-integration-for-regular-use)
  - [Direct Execution (For Advanced Use Cases)](#direct-execution-for-advanced-use-cases)
  - [Server Object Names](#server-object-names)
- [Examples](#examples)
- [Contributing](#contributing)
  - [Prerequisites](#prerequisites)
  - [Setup](#setup)
  - [Testing](#testing)
  - [Formatting \& Linting](#formatting--linting)
  - [Pull Requests](#pull-requests)

## Installation

We strongly recommend installing FastMCP with [uv](https://docs.astral.sh/uv/), as it is required for deploying servers via the CLI:

```bash
uv pip install fastmcp
```

Note: on macOS, uv may need to be installed with Homebrew (`brew install uv`) in order to make it available to the Claude Desktop app.

For development, install with:
```bash
# Clone the repo first
git clone https://github.com/jlowin/fastmcp.git
cd fastmcp
# Install with dev dependencies
uv sync
```

## Quickstart

Let's create a simple MCP server that exposes a calculator tool and some data:

```python
# server.py
from fastmcp import FastMCP

# Create an MCP server
mcp = FastMCP("Demo")

# Add an addition tool
@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b

# Add a dynamic greeting resource
@mcp.resource("greeting://{name}")
def get_greeting(name: str) -> str:
    """Get a personalized greeting"""
    return f"Hello, {name}!"
```

You can install this server in [Claude Desktop](https://claude.ai/download) and interact with it right away by running:
```bash
fastmcp install server.py
```

Alternatively, you can test it with the MCP Inspector:
```bash
fastmcp dev server.py
```

![MCP Inspector](/docs/assets/demo-inspector.png)

## What is MCP?

The [Model Context Protocol (MCP)](https://modelcontextprotocol.io) lets you build servers that expose data and functionality to LLM applications in a secure, standardized way. Think of it like a web API, but specifically designed for LLM interactions. MCP servers can:

- Expose data through **Resources** (think GET endpoints; load info into context)
- Provide functionality through **Tools** (think POST/PUT endpoints; execute actions)
- Define interaction patterns through **Prompts** (reusable templates)
- And more!

FastMCP provides a high-level, Pythonic interface for building and interacting with these servers.

## Core Concepts

These are the building blocks for creating MCP servers, using the familiar decorator-based approach.

### The `FastMCP` Server

The central object representing your MCP application. It handles connections, protocol details, and routing.

```python
from fastmcp import FastMCP

# Create a named server
mcp = FastMCP("My App")

# Specify dependencies needed when deployed via `fastmcp install`
mcp = FastMCP("My App", dependencies=["pandas", "numpy"])
```

### Tools

Tools allow LLMs to perform actions by executing your Python functions. They are ideal for tasks that involve computation, external API calls, or side effects.

Decorate synchronous or asynchronous functions with `@mcp.tool()`. FastMCP automatically generates the necessary MCP schema based on type hints and docstrings. Pydantic models can be used for complex inputs.

```python
import httpx
from pydantic import BaseModel

class UserInfo(BaseModel):
    user_id: int
    notify: bool = False

@mcp.tool()
async def send_notification(user: UserInfo, message: str) -> dict:
    """Sends a notification to a user if requested."""
    if user.notify:
        # Simulate sending notification
        print(f"Notifying user {user.user_id}: {message}")
        return {"status": "sent", "user_id": user.user_id}
    return {"status": "skipped", "user_id": user.user_id}

@mcp.tool()
def get_stock_price(ticker: str) -> float:
    """Gets the current price for a stock ticker."""
    # Replace with actual API call
    prices = {"AAPL": 180.50, "GOOG": 140.20}
    return prices.get(ticker.upper(), 0.0)
```

### Resources

Resources expose data to LLMs. They should primarily provide information without significant computation or side effects (like GET requests).

Decorate functions with `@mcp.resource("your://uri")`. Use curly braces `{}` in the URI to define dynamic resources (templates) where parts of the URI become function parameters.

```python
# Static resource returning simple text
@mcp.resource("config://app-version")
def get_app_version() -> str:
    """Returns the application version."""
    return "v2.1.0"

# Dynamic resource template expecting a 'user_id' from the URI
@mcp.resource("db://users/{user_id}/email")
async def get_user_email(user_id: str) -> str:
    """Retrieves the email address for a given user ID."""
    # Replace with actual database lookup
    emails = {"123": "alice@example.com", "456": "bob@example.com"}
    return emails.get(user_id, "not_found@example.com")

# Resource returning JSON data
@mcp.resource("data://product-categories")
def get_categories() -> list[str]:
    """Returns a list of available product categories."""
    return ["Electronics", "Books", "Home Goods"]
```

### Prompts

Prompts define reusable templates or interaction patterns for the LLM. They help guide the LLM on how to use your server's capabilities effectively.

Decorate functions with `@mcp.prompt()`. The function should return the desired prompt content, which can be a simple string, a `Message` object (like `UserMessage` or `AssistantMessage`), or a list of these.

```python
from fastmcp.prompts.base import Message, UserMessage, AssistantMessage

@mcp.prompt()
def ask_review(code_snippet: str) -> str:
    """Generates a standard code review request."""
    return f"Please review the following code snippet for potential bugs and style issues:\n```python\n{code_snippet}\n```"

@mcp.prompt()
def debug_session_start(error_message: str) -> list[Message]:
    """Initiates a debugging help session."""
    return [
        UserMessage(f"I encountered an error:\n{error_message}"),
        AssistantMessage("Okay, I can help with that. Can you provide the full traceback and tell me what you were trying to do?")
    ]
```

### Context

Gain access to MCP server capabilities *within* your tool or resource functions by adding a parameter type-hinted with `fastmcp.Context`.

```python
from fastmcp import Context, FastMCP

mcp = FastMCP("Context Demo")

@mcp.resource("system://status")
async def get_system_status(ctx: Context) -> dict:
    """Checks system status and logs information."""
    await ctx.info("Checking system status...")
    # Perform checks
    await ctx.report_progress(1, 1) # Report completion
    return {"status": "OK", "load": 0.5, "client": ctx.client_id}

@mcp.tool()
async def process_large_file(file_uri: str, ctx: Context) -> str:
    """Processes a large file, reporting progress and reading resources."""
    await ctx.info(f"Starting processing for {file_uri}")
    # Read the resource using the context
    file_content_resource = await ctx.read_resource(file_uri)
    file_content = file_content_resource[0].content # Assuming single text content
    lines = file_content.splitlines()
    total_lines = len(lines)

    for i, line in enumerate(lines):
        # Process line...
        if (i + 1) % 100 == 0: # Report progress every 100 lines
            await ctx.report_progress(i + 1, total_lines)

    await ctx.info(f"Finished processing {file_uri}")
    return f"Processed {total_lines} lines."

```

The `Context` object provides:
* Logging: `ctx.debug()`, `ctx.info()`, `ctx.warning()`, `ctx.error()`
* Progress Reporting: `ctx.report_progress(current, total)`
* Resource Access: `await ctx.read_resource(uri)`
* Request Info: `ctx.request_id`, `ctx.client_id`
* Sampling (Advanced): `await ctx.sample(...)` to ask the connected LLM client for completions.
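
As a quick, hypothetical sketch of how these helpers combine in practice (the tool name and logic are made up; every `ctx` call used here is one of the methods listed above):

```python
@mcp.tool()
async def audit_batch(items: list[str], ctx: Context) -> dict:
    """Hypothetical tool demonstrating the Context helpers listed above."""
    await ctx.debug(f"Request {ctx.request_id}: received {len(items)} items")
    processed = 0
    for i, item in enumerate(items):
        if not item:
            await ctx.warning(f"Skipping empty item at position {i}")
            continue
        processed += 1
        await ctx.report_progress(i + 1, len(items))
    if processed == 0:
        await ctx.error("No items could be processed")
    return {"processed": processed, "client": ctx.client_id}
```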

### Images

Easily handle image input and output using the `fastmcp.Image` helper class.

```python
from fastmcp import FastMCP, Image
from PIL import Image as PILImage
import io

mcp = FastMCP("Image Demo")

@mcp.tool()
def create_thumbnail(image_data: Image) -> Image:
    """Creates a 100x100 thumbnail from the provided image."""
    img = PILImage.open(io.BytesIO(image_data.data)) # Assumes image_data received as Image with bytes
    img.thumbnail((100, 100))
    buffer = io.BytesIO()
    img.save(buffer, format="PNG")
    # Return a new Image object with the thumbnail data
    return Image(data=buffer.getvalue(), format="png")

@mcp.tool()
def load_image_from_disk(path: str) -> Image:
    """Loads an image from the specified path."""
    # Handles reading file and detecting format based on extension
    return Image(path=path)
```
FastMCP handles the conversion to/from the base64-encoded format required by the MCP protocol.


### MCP Clients

The `Client` class lets you interact with any MCP server (not just FastMCP ones) from Python code:

```python
from fastmcp import Client

async with Client("path/to/server") as client:
    # Call a tool
    result = await client.call_tool("weather", {"location": "San Francisco"})
    print(result)

    # Read a resource
    res = await client.read_resource("db://users/123/profile")
    print(res)
```

You can connect to servers using any supported transport protocol (Stdio, SSE, FastMCP, etc.). If you don't specify a transport, the `Client` class automatically attempts to detect an appropriate one from your connection string or server object.

#### Client Methods

The `Client` class exposes several methods for interacting with MCP servers.

```python
async with Client("path/to/server") as client:
    # List available tools
    tools = await client.list_tools()

    # List available resources
    resources = await client.list_resources()

    # Call a tool with arguments
    result = await client.call_tool("generate_report", {"user_id": 123})

    # Read a resource
    user_data = await client.read_resource("db://users/123/profile")

    # Get a prompt
    greeting = await client.get_prompt("welcome", {"name": "Alice"})

    # Send progress updates
    await client.progress("task-123", 50, 100) # 50% complete

    # Basic connectivity testing
    await client.ping()
```

These methods correspond directly to MCP protocol operations, making it easy to interact with any MCP-compatible server (not just FastMCP ones).

#### Transport Options

FastMCP supports various transport protocols for connecting to MCP servers:

```python
from fastmcp import Client
from fastmcp.client.transports import (
    SSETransport,
    PythonStdioTransport,
    FastMCPTransport
)

# Connect to a server over SSE (common for web-based MCP servers)
async with Client(SSETransport("http://localhost:8000/mcp")) as client:
    # Use client here...
    ...

# Connect to a Python script using stdio (useful for local tools)
async with Client(PythonStdioTransport("path/to/script.py")) as client:
    # Use client here...
    ...

# Connect directly to a FastMCP server object in the same process
from your_app import mcp_server
async with Client(FastMCPTransport(mcp_server)) as client:
    # Use client here...
    ...
```

Common transport options include:
- `SSETransport`: Connect to a server via Server-Sent Events (HTTP)
- `PythonStdioTransport`: Run a Python script and communicate via stdio
- `FastMCPTransport`: Connect directly to a FastMCP server object
- `WSTransport`: Connect via WebSockets

In addition, if you pass a connection string or `FastMCP` server object to the `Client` constructor, it will try to automatically detect the appropriate transport.
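
For example, each of the following relies on that automatic detection instead of an explicit transport (a sketch: `your_app`/`mcp_server` is the same placeholder used above, and which transport gets inferred for each value is an implementation detail that may vary by version):

```python
from fastmcp import Client
from your_app import mcp_server  # a FastMCP server object (placeholder from the example above)

# A FastMCP instance is typically connected to in-process (FastMCPTransport)
in_memory_client = Client(mcp_server)

# A path to a Python script is typically run as a subprocess over stdio (PythonStdioTransport)
stdio_client = Client("path/to/script.py")

# An http(s) URL is typically treated as an SSE endpoint (SSETransport)
sse_client = Client("http://localhost:8000/mcp")
```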

#### LLM Sampling

Sampling is an MCP feature that allows a server to request a completion from the client LLM, enabling sophisticated use cases while maintaining security and privacy on the server.

```python
import marvin # Or any other LLM client
from fastmcp import Client, Context, FastMCP
from fastmcp.client.sampling import RequestContext, SamplingMessage, SamplingParams

# -- SERVER SIDE --
# Create a server that requests LLM completions from the client

mcp = FastMCP("Sampling Example")

@mcp.tool()
async def generate_poem(topic: str, context: Context) -> str:
    """Generate a short poem about the given topic."""
    # The server requests a completion from the client LLM
    response = await context.sample(
        f"Write a short poem about {topic}",
        system_prompt="You are a talented poet who writes concise, evocative verses."
    )
    return response.text

@mcp.tool()
async def summarize_document(document_uri: str, context: Context) -> str:
    """Summarize a document using client-side LLM capabilities."""
    # First read the document as a resource
    doc_resource = await context.read_resource(document_uri)
    doc_content = doc_resource[0].content # Assuming single text content

    # Then ask the client LLM to summarize it
    response = await context.sample(
        f"Summarize the following document:\n\n{doc_content}",
        system_prompt="You are an expert summarizer. Create a concise summary."
    )
    return response.text

# -- CLIENT SIDE --
# Create a client that handles the sampling requests

async def sampling_handler(
    messages: list[SamplingMessage],
    params: SamplingParams,
    ctx: RequestContext,
) -> str:
    """Handle sampling requests from the server using your preferred LLM."""
    # Extract the messages and system prompt
    prompt = [m.content.text for m in messages if m.content.type == "text"]
    system_instruction = params.systemPrompt or "You are a helpful assistant."

    # Use your preferred LLM client to generate completions
    return await marvin.say_async(
        message=prompt,
        instructions=system_instruction,
    )

# Connect them together
async with Client(mcp, sampling_handler=sampling_handler) as client:
    result = await client.call_tool("generate_poem", {"topic": "autumn leaves"})
    print(result.content[0].text)
```

This pattern is powerful because:
1. The server can delegate text generation to the client LLM
2. The server remains focused on business logic and data handling
3. The client maintains control over which LLM is used and how requests are handled
4. No sensitive data needs to be sent to external APIs

#### Roots Access

FastMCP exposes the MCP roots functionality, allowing clients to specify which file system roots they can access. This creates a secure boundary for tools that need to work with files. Note that the server must account for client roots explicitly.

```python
from fastmcp import Client, RootsList

# Specify file roots that the client can access
roots = ["file:///path/to/allowed/directory"]

async with Client(mcp_server, roots=roots) as client:
    # Now tools in the MCP server can access files in the specified roots
    await client.call_tool("process_file", {"filename": "data.csv"})
```

## Advanced Features

Building on the core concepts, FastMCP v2 introduces powerful features for more complex scenarios:


### Proxy Servers

Create a FastMCP server that acts as an intermediary, proxying requests to another MCP endpoint (which could be a server or another client connection).

**Use Cases:**

* **Transport Conversion:** Expose a server running on Stdio (like many local tools) over SSE or WebSockets, making it accessible to web clients or Claude Desktop.
* **Adding Functionality:** Wrap an existing server to add authentication, request logging, or modified tool behavior.
* **Aggregating Servers:** Combine multiple backend MCP servers behind a single proxy interface (though `mount` might be simpler for this).

```python
import asyncio
from fastmcp import FastMCP, Client
from fastmcp.client.transports import PythonStdioTransport

# Create a client that connects to the original server
proxy_client = Client(
    transport=PythonStdioTransport('path/to/original_stdio_server.py'),
)

# Create a proxy server that connects to the client and exposes its capabilities
proxy = FastMCP.as_proxy(proxy_client, name="Stdio-to-SSE Proxy")

if __name__ == "__main__":
    proxy.run(transport='sse')
```

`FastMCP.as_proxy()` is a classmethod that connects to the target, discovers its capabilities, and dynamically builds the proxy server instance.
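
Once the proxy is running over SSE, any web-capable MCP client can reach the original Stdio server through it. A minimal sketch, assuming the proxy is reachable at a local URL (the host, port, and `/sse` path are assumptions; adjust them to wherever the proxy is actually served):

```python
import asyncio
from fastmcp import Client
from fastmcp.client.transports import SSETransport

async def main():
    # Assumed URL; point this at the running proxy from the example above.
    async with Client(SSETransport("http://localhost:8000/sse")) as client:
        tools = await client.list_tools()
        print("Tools proxied from the Stdio server:", [t.name for t in tools])

if __name__ == "__main__":
    asyncio.run(main())
```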



### Composing MCP Servers

Structure larger MCP applications by creating modular FastMCP servers and "mounting" them onto a parent server. This automatically handles prefixing for tool names and resource URIs, preventing conflicts.

```python
from fastmcp import FastMCP

# --- Weather MCP ---
weather_mcp = FastMCP("Weather Service")

@weather_mcp.tool()
def get_forecast(city: str):
    return f"Sunny in {city}"

@weather_mcp.resource("data://temp/{city}")
def get_temp(city: str):
    return 25.0

# --- News MCP ---
news_mcp = FastMCP("News Service")

@news_mcp.tool()
def fetch_headlines():
    return ["Big news!", "Other news"]

@news_mcp.resource("data://latest_story")
def get_story():
    return "A story happened."

# --- Composite MCP ---

mcp = FastMCP("Composite")

# Mount sub-apps with prefixes
mcp.mount("weather", weather_mcp) # Tools prefixed "weather/", resources prefixed "weather+"
mcp.mount("news", news_mcp) # Tools prefixed "news/", resources prefixed "news+"

@mcp.tool()
def ping():
    return "Composite OK"


if __name__ == "__main__":
    mcp.run()
```

This promotes code organization and reusability for complex MCP systems.
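
To see the prefixing in action, you can point an in-process client at the composite server and list what it exposes. A small sketch (it assumes the composite `mcp` object from the example above is in scope; the exact prefixed names follow the conventions noted in the mount comments):

```python
import asyncio
from fastmcp import Client

async def main():
    # Passing the FastMCP object connects in process, no transport setup needed.
    async with Client(mcp) as client:
        tools = await client.list_tools()
        print("Composite tools:", [t.name for t in tools])

        resources = await client.list_resources()
        print("Composite resources:", [str(r.uri) for r in resources])

if __name__ == "__main__":
    asyncio.run(main())
```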

### OpenAPI & FastAPI Generation

Leverage your existing web APIs by automatically generating FastMCP servers from them.

By default, the following rules are applied:
- `GET` requests -> MCP resources
- `GET` requests with path parameters -> MCP resource templates
- All other HTTP methods -> MCP tools

You can override these rules to customize or even ignore certain endpoints.
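
The override mechanism is sketched below; treat the `RouteMap`/`RouteType` names, the import path, and the `route_maps` parameter as assumptions to verify against the version you have installed:

```python
import httpx
from fastmcp import FastMCP

# Assumed API: check fastmcp.server.openapi in your installed version.
from fastmcp.server.openapi import RouteMap, RouteType

openapi_spec: dict = {}  # your OpenAPI spec dict, e.g. loaded from JSON

mcp_server = FastMCP.from_openapi(
    openapi_spec,
    client=httpx.AsyncClient(base_url="https://api.yourservice.com"),
    route_maps=[
        # Example override: expose GET /analytics/... endpoints as tools instead of resources
        RouteMap(methods=["GET"], pattern=r"^/analytics/.*", route_type=RouteType.TOOL),
    ],
)
```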

**From FastAPI:**

```python
from fastapi import FastAPI
from fastmcp import FastMCP

# Your existing FastAPI application
fastapi_app = FastAPI(title="My Existing API")

@fastapi_app.get("/status")
def get_status():
    return {"status": "running"}

@fastapi_app.post("/items")
def create_item(name: str, price: float):
    return {"id": 1, "name": name, "price": price}

# Generate an MCP server directly from the FastAPI app
mcp_server = FastMCP.from_fastapi(fastapi_app)

if __name__ == "__main__":
    mcp_server.run()
```

**From an OpenAPI Specification:**

```python
import httpx
import json
from fastmcp import FastMCP

# Load the OpenAPI spec (dict)
# with open("my_api_spec.json", "r") as f:
#     openapi_spec = json.load(f)
openapi_spec = { ... } # Your spec dict

# Create an HTTP client to make requests to the actual API endpoint
http_client = httpx.AsyncClient(base_url="https://api.yourservice.com")

# Generate the MCP server
mcp_server = FastMCP.from_openapi(openapi_spec, client=http_client)

if __name__ == "__main__":
    mcp_server.run()
```
## Running Your Server

Choose the method that best suits your needs:

### Development Mode (Recommended for Building & Testing)

Use `fastmcp dev` for an interactive testing environment with the MCP Inspector.

```bash
fastmcp dev your_server_file.py
# With temporary dependencies
fastmcp dev your_server_file.py --with pandas --with numpy
# With local package in editable mode
fastmcp dev your_server_file.py --with-editable .
```

### Claude Desktop Integration (For Regular Use)

Use `fastmcp install` to set up your server for persistent use within the Claude Desktop app. It handles creating an isolated environment using `uv`.

```bash
fastmcp install your_server_file.py
# With a custom name in Claude
fastmcp install your_server_file.py --name "My Analysis Tool"
# With extra packages and environment variables
fastmcp install server.py --with requests -v API_KEY=123 -f .env
```

### Direct Execution (For Advanced Use Cases)

Run your server script directly for custom deployments or integrations outside of Claude. You manage the environment and dependencies yourself.

Add to your `your_server_file.py`:
```python
if __name__ == "__main__":
    mcp.run() # Assuming 'mcp' is your FastMCP instance
```
Run with:
```bash
python your_server_file.py
# or
uv run python your_server_file.py
```

### Server Object Names

If your `FastMCP` instance is not named `mcp`, `server`, or `app`, specify it using `file:object` syntax for the `dev` and `install` commands:

```bash
fastmcp dev my_module.py:my_mcp_instance
fastmcp install api.py:api_app
```

## Examples

Explore the `examples/` directory for code samples demonstrating various features:

* `simple_echo.py`: Basic tool, resource, and prompt.
* `complex_inputs.py`: Using Pydantic models for tool inputs.
* `mount_example.py`: Mounting multiple FastMCP servers.
* `sampling.py`: Using LLM completions within your MCP server.
* `screenshot.py`: Tool returning an Image object.
* `text_me.py`: Tool interacting with an external API.
* `memory.py`: More complex example with database interaction.

## Contributing

Contributions make the open-source community vibrant! We welcome improvements and features.

<details>

<summary><h3>Open Developer Guide</h3></summary>

#### Prerequisites

* Python 3.10+
* [uv](https://docs.astral.sh/uv/)

#### Setup

1. Clone: `git clone https://github.com/jlowin/fastmcp.git && cd fastmcp`
2. Install Env & Dependencies: `uv venv && uv sync` (Activate the `.venv` after creation)

#### Testing

Run the test suite:
```bash
uv run pytest -vv
```

#### Formatting & Linting

We use `ruff` via `pre-commit`.
1. Install hooks: `pre-commit install`
2. Run checks: `pre-commit run --all-files`

#### Pull Requests

1. Fork the repository.
2. Create a feature branch.
3. Make changes, commit, and push to your fork.
4. Open a pull request against the `main` branch of `jlowin/fastmcp`.

Please open an issue or discussion for questions or suggestions!

</details>