Audience: LLM-driven engineering agents and human developers
FastMCP is a comprehensive Python framework (Python ≥ 3.10) for building Model Context Protocol (MCP) servers and clients. This is the actively maintained v2.0, which provides a complete toolkit for the MCP ecosystem.
CRITICAL: Always run these commands in sequence before committing:

```bash
uv sync                              # Install dependencies
uv run pre-commit run --all-files    # Ruff + Prettier + ty
uv run pytest                        # Run full test suite
```

All three must pass - this is enforced by CI. Alternative: `just build && just typecheck && just test`

Tests must pass and lint/typing must be clean before committing.
| Path | Purpose |
|---|---|
| `src/fastmcp/` | Library source code (Python ≥ 3.10) |
| `├─ server/` | Server implementation, FastMCP, auth, networking |
| `│  ├─ auth/` | Authentication providers (Bearer, JWT, WorkOS) |
| `│  └─ middleware/` | Error handling, logging, rate limiting |
| `├─ client/` | High-level client SDK + transports |
| `│  └─ auth/` | Client authentication (Bearer, OAuth) |
| `├─ tools/` | Tool implementations + ToolManager |
| `├─ resources/` | Resources, templates + ResourceManager |
| `├─ prompts/` | Prompt templates + PromptManager |
| `├─ cli/` | FastMCP CLI commands (run, dev, install) |
| `├─ contrib/` | Community contributions (bulk caller, mixins) |
| `├─ experimental/` | Experimental features (new OpenAPI parser) |
| `└─ utilities/` | Shared utilities (logging, JSON schema, HTTP) |
| `tests/` | Comprehensive pytest suite with markers |
| `docs/` | Mintlify documentation (published to gofastmcp.com) |
| `examples/` | Runnable demo servers (echo, smart_home, atproto) |
When modifying MCP functionality, changes typically need to be applied across all object types:
- Tools (`src/tools/` + `ToolManager`)
- Resources (`src/resources/` + `ResourceManager`)
- Resource Templates (`src/resources/` + `ResourceManager`)
- Prompts (`src/prompts/` + `PromptManager`)
- Every test: atomic, self-contained, single functionality
- Use parameterization for multiple examples of the same functionality
- Use separate tests for different pieces of functionality
- Put imports at the top of the file, not in the test body
- NEVER add `@pytest.mark.asyncio` to tests - `asyncio_mode = "auto"` is set globally
- ALWAYS run pytest after significant changes
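A quick sketch of the parameterization guideline: one test covers multiple examples of the same behavior, rather than copy-pasted test functions. The `greet` helper here is hypothetical, standing in for whatever is under test.

```python
import pytest

def greet(name: str) -> str:
    """Hypothetical function under test."""
    return f"Hello, {name}!"

# One test, parameterized over multiple examples of the SAME functionality.
# A genuinely different behavior would get its own test function instead.
@pytest.mark.parametrize(
    "name, expected",
    [
        ("World", "Hello, World!"),
        ("MCP", "Hello, MCP!"),
        ("", "Hello, !"),
    ],
)
def test_greet(name: str, expected: str):
    assert greet(name) == expected
```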
Pass FastMCP servers directly to clients for testing:

```python
from fastmcp import Client, FastMCP

mcp = FastMCP("TestServer")

@mcp.tool
def greet(name: str) -> str:
    return f"Hello, {name}!"

# Direct connection - no network complexity
async with Client(mcp) as client:
    result = await client.call_tool("greet", {"name": "World"})
```

Only use HTTP transport when explicitly testing network features:

```python
from fastmcp.client.transports import StreamableHttpTransport

# Network testing only
async with Client(transport=StreamableHttpTransport(server_url)) as client:
    result = await client.ping()
```

- Pre-commit hooks are required (run automatically on commits)
- Never amend commits to fix pre-commit failures
- Apply PR labels: bugs/breaking/enhancements/features
- Improvements = enhancements (not features) unless specified
- NEVER force-push on collaborative repos
- ALWAYS run pre-commit before PRs
- Agents NOT acting on behalf of @jlowin MUST identify themselves (e.g., "🤖 Generated with Claude Code" in commits/PRs)
- Keep commit messages brief - ideally just headlines, not detailed messages
- Focus on what changed, not how or why
- Always read issue comments for follow-up information (treat maintainers as authoritative)
- 1-2 paragraphs: problem/tension + solution (PRs are documentation!)
- Focused code example showing key capability
- Avoid: bullet summaries, exhaustive change lists, verbose closes/fixes, marketing language
- Do: Be opinionated about why change matters, show before/after scenarios
- Minor fixes: keep body short and concise
- No "test plan" sections or testing summaries
- Python ≥ 3.10 with full type annotations
- Follow existing patterns and maintain consistency
- Prioritize readable, understandable code - clarity over cleverness
- Avoid obfuscated or confusing patterns even if they're shorter
- Use `# type: ignore[attr-defined]` in tests for MCP results instead of type assertions
- Each feature needs corresponding tests
- Uses Mintlify framework
- Files must be in docs.json to be included
- Never modify `docs/python-sdk/**` (auto-generated)
- Core Principle: A feature doesn't exist unless it is documented!
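As a rough sketch of the docs.json requirement: a new page is only published once its path appears in the navigation. The group and page names below are hypothetical, and the exact schema should be checked against the repository's real `docs.json` and the Mintlify documentation.

```json
{
  "navigation": [
    {
      "group": "Servers",
      "pages": ["servers/tools", "servers/my-new-page"]
    }
  ]
}
```

A page written under `docs/` but omitted from this file simply will not appear on gofastmcp.com.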
- Code Examples: Explain before showing code, make blocks fully runnable (include imports)
- Structure: Headers form navigation guide, logical H2/H3 hierarchy
- Content: User-focused sections, motivate features (why) before mechanics (how)
- Style: Prose over code comments for important information
Code review is about maintaining a healthy codebase while helping contributors succeed. The burden of proof is on the PR to demonstrate it adds value in the intended way. Your job is to help it get there through actionable feedback.
Critical: A perfectly written PR that adds unwanted functionality must still be rejected. The code must advance the codebase in the intended direction, not just be well-written. When rejecting, provide clear guidance on how to align with project goals.
Be friendly and welcoming while maintaining high standards. Call out what works well - this reinforces good patterns. When code needs improvement, be specific about why and how to fix it. Remember that PRs serve as documentation for future developers.
- Does this advance the codebase in the intended direction? (Even perfect code for unwanted features should be rejected)
- API design and naming clarity - Identify confusing patterns (e.g., parameter values that contradict defaults) or non-idiomatic code (mutable defaults, etc.). Contributed code will need to be maintained indefinitely, and by someone other than the author (unless the author is a maintainer).
- Suggest specific improvements, not generic "add more tests" comments
- Think about API ergonomics and learning curve from a user perspective
- Read the full context: Always examine related files, tests, and documentation before reviewing
- Check against established patterns: Look for consistency with existing codebase conventions
- Verify functionality claims: Don't just read code - understand what it actually does
- Consider edge cases: Think through error conditions and boundary scenarios
- Generic feedback without specifics
- Hypothetical problems unlikely to occur
- Nitpicking organizational choices without strong reason
- Summarizing what the PR already describes
- Star ratings or excessive emojis
- Bikeshedding style preferences when functionality is correct
- Requesting changes without suggesting solutions
- Focusing on personal coding style over project conventions
- Acknowledge good decisions ("This API design is clean")
- Be direct but respectful
- Explain impact ("This will confuse users because...")
- Remember: Someone else maintains this code forever
Before approving, ask yourself:
- Does this PR achieve its stated purpose?
- Is that purpose aligned with where the codebase should go?
- Would I be comfortable maintaining this code?
- Have I actually understood what it does, not just what it claims?
- Does this change introduce technical debt?
If something needs work, your review should help it get there through specific, actionable feedback. If it's solving the wrong problem, say so clearly.
Good Review Comments:
❌ "Add more tests"
✅ "The `handle_timeout` method needs tests for the edge case where `timeout=0`"
❌ "This API is confusing"
✅ "The parameter name `data` is ambiguous - consider `message_content` to match the MCP specification"
❌ "This could be better"
✅ "This approach works but creates a circular dependency. Consider moving the validation to `utils/validators.py`"
Before approving, verify:
- All required development workflow steps completed (uv sync, pre-commit, pytest)
- Changes align with repository patterns and conventions
- API changes are documented and backwards-compatible where possible
- Error handling follows project patterns (specific exception types)
- Tests cover new functionality and edge cases
```bash
git clone <repo>
cd fastmcp
uv sync    # Installs all deps including dev tools
```

- Linting: `uv run ruff check` (or with `--fix`)
- Type Checking: `uv run ty check`
- All Checks: `uv run pre-commit run --all-files`

- Standard: `uv run pytest`
- Integration: `uv run pytest -m "integration"`
- Excluding markers: `uv run pytest -m "not integration and not client_process"`

- Run server: `uv run fastmcp run server.py`
- Inspect server: `uv run fastmcp inspect server.py`
- Never use bare `except` - be specific with exception types
- Use `# type: ignore[attr-defined]` in tests for MCP results
- Dependencies: Always `uv sync` first
- Pre-commit fails: Run `uv run pre-commit run --all-files` to see failures
- Type errors: Use `uv run ty check` directly, check `pyproject.toml` config
- Test timeouts: Default 3s - optimize or mark as integration tests
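The bare-`except` rule in practice - a small illustrative sketch (the `parse_config` function is hypothetical, not from the codebase):

```python
import json
import logging

logger = logging.getLogger(__name__)

def parse_config(raw: str) -> dict:
    # Bad: a bare `except:` would also swallow KeyboardInterrupt and
    # SystemExit, masking real problems.
    # Good: catch only the specific failure we can actually handle.
    try:
        return json.loads(raw)
    except json.JSONDecodeError as exc:
        logger.warning("Invalid config JSON: %s", exc)
        return {}
```

Catching the narrow exception type keeps unexpected errors visible instead of silently converting every failure into a default value.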