Notion MCP server optimized for AI assistants — Reduce context window tokens by 73.0% while keeping full functionality. Compatible with Claude, ChatGPT, Gemini, Cursor, and all MCP clients.
A token-optimized version of the Notion Model Context Protocol (MCP) server.
MCP tool schemas consume significant context window tokens. When AI assistants like Claude or ChatGPT load MCP tools, each tool definition takes up valuable context space.
The original @notionhq/notion-mcp-server loads 21 tools consuming approximately 26,073 tokens — that's space you could use for actual conversation.
notion-slim intelligently groups 21 tools into 10 semantic operations, reducing token usage by 73.0% — with zero functionality loss.
Your AI assistant sees fewer, smarter tools. Every original capability remains available.
| Metric | Original | Slim | Reduction |
|---|---|---|---|
| Tools | 21 | 10 | 52% |
| Schema Tokens | 14,103 | 1,352 | 90.4% |
| Claude Code (est.) | ~26,073 | ~7,052 | ~73.0% |
Benchmark Info
- Original: @notionhq/notion-mcp-server@2.0.0
- Schema tokens measured with tiktoken (cl100k_base)
- Claude Code estimate includes ~570 tokens/tool overhead
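For a rough feel of how such measurements work, here is a minimal sketch using the common ~4 characters/token heuristic instead of tiktoken's cl100k_base encoder (so the numbers are approximate, and the tool schema shown is an illustrative placeholder, not notion-slim's actual schema):

```python
import json

def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters per token.
    A heuristic stand-in for the tiktoken count used in the benchmark."""
    return max(1, len(text) // 4)

# Hypothetical tool definition shaped like an MCP tool schema
tool = {
    "name": "API-retrieve-a-page",
    "description": "Retrieve a Notion page by its ID.",
    "inputSchema": {
        "type": "object",
        "properties": {"page_id": {"type": "string"}},
        "required": ["page_id"],
    },
}

schema_json = json.dumps(tool)
print(estimate_tokens(schema_json), "tokens (approx.)")
```

Multiplying a per-tool estimate like this across 21 tool definitions is what produces the schema-token totals in the table above.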
# Claude Desktop - auto-configure
npx notion-slim --setup claude
# Cursor - auto-configure
npx notion-slim --setup cursor
# Interactive mode (choose your client)
npx notion-slim --setup

Done! Restart your app to use notion.
⚠️ This MCP requires environment variables. The setup will add placeholders - update them with your actual values. See Configuration.
# Claude Code (creates .mcp.json in project root)
claude mcp add notion -s project --env NOTION_API_KEY=<YOUR_KEY> -- npx -y notion-slim@latest
# Windows: use cmd /c wrapper
claude mcp add notion -s project --env NOTION_API_KEY=<YOUR_KEY> -- cmd /c npx -y notion-slim@latest
# VS Code (Copilot, Cline, Roo Code)
code --add-mcp '{"name":"notion","command":"npx","args":["-y","notion-slim@latest"],"env":{"NOTION_API_KEY":"<YOUR_KEY>"}}'

| Variable | Description | Required |
|---|---|---|
| NOTION_API_KEY | Notion Integration Token (from notion.so/my-integrations) | Yes |
Add to your claude_desktop_config.json:
| OS | Path |
|---|---|
| Windows | %APPDATA%\Claude\claude_desktop_config.json |
| macOS | ~/Library/Application Support/Claude/claude_desktop_config.json |
{
"mcpServers": {
"notion": {
"command": "npx",
"args": ["-y", "notion-slim@latest"],
"env": {
"NOTION_API_KEY": "<YOUR_KEY>"
}
}
}
}

Add to .cursor/mcp.json (global) or <project>/.cursor/mcp.json (project):
{
"mcpServers": {
"notion": {
"command": "npx",
"args": ["-y", "notion-slim@latest"],
"env": {
"NOTION_API_KEY": "<YOUR_KEY>"
}
}
}
}

MCPSlim acts as a transparent bridge between AI models and the original MCP server:
┌─────────────────────────────────────────────────────────────────┐
│ Without MCPSlim │
│ │
│ [AI Model] ──── reads 21 tool schemas ────→ [Original MCP] │
│ (~26,073 tokens loaded into context) │
├─────────────────────────────────────────────────────────────────┤
│ With MCPSlim │
│ │
│ [AI Model] ───→ [MCPSlim Bridge] ───→ [Original MCP] │
│ │ │ │ │
│ Sees 10 grouped Translates to Executes actual │
│ tools only original call tool & returns │
│ (~7,052 tokens) │
└─────────────────────────────────────────────────────────────────┘
- AI reads slim schema — Only 10 grouped tools instead of 21
- AI calls grouped tool — e.g., retrieve({ action: "...", ... })
- MCPSlim translates — Converts the grouped call to the corresponding original tool call
- Original MCP executes — Real server processes the request
- Response returned — Result passes back unchanged
Zero functionality loss. 73.0% token savings.
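The translation step can be sketched as a simple dispatch table. The mapping entries below are hypothetical placeholders (the real notion-slim mapping is derived from the upstream server's schemas):

```python
# Hypothetical (grouped tool, action) -> original tool name mapping.
# Real notion-slim entries are generated from @notionhq/notion-mcp-server.
TOOL_MAP = {
    ("retrieve", "page"): "API-retrieve-a-page",
    ("retrieve", "database"): "API-retrieve-a-database",
    ("update", "page"): "API-patch-page",
}

def translate(grouped_tool: str, args: dict) -> tuple[str, dict]:
    """Convert a grouped call into the original tool call.
    The `action` field selects the target; remaining args pass through."""
    action = args.pop("action")
    original = TOOL_MAP[(grouped_tool, action)]
    return original, args  # forwarded unchanged to the original MCP server

name, payload = translate("retrieve", {"action": "page", "page_id": "abc123"})
print(name, payload)  # API-retrieve-a-page {'page_id': 'abc123'}
```

Because the bridge only renames the call and forwards the arguments, responses from the original server need no translation on the way back.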
| Group | Actions |
|---|---|
| get | 4 |
| post | 2 |
| patch | 2 |
| retrieve | 5 |
| update | 2 |
| create | 2 |
Plus 4 passthrough tools — tools that don't group well are kept as-is with optimized descriptions.
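The token savings come from collapsing several per-endpoint schemas into one schema with an `action` enum. A minimal sketch (tool names and descriptions are illustrative, not notion-slim's actual schemas):

```python
import json

# Five separate tool schemas, one per endpoint (illustrative names)
separate = [
    {
        "name": f"API-retrieve-{x}",
        "description": f"Retrieve a {x}.",
        "inputSchema": {"type": "object", "properties": {"id": {"type": "string"}}},
    }
    for x in ("page", "database", "comment", "user", "block")
]

# One grouped schema: the `action` enum selects the endpoint
grouped = {
    "name": "retrieve",
    "description": "Retrieve a Notion object.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "action": {"enum": ["page", "database", "comment", "user", "block"]},
            "id": {"type": "string"},
        },
    },
}

print(len(json.dumps(separate)), "chars vs", len(json.dumps(grouped)), "chars")
```

The grouped form carries one name, one description, and one parameter block instead of five, which is where the schema-token reduction in the table above comes from.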
- ✅ Full functionality — All original @notionhq/notion-mcp-server features preserved
- ✅ All AI assistants — Works with Claude, ChatGPT, Gemini, Copilot, and any MCP client
- ✅ Drop-in replacement — Same capabilities, just use grouped action names
- ✅ Tested — Schema compatibility verified via automated tests
No. Every original tool is accessible. Tools are grouped semantically (e.g., several retrieve endpoints → a single retrieve tool), but all actions remain available via the action parameter.
AI models have limited context windows. MCP tool schemas consume tokens that could be used for conversation, code, or documents. Reducing tool schema size means more room for actual work.
MCPSlim is a community project. It wraps official MCP servers transparently — the original server does all the real work.
MIT
Powered by MCPSlim — MCP Token Optimizer
Reduce AI context usage. Keep full functionality.