serena-slim

Serena MCP server optimized for AI assistants — Reduce context window tokens by 50.3% while keeping full functionality. Compatible with Claude, ChatGPT, Gemini, Cursor, and all MCP clients.


What is serena-slim?

A token-optimized version of the Serena Model Context Protocol (MCP) server.

The Problem

MCP tool schemas consume significant context window tokens. When AI assistants like Claude or ChatGPT load MCP tools, each tool definition takes up valuable context space.

The original serena loads 29 tools, consuming approximately 23,878 tokens in Claude Code. That's space you could use for actual conversation.

The Solution

serena-slim intelligently groups 29 tools into 18 semantic operations, reducing token usage by 50.3% — with zero functionality loss.

Your AI assistant sees fewer, smarter tools. Every original capability remains available.

Performance

| Metric             | Original | Slim    | Reduction |
|--------------------|----------|---------|-----------|
| Tools              | 29       | 18      | 38%       |
| Schema tokens      | 7,348    | 1,614   | 78.0%     |
| Claude Code (est.) | ~23,878  | ~11,874 | ~50.3%    |

Benchmark Info

  • Original: serena@0.0.1
  • Schema tokens measured with tiktoken (cl100k_base)
  • Claude Code estimate includes ~570 tokens/tool overhead
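The Claude Code estimates can be reproduced from the benchmark numbers: schema tokens plus the per-tool overhead. A quick sanity check in Python (the ~570 tokens/tool figure is the README's own estimate of Claude Code's per-tool scaffolding, not a measured constant):

```python
# Reproduce the Claude Code token estimates from the benchmark numbers.
# PER_TOOL_OVERHEAD is the README's ~570 tokens/tool estimate, not a
# measured constant.

PER_TOOL_OVERHEAD = 570

def estimated_context_tokens(schema_tokens: int, tool_count: int) -> int:
    """Schema tokens plus a fixed per-tool overhead."""
    return schema_tokens + tool_count * PER_TOOL_OVERHEAD

original = estimated_context_tokens(7_348, 29)  # -> 23878
slim = estimated_context_tokens(1_614, 18)      # -> 11874
reduction = 1 - slim / original                 # -> ~0.503

print(original, slim, f"{reduction:.1%}")
```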

Quick Start

One-Command Setup (Recommended)

# Claude Desktop - auto-configure
npx serena-slim --setup claude

# Cursor - auto-configure
npx serena-slim --setup cursor

# Interactive mode (choose your client)
npx serena-slim --setup

Done! Restart your app to use serena.

CLI Tools (already have CLI?)

# Claude Code (creates .mcp.json in project root)
claude mcp add serena -s project -- npx -y serena-slim@latest

# Windows: use cmd /c wrapper
claude mcp add serena -s project -- cmd /c npx -y serena-slim@latest

# VS Code (Copilot, Cline, Roo Code)
code --add-mcp '{"name":"serena","command":"npx","args":["-y","serena-slim@latest"]}'

Manual Setup


Claude Desktop

Add to your claude_desktop_config.json:

| OS      | Path                                                           |
|---------|----------------------------------------------------------------|
| Windows | %APPDATA%\Claude\claude_desktop_config.json                    |
| macOS   | ~/Library/Application Support/Claude/claude_desktop_config.json |

{
  "mcpServers": {
    "serena": {
      "command": "npx",
      "args": ["-y", "serena-slim@latest"]
    }
  }
}

Cursor

Add to .cursor/mcp.json (global) or <project>/.cursor/mcp.json (project):

{
  "mcpServers": {
    "serena": {
      "command": "npx",
      "args": ["-y", "serena-slim@latest"]
    }
  }
}

How It Works

MCPSlim acts as a transparent bridge between AI models and the original MCP server:

┌──────────────────────────────────────────────────────────────┐
│  Without MCPSlim                                             │
│                                                              │
│  [AI Model] ──── reads 29 tool schemas ────→ [Original MCP]  │
│             (~23,878 tokens loaded into context)             │
├──────────────────────────────────────────────────────────────┤
│  With MCPSlim                                                │
│                                                              │
│  [AI Model] ───→ [MCPSlim Bridge] ───→ [Original MCP]        │
│       │                 │                    │               │
│   Sees 18 grouped   Translates to      Executes actual       │
│   tools only        original call      tool & returns        │
│   (~11,874 tokens)                                           │
└──────────────────────────────────────────────────────────────┘

How Translation Works

  1. AI reads slim schema — Only 18 grouped tools instead of 29
  2. AI calls grouped tool — e.g., interaction({ action: "click", ... })
  3. MCPSlim translates — Converts to original: browser_click({ ... })
  4. Original MCP executes — Real server processes the request
  5. Response returned — Result passes back unchanged

Zero functionality loss. 50.3% token savings.
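In code, the translation step amounts to a lookup from (grouped tool, action) to the original tool name, with the remaining arguments forwarded untouched. A minimal sketch, assuming a dictionary-based mapping (the entries below are illustrative, not serena-slim's actual table):

```python
# Minimal sketch of MCPSlim-style call translation: a grouped tool call
# is rewritten to the original MCP tool name; other arguments pass
# through unchanged. Mapping entries are illustrative, not the real
# serena-slim table.

GROUPED_TO_ORIGINAL = {
    ("interaction", "click"): "browser_click",
    ("interaction", "hover"): "browser_hover",
    ("memory", "read"): "read_memory",
    ("memory", "write"): "write_memory",
}

def translate(tool: str, args: dict) -> tuple[str, dict]:
    """Map a grouped call to the original tool; unknown tools pass through."""
    original = GROUPED_TO_ORIGINAL.get((tool, args.get("action")))
    if original is None:
        return tool, args  # passthrough tool: forward unchanged
    forwarded = {k: v for k, v in args.items() if k != "action"}
    return original, forwarded

# The AI calls the grouped tool...
print(translate("interaction", {"action": "click", "selector": "#submit"}))
# ...and the bridge forwards browser_click({"selector": "#submit"})
```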

Available Tool Groups

| Group   | Actions |
|---------|---------|
| read    | 2       |
| list    | 2       |
| find    | 3       |
| replace | 2       |
| get     | 2       |
| insert  | 2       |
| think   | 3       |
| memory  | 3       |

Plus 10 passthrough tools — tools that don't group well are kept as-is with optimized descriptions.
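The savings come from each group publishing one schema whose action field is an enum, instead of several near-duplicate tool definitions. A hypothetical slim schema for the memory group might look like this (field names are illustrative; the real serena-slim schema may differ):

```python
# Hypothetical slim schema for the "memory" group: one tool definition
# with an "action" enum replaces three separate tool schemas. Field
# names are illustrative, not serena-slim's actual output.
memory_tool = {
    "name": "memory",
    "description": "Read, write, or list project memories.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "action": {"type": "string", "enum": ["read", "write", "list"]},
            "name": {"type": "string", "description": "Memory name (read/write)"},
            "content": {"type": "string", "description": "Body to store (write)"},
        },
        "required": ["action"],
    },
}

# One schema, three actions -- matching the "memory: 3" row above.
print(len(memory_tool["inputSchema"]["properties"]["action"]["enum"]))
```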

Compatibility

  • Full functionality — All original serena features preserved
  • All AI assistants — Works with Claude, ChatGPT, Gemini, Copilot, and any MCP client
  • Drop-in replacement — Same capabilities, just use grouped action names
  • Tested — Schema compatibility verified via automated tests

FAQ

Does this reduce functionality?

No. Every original tool is accessible. Tools are grouped semantically (e.g., click, hover, and drag become a single interaction tool), but all actions remain available via the action parameter.

Why do AI assistants need token optimization?

AI models have limited context windows. MCP tool schemas consume tokens that could be used for conversation, code, or documents. Reducing tool schema size means more room for actual work.

Is this officially supported?

MCPSlim is a community project. It wraps official MCP servers transparently — the original server does all the real work.

License

MIT


Powered by MCPSlim — MCP Token Optimizer
Reduce AI context usage. Keep full functionality.
