
fix: support multiple concurrent MCP client connections#295

Open
dddabtc wants to merge 1 commit into hangwin:master from dddabtc:fix/mcp-multi-session-transport

Conversation


@dddabtc dddabtc commented Feb 7, 2026

Summary

  • The MCP Server (Protocol) instance was a singleton shared across all sessions. When a second client sent an initialize request to /mcp, Protocol.connect() threw "Already connected to a transport" because the singleton was already bound to the first client's transport.
  • Replaced the singleton getMcpServer() with a factory function createMcpServer(), so each new transport (both /mcp StreamableHTTP and /sse SSE) gets its own independent MCP Server instance.
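The failure mode and the fix can be pictured with a minimal sketch. This is simplified: the real server comes from the MCP SDK and registers tools, and the class and guard below only model how `Protocol.connect()` rejects a second transport; the before/after function bodies are illustrative.

```typescript
// Simplified model of Protocol.connect(): a server instance may be bound
// to exactly one transport, so a shared singleton fails on the second bind.
interface Transport { id: string }

class McpServer {
  private transport: Transport | null = null;
  connect(t: Transport): void {
    if (this.transport) throw new Error("Already connected to a transport");
    this.transport = t;
  }
}

// Before: module-level singleton shared by every session.
let cached: McpServer | null = null;
function getMcpServer(): McpServer {
  if (!cached) cached = new McpServer();
  return cached; // the second client receives an already-connected instance
}

// After: factory returning a fresh, unconnected instance per transport.
function createMcpServer(): McpServer {
  return new McpServer();
}
```

With the singleton, the second `connect()` throws exactly the "Already connected to a transport" error reproduced below; with the factory, each transport binds to its own instance.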

Steps to reproduce (before fix)

# First request succeeds
curl -X POST "http://127.0.0.1:12306/mcp" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":"1","method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0"}}}'

# Second request fails with 500: "Already connected to a transport"
curl -X POST "http://127.0.0.1:12306/mcp" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":"1","method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test2","version":"1.0"}}}'

Changes

┌─────────────────────────────────────────┬──────────────────────────────────────────────────────────────────┐
│ File                                    │ Change                                                           │
├─────────────────────────────────────────┼──────────────────────────────────────────────────────────────────┤
│ app/native-server/src/mcp/mcp-server.ts │ Replace singleton getMcpServer() with factory createMcpServer()  │
│ app/native-server/src/server/index.ts   │ Use createMcpServer() for each new transport in /mcp and /sse    │
└─────────────────────────────────────────┴──────────────────────────────────────────────────────────────────┘
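The per-connection wiring in server/index.ts can be sketched as follows. This is a hypothetical, self-contained model, not the project's real code: `sessions`, `handleInitialize`, and the `Transport` shape are illustrative, and the real implementation uses the MCP SDK's StreamableHTTP/SSE transports.

```typescript
// Hypothetical sketch: known sessions (via the mcp-session-id header) reuse
// their transport; every new session gets its own server from the factory.
interface Transport { sessionId: string }

class McpServer {
  private transport: Transport | null = null;
  connect(t: Transport): void {
    if (this.transport) throw new Error("Already connected to a transport");
    this.transport = t;
  }
}

function createMcpServer(): McpServer {
  return new McpServer();
}

const sessions = new Map<string, Transport>();

function handleInitialize(sessionId: string): Transport {
  const existing = sessions.get(sessionId);
  if (existing) return existing; // session reuse: keep the bound transport

  const transport: Transport = { sessionId };
  const server = createMcpServer(); // fresh instance per transport
  server.connect(transport);        // never hits "Already connected"
  sessions.set(sessionId, transport);
  return transport;
}
```

The same pattern applies to the /sse endpoint: because the server instance is created next to the transport it binds to, concurrent clients never contend for one shared `Protocol`.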

Test plan

  • First MCP client initialize → succeeds
  • Second MCP client initialize (without session ID) → succeeds (previously 500 error)
  • Existing session reuse via mcp-session-id header → still works
  • SSE endpoint /sse with multiple clients → works independently

…urrent connections

The MCP Server (Protocol) instance was a singleton shared across all sessions.
When a second client sent an initialize request, Protocol.connect() threw
"Already connected to a transport" because the singleton was already bound
to the first client's transport.

Replace the singleton getMcpServer() with a factory createMcpServer()
so each new transport gets its own MCP Server instance, allowing
multiple clients to connect simultaneously via both /mcp and /sse endpoints.

oshliaer commented Feb 22, 2026

@dddabtc, your solution works great! I've verified it with the following setup:

┌───────────────┬─────────────────────────────────────────┐
│ Component     │ Value                                   │
├───────────────┼─────────────────────────────────────────┤
│ OS            │ Linux 6.8.0-100-generic (Ubuntu) x86_64 │
│ Node.js       │ v22.14.0                                │
│ npm           │ 11.8.0                                  │
│ pnpm          │ 10.29.2                                 │
│ Google Chrome │ 145.0.7632.109                          │
└───────────────┴─────────────────────────────────────────┘

However, today @PyEL666 proposed a more elegant, backward-compatible solution in #301, which seems cleaner:

  • Kept the getMcpServer() name (backward-compatible API)
  • Changed only the implementation: removed caching, made it a factory function
  • Added detailed comments explaining why this is safe
  • Requires no changes in server/index.ts; call sites remain the same
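Based on the description above, the #301 variant can be sketched like this. The body of the class is illustrative; only the shape of the function follows the discussion: the public `getMcpServer()` name is kept, but the module-level cache is removed so it becomes a factory.

```typescript
// Sketch of the backward-compatible alternative from #301: same exported
// name, no caching. Class body is illustrative only.
class McpServer {
  // ...tool registration etc. omitted in this sketch
}

// Despite the "get" name, this is now a factory. With no module-level
// cache, no caller can ever receive an already-connected shared instance,
// and existing call sites in server/index.ts work unchanged.
function getMcpServer(): McpServer {
  return new McpServer();
}
```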

