The simplest way to connect to MCP servers. Two lines to connect. One line to call tools.
```js
import { connect } from "mcpwire";

const server = await connect("http://localhost:3000/mcp");
const result = await server.callTool("search", { query: "hello" });
```

No transport configuration. No protocol negotiation. No boilerplate. Just connect and go.
The official MCP SDK is powerful but verbose. Connecting to a server takes 30+ lines of setup code. mcpwire wraps it in a clean, ergonomic API so you can focus on building, not configuring.
| | mcpwire | Official SDK |
|---|---|---|
| Lines to connect | 2 | 15-30+ |
| Auto transport detection | Yes | Manual |
| OpenAI/Anthropic tool format | Built-in | DIY |
| Server discovery | Built-in | None |
| Learning curve | 3 methods | 20+ classes |
```sh
npm install mcpwire
```

```js
import { connect } from "mcpwire";

const server = await connect("http://localhost:3000/mcp");

// List tools
const tools = await server.tools();
console.log(tools);

// Call a tool
const result = await server.callTool("get_weather", { city: "NYC" });
console.log(result.content[0].text);

// Read a resource
const docs = await server.readResource("file:///README.md");
console.log(docs[0].text);

// Clean up
await server.close();
```

```js
import { connectStdio } from "mcpwire";

// Spawn a local filesystem server and talk to it over stdio
const server = await connectStdio("npx", [
  "-y",
  "@modelcontextprotocol/server-filesystem",
  "/home/user/documents",
]);

const files = await server.resources();
console.log(files);
```

```js
import { connect } from "mcpwire";
import OpenAI from "openai";

const server = await connect("http://localhost:3000/mcp");
const openai = new OpenAI();

const response = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "What's the weather in NYC?" }],
  tools: await server.toolsForOpenAI(),
});

// Execute tool calls
for (const call of response.choices[0].message.tool_calls || []) {
  const result = await server.callTool(
    call.function.name,
    JSON.parse(call.function.arguments)
  );
  console.log(result);
}
```

```js
import { connect } from "mcpwire";
import Anthropic from "@anthropic-ai/sdk";

const server = await connect("http://localhost:3000/mcp");
const anthropic = new Anthropic();

const response = await anthropic.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [{ role: "user", content: "What's the weather in NYC?" }],
  tools: await server.toolsForAnthropic(),
});
```

```js
import { discover } from "mcpwire";

// Find servers configured in Claude Desktop, Cursor, etc.
const servers = discover();
for (const info of servers) {
  console.log(`${info.name}: ${info.url || info.command}`);
}
```

- `connect(url)` - Connect to an MCP server over HTTP. Auto-detects Streamable HTTP vs. SSE transport.
- `connectStdio(command, args)` - Connect to a local MCP server process over stdio.
- `discover()` - Find MCP servers configured on your machine (Claude Desktop, Cursor, etc.).
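The quick-start example reads `result.content[0].text`, which assumes a single text part. MCP tool results actually carry an array of typed content parts (text, images, embedded resources). A small helper along these lines (hypothetical, not part of mcpwire) extracts all the text defensively:

```js
// Collect the text parts of an MCP tool-result content array.
// MCP content items carry a `type` discriminator; only "text" items have `.text`.
function textOf(result) {
  return (result.content ?? [])
    .filter((part) => part.type === "text")
    .map((part) => part.text)
    .join("\n");
}

// Works whether the result has one text part or several, and skips
// non-text parts (images, resources) instead of throwing.
console.log(textOf({ content: [{ type: "text", text: "72°F and sunny" }] }));
```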
| Method | Returns | Description |
|---|---|---|
| `tools()` | `Tool[]` | List available tools |
| `callTool(name, args)` | `ToolResult` | Call a tool |
| `resources()` | `Resource[]` | List available resources |
| `readResource(uri)` | `ResourceContent[]` | Read a resource |
| `prompts()` | `Prompt[]` | List available prompts |
| `getPrompt(name, args)` | `PromptMessage[]` | Get a prompt |
| `toolsForOpenAI()` | OpenAI tool format | Tools formatted for OpenAI API |
| `toolsForAnthropic()` | Anthropic tool format | Tools formatted for Anthropic API |
| `close()` | `void` | Disconnect |
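The two format helpers boil down to reshaping the same MCP tool definition (`name`, `description`, `inputSchema` JSON Schema) into each vendor's expected structure. A sketch of those mappings, for illustration only (the field names come from the OpenAI and Anthropic APIs, not from mcpwire's actual implementation):

```js
// Hypothetical re-implementations of the format helpers, for illustration.
// OpenAI's chat.completions API expects { type: "function", function: {...} }.
function toOpenAITools(mcpTools) {
  return mcpTools.map((t) => ({
    type: "function",
    function: {
      name: t.name,
      description: t.description ?? "",
      parameters: t.inputSchema ?? { type: "object", properties: {} },
    },
  }));
}

// Anthropic's messages API expects { name, description, input_schema }.
function toAnthropicTools(mcpTools) {
  return mcpTools.map((t) => ({
    name: t.name,
    description: t.description ?? "",
    input_schema: t.inputSchema ?? { type: "object", properties: {} },
  }));
}

const mcpTools = [
  {
    name: "get_weather",
    description: "Get the current weather for a city",
    inputSchema: {
      type: "object",
      properties: { city: { type: "string" } },
      required: ["city"],
    },
  },
];
console.log(toOpenAITools(mcpTools)[0].function.name); // "get_weather"
console.log(toAnthropicTools(mcpTools)[0].input_schema.required);
```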
mcpwire automatically handles transport selection:
- Streamable HTTP (default) - Recommended for remote servers
- SSE (fallback) - Legacy HTTP+SSE transport
- stdio - For local process-spawned servers
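The selection order above can be pictured as a simple decision chain. A toy sketch, purely illustrative and not mcpwire's internals: an explicit option wins, URLs prefer Streamable HTTP (with SSE as the legacy fallback on rejection), and local commands use stdio.

```js
// Toy sketch of the transport fallback order (illustrative only).
function chooseTransport(target, opts = {}) {
  if (opts.transport) return opts.transport; // explicit override wins
  if (typeof target === "string" && /^https?:/.test(target)) {
    return "streamable-http"; // falls back to "sse" if the server rejects it
  }
  return "stdio"; // local commands are spawned as child processes
}

console.log(chooseTransport("http://localhost:3000/mcp")); // streamable-http
console.log(chooseTransport("http://localhost:3000/mcp", { transport: "sse" })); // sse
```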
```js
// Force a specific transport
const server = await connect(url, { transport: "sse" });
```

Contributions welcome! Please read CONTRIBUTING.md first.
```sh
git clone https://github.com/ctonneslan/mcpwire.git
cd mcpwire
npm install
npm run build
```

MIT
Test MCP servers directly from your terminal:
```sh
# Install globally
npm install -g mcpwire

# List tools on a server
mcpwire http://localhost:3000/mcp tools

# Call a tool
mcpwire http://localhost:3000/mcp call get_weather '{"city":"NYC"}'

# List resources
mcpwire http://localhost:3000/mcp resources

# Find configured servers
mcpwire discover
```

- 0.4.0 - Vercel AI SDK integration (`toolsForVercelAI`), better error messages
- 0.3.0 - CLI tool (`npx mcpwire`)
- 0.2.0 - Google Gemini support (`toolsForGemini`)
- 0.1.0 - Initial release
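The `toolsForGemini` helper noted in 0.2.0 presumably performs a similar reshaping into Google's function-declaration format, where functions are grouped under a single `functionDeclarations` entry. A hedged sketch (field names from Google's Gemini API, not mcpwire's code):

```js
// Hypothetical sketch: MCP tools → Gemini tool declarations.
// Gemini groups all functions under one { functionDeclarations: [...] } object.
function toGeminiTools(mcpTools) {
  return [
    {
      functionDeclarations: mcpTools.map((t) => ({
        name: t.name,
        description: t.description ?? "",
        parameters: t.inputSchema ?? { type: "object", properties: {} },
      })),
    },
  ];
}

const [decl] = toGeminiTools([{ name: "get_weather" }]);
console.log(decl.functionDeclarations[0].name); // "get_weather"
```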