# The MCP-Native AI Gateway
Route requests to any AI provider through one universal endpoint.
Self-hosted. Open source. BYOK.
Styx is an open-source AI gateway that sits between your app and AI providers. Send requests to OpenAI, Anthropic, Google, or Mistral — all through one OpenAI-compatible endpoint. Bring your own API keys, self-host on your infra, and get full visibility into every request.
The first AI gateway with native MCP (Model Context Protocol) support.
```python
from openai import OpenAI

client = OpenAI(
    api_key="your-styx-api-key",
    base_url="http://localhost:8080/v1",  # ← the only change needed
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello from Styx"}],
)
```

- 🔌 MCP Native — Built-in MCP server. Connect Claude Code or Cursor in one command
- 🔀 Universal Routing — One OpenAI-compatible endpoint for all providers
- 🤖 styx:auto — Intelligent model routing: use `"model": "styx:auto"` and let Styx pick the right model based on request complexity
- 🔑 BYOK — Bring your own API keys, encrypted at rest (Fernet/AES)
- 📊 Dashboard — Track requests, costs, latency per project and model
- 🔄 Fallbacks — Auto-failover between providers with circuit breakers
- 💰 Billing — Built-in subscription and credit-based billing (Stripe)
- 🧠 Semantic Cache — Similar questions return cached responses instantly
- ⚡ Smart Routing — ML classifier routes to the optimal model for each request
- 🐳 Self-Hosted — Docker Compose, 5-minute setup
- 🔒 Secure — HMAC key hashing, Fernet encryption, rate limiting, TLS
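To illustrate the `styx:auto` feature above: a request names `styx:auto` as the model and is otherwise a standard OpenAI-style chat payload. A minimal sketch of the request body (the endpoint and key come from the Quick Start example; the prompt is illustrative):

```python
import json

# Standard OpenAI chat payload, but let Styx pick the model.
payload = {
    "model": "styx:auto",  # Styx classifies the request and routes it
    "messages": [{"role": "user", "content": "Summarize this paragraph for me."}],
}

# This is the JSON body you would POST to http://localhost:8080/v1/chat/completions
body = json.dumps(payload)
print(body)
```

Everything else — headers, response shape, streaming — stays OpenAI-compatible, so the SDK examples below work unchanged.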
- Docker Engine 24+ and Docker Compose v2
- At least one AI provider API key (OpenAI, Anthropic, Google, or Mistral)
- Supabase account (free tier) — only for production mode (not needed for dev mode)
```shell
git clone https://github.com/timmx7/styx.git
cd styx
./setup.sh                    # interactive wizard, generates .env
docker compose up -d --build  # first build: ~15-20 min; subsequent starts: ~60s
```

The wizard lets you choose between:

- Dev mode — No Supabase needed, no authentication, instant start
- Production mode — Full Supabase auth, account creation, API keys
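For dev mode, a minimal `.env` could look like this (illustrative — only `SKIP_AUTH` and `OPENAI_API_KEY` are named in the manual setup below; keep whatever else `.env.example` defines):

```ini
# Dev mode: no Supabase, no authentication
SKIP_AUTH=true

# At least one provider key
OPENAI_API_KEY=sk-...
```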
```shell
git clone https://github.com/timmx7/styx.git
cd styx
cp .env.example .env
```

Edit .env with:

- `SKIP_AUTH=true` for dev mode, or configure Supabase for production
- At least one AI provider key (e.g., `OPENAI_API_KEY`)

```shell
docker compose up -d --build  # first build: ~15-20 min; subsequent starts: ~60s
```

- Dashboard: http://localhost:3000
- API Gateway: http://localhost:8080 (direct) or https://localhost/v1 (via nginx/TLS)
- Docs API: https://localhost/api (via nginx)
```shell
claude mcp add styx -- npx styx-mcp
```

Add to .cursor/mcp.json:

```json
{
  "styx": {
    "command": "npx",
    "args": ["styx-mcp"],
    "env": { "STYX_API_KEY": "your-key" }
  }
}
```

```shell
curl -X POST http://localhost:8080/v1/chat/completions \
  -H "Authorization: Bearer YOUR_STYX_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello from Styx"}]
  }'
```

Dev mode: skip the `Authorization` header — requests are accepted without an API key.
```typescript
// Node.js / TypeScript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "your-styx-key",
  baseURL: "http://localhost:8080/v1",
});
```

```python
# Python
from openai import OpenAI

client = OpenAI(
    api_key="your-styx-key",
    base_url="http://localhost:8080/v1",
)
```

| Provider | Models | Status |
|---|---|---|
| OpenAI | gpt-4.1, gpt-4.1-mini, gpt-4o, gpt-4o-mini, o3, o4-mini | ✅ |
| Anthropic | claude-sonnet-4, claude-3-5-sonnet, claude-3-5-haiku, claude-3-haiku | ✅ |
| Google | gemini-2.5-pro, gemini-2.5-flash, gemini-2.5-flash-lite, gemini-2.0-flash | ✅ |
| Mistral | mistral-large, mistral-medium-3, mistral-small, codestral | ✅ |
| Azure OpenAI | Same as OpenAI models, via Azure deployments | ✅ |
Auto-routing: Any model matching the provider prefixes above (`gpt-*`, `claude-*`, `gemini-*`, `mistral-*`, `o3*`, `o4*`) is routed automatically — even models released after your last config update.
```
┌─────────┐      ┌──────────────┐      ┌───────────────┐
│ Client  │────▶│  Go Router   │────▶│  AI Provider  │
│  (app)  │◀────│ (port 8080)  │◀────│ (OpenAI...)   │
└─────────┘      └──────┬───────┘      └───────────────┘
                        │
                 ┌──────▼───────┐
                 │  Python API  │
                 │ (port 8000)  │
                 │ Auth/Billing │
                 └──────┬───────┘
                        │
           ┌────────────┼────────────┐
           │            │            │
      ┌─────▼──┐  ┌──────▼──┐  ┌─────▼──┐
      │Postgres│  │  Redis  │  │ Next.js│
      │  (DB)  │  │ (cache) │  │  (UI)  │
      └────────┘  └─────────┘  └────────┘
```
Request flow:
```
Client request
      │
      ▼
Go Router (:8080) ──▶ Cache check ──▶ HIT? Return instantly
      │                               MISS? Continue...
      ▼
Budget check ──▶ OVER LIMIT? Block + alert
      │          OK? Continue...
      ▼
Route to best provider (OpenAI / Anthropic / Google / Mistral)
      │
      ▼
Provider error? ──▶ Automatic fallback (circuit breaker)
      │
      ▼
Response to client + log to ClickHouse + update Redis counters
```
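The flow above can be sketched end to end as a toy model — in-memory stand-ins for the cache, budget counter, and providers. The names and error handling here are illustrative, not Styx's actual internals:

```python
def handle_request(prompt, cache, spent, budget, providers):
    """Toy request flow: cache check -> budget check -> route -> fallback."""
    # 1. Cache check: a hit returns instantly.
    if prompt in cache:
        return cache[prompt], "cache"
    # 2. Budget check: block when over limit.
    if spent >= budget:
        raise RuntimeError("budget exceeded: request blocked")
    # 3. Try providers in order; on failure, fall back to the next one
    #    (the circuit-breaker idea, minus the open/half-open state machine).
    for name, call in providers:
        try:
            answer = call(prompt)
        except ConnectionError:
            continue  # provider failed -> automatic fallback
        cache[prompt] = answer  # 4. Record the response for future hits
        return answer, name
    raise RuntimeError("all providers failed")

# Usage: the primary provider is down, so the request falls back.
def down(prompt): raise ConnectionError
def up(prompt): return f"echo: {prompt}"

cache = {}
answer, source = handle_request("hi", cache, spent=0.0, budget=10.0,
                                providers=[("openai", down), ("anthropic", up)])
print(answer, source)  # echo: hi anthropic
```

A second identical request would return from the cache without touching a provider, which is where the logged cache savings in the dashboard come from.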
```
styx/
├── router/             # Go reverse proxy — the fast path (<10ms overhead)
├── backend/            # Python FastAPI — auth, billing, business logic
├── dashboard/          # Next.js + Tailwind — web dashboard
├── classifier/         # ML request classifier (complexity scoring)
├── cache-service/      # Semantic cache (Qdrant + sentence-transformers)
├── sdk/                # Python & Node.js client SDKs
├── packages/           # MCP server, gateway CLI
├── infra/              # Docker, Helm, K8s, k6 load tests, Prometheus
└── docker-compose.yml
```
| Feature | Styx | OpenRouter | LiteLLM | Portkey |
|---|---|---|---|---|
| MCP Native | ✅ | ❌ | ❌ | ❌ |
| Self-Hosted | ✅ | ❌ | ✅ | ❌ |
| Open Source | ✅ Apache 2.0 | ❌ | ✅ | ❌ |
| Dashboard | ✅ Full | ❌ | Basic | ✅ |
| BYOK | ✅ Encrypted | ❌ | ✅ | ✅ |
| Semantic Cache | ✅ | ❌ | ❌ | ❌ |
| Smart Routing | ✅ ML | ❌ | ❌ | ✅ |
| Circuit Breaker | ✅ | ❌ | ✅ | ✅ |
| One-Command Install | ✅ | N/A | ❌ | N/A |
Install the Styx plugin directly in Claude Code:
```
/plugin install styx@claude-plugin-directory
```

Or browse: /plugin > Discover > styx

This gives you `/styx:setup`, `/styx:status`, and the `@styx-ops` agent for managing your gateway from Claude Code.
Styx includes a native MCP server. Connect it to Claude, Cursor, or any MCP-compatible client.
Claude Code:
```shell
claude mcp add styx -- npx styx-mcp
```

Cursor: Add to .cursor/mcp.json:

```json
{
  "styx": {
    "command": "npx",
    "args": ["styx-mcp"],
    "env": { "STYX_API_KEY": "your-key" }
  }
}
```

Connect to a hosted Styx instance without local installation:

Claude.ai / Claude Desktop:
Settings > Connectors > Add custom connector > URL: https://mcp.styxhq.com/mcp

Claude Code:

```shell
claude mcp add --transport http styx https://mcp.styxhq.com/mcp
```

See docs/DEPLOY_MCP_REMOTE.md for self-hosting the remote MCP server.
Prompt: "Check if my AI gateway is healthy and which providers are connected" → Styx checks all provider connections, returns status and latency per provider, flags any issues.
Prompt: "How much have I spent on AI APIs this month?" → Styx aggregates usage across providers, returns cost breakdown by model, shows cache savings.
Prompt: "Create an API key for the marketing team limited to 1000 requests/day" → Styx generates a rate-limited key, returns the key and its configuration.
We welcome contributions! Please see CONTRIBUTING.md.
Apache 2.0 — see LICENSE for details.