Pistom is a lightweight, zero-configuration server that connects any Model Context Protocol (MCP) compatible client (such as an LLM agent) to the Piston API. It provides a secure, sandboxed environment for executing Python code on demand.
Pistom acts as a simple, stateless proxy: it receives a code execution request from an MCP client, forwards it to the public Piston API, and formats Piston's response according to the MCP specification before returning it to the client.
```mermaid
sequenceDiagram
    participant Client as LLM Client (MCP)
    participant Server as Pistom Server
    participant Piston as Piston API

    Client->>Server: Start Server Request
    activate Server
    Note over Client, Piston: Pistom is now listening for requests
    Client->>Server: Execute Code Request<br>{"code": "print(1+1)"}
    Server->>Piston: POST /api/v2/piston/execute<br>(with code and python settings)
    activate Piston
    Piston-->>Server: Raw JSON Response<br>{"run": {"output": "2\n", ...}}
    deactivate Piston
    Server-->>Client: Formatted MCP Response<br>[{"type": "text", "text": "2\n"}]
    deactivate Server
```
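For concreteness, the Server→Piston hop in the diagram can be sketched in a few lines of Python. The endpoint and payload shape below follow Piston's public v2 API; the function name, the pinned interpreter version, and the use of the `requests` library are illustrative assumptions, not Pistom's actual implementation.

```python
import requests

PISTON_URL = "https://emkc.org/api/v2/piston/execute"  # public Piston instance

def execute_python(code: str) -> list[dict]:
    """Forward code to Piston and wrap its output in MCP-style text content."""
    resp = requests.post(
        PISTON_URL,
        json={
            "language": "python",
            "version": "3.10.0",  # illustrative; Pistom may pin a different version
            "files": [{"content": code}],
        },
        timeout=30,
    )
    resp.raise_for_status()
    output = resp.json()["run"]["output"]  # combined stdout/stderr from the run
    return [{"type": "text", "text": output}]

print(execute_python("print(1 + 1)"))  # -> [{'type': 'text', 'text': '2\n'}]
```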
- On-Demand Code Execution: Instantly grant any LLM the ability to run Python code.
- Zero-Configuration: No API keys, sign-ups, or environment variables needed. It just works.
- Secure by Design: Code is executed in a remote, sandboxed environment provided by Piston, eliminating risks to your local machine.
To use Pistom with a compatible client, add it to the client's MCP server configuration. The client will then be able to start and communicate with Pistom automatically.
Here is the configuration snippet:
"pistom": {
"url": "https://pistom.fastmcp.app/mcp"
}That's it! Your client is now empowered with Python code execution capabilities.
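Many clients keep their server entries under a top-level `mcpServers` map (Cursor's `mcp.json`, for example, uses this layout). Assuming such a client, the complete file might look like the sketch below; the surrounding keys are an assumption about your client's format, so check its documentation:

```json
{
  "mcpServers": {
    "pistom": {
      "url": "https://pistom.fastmcp.app/mcp"
    }
  }
}
```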
**Warning:** The public Piston API is rate-limited.

> "The Piston API is rate limited to 5 requests per second..."
>
> — Piston's documentation

Please be mindful of this limit in your applications.
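If your application may issue bursts of execution requests, a small client-side throttle keeps you under that ceiling. Below is a minimal sketch; the 5 requests/second figure comes from the quote above, and the `RateLimiter` helper is a hypothetical illustration rather than part of Pistom:

```python
import time

class RateLimiter:
    """Block callers so requests go out at most `max_per_second` times per second."""

    def __init__(self, max_per_second: float = 5.0):
        self.min_interval = 1.0 / max_per_second
        self.last_call = 0.0

    def wait(self) -> None:
        # Sleep just long enough to keep the gap between calls >= min_interval.
        elapsed = time.monotonic() - self.last_call
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self.last_call = time.monotonic()

limiter = RateLimiter()
for snippet in ["print(1)", "print(2)", "print(3)"]:
    limiter.wait()  # spaces calls at least ~200 ms apart
    # execute_python(snippet)  # e.g. the earlier sketch of the Piston call
```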
Thanks to:
- Piston for the free, public code execution engine.
- Model Context Protocol (MCP) for the protocol and SDK.
Contributions are welcome! For feature requests and bug reports, please open an issue on the GitHub repository; for code changes, feel free to submit a PR.
This project is licensed under the MIT License.