Move Neurostack MCP to Hyperterse#24
Conversation
Attaching the migration plan below for reference, so any corrections / changes can be inferred from it.
# Hyperterse MCP Migration Plan

## Architecture: Bridge Pattern

One MCP server (Hyperterse) is what Claude/Cursor connects to. The Python bridge is NOT an MCP server -- it is a private internal HTTP API that Hyperterse handlers call via `fetch`.

### Why This Works

## Project Structure

## Key Implementation Details

### 1. Python Bridge API (bridge/api.py)

A single FastAPI app with one route per tool. Each route calls the same internal function that `server.py` calls today.

```python
from fastapi import FastAPI

from neurostack.config import get_config
from neurostack.schema import DB_PATH, get_db

app = FastAPI()
cfg = get_config()

@app.post("/tools/vault-search")
async def vault_search(body: dict):
    from neurostack.search import hybrid_search, tiered_search
    # ... exact same logic as server.py vault_search ...
    return result

@app.post("/tools/vault-ask")
async def vault_ask(body: dict):
    from neurostack.ask import ask_vault
    # ... exact same logic as server.py vault_ask ...
    return result

# ... 19 more routes, each ~10-30 lines ...
```

Every route mirrors the corresponding `server.py` tool.

### 2. Hyperterse Tool Pattern

Each of the 21 tools follows the same pattern. `config.terse` declares the MCP tool with its full input schema:
```yaml
name: vault-search
description: "Search the vault with tiered retrieval depth. ..."
handler: "./handler.ts"
inputs:
  query:
    type: string
    description: "Natural language search query"
  top_k:
    type: int
    description: "Number of results to return"
    optional: true
    default: "5"
  mode:
    type: string
    description: 'Search mode - "hybrid", "semantic", or "keyword"'
    optional: true
    default: "hybrid"
  depth:
    type: string
    description: 'Retrieval depth - "triples", "summaries", "full", or "auto"'
    optional: true
    default: "auto"
  context:
    type: string
    description: "Optional project/domain context for boosting"
    optional: true
  workspace:
    type: string
    description: "Optional vault subdirectory prefix"
    optional: true
auth:
  plugin: allow_all
```
`handler.ts` -- a thin proxy to the bridge:

```typescript
const BRIDGE_URL = "http://localhost:8100";

export default async function handler(payload: {
  inputs: Record<string, unknown>;
  tool: string;
}) {
  const res = await fetch(`${BRIDGE_URL}/tools/vault-search`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload.inputs),
  });
  return await res.json();
}
```

Every handler has the same structure -- only the endpoint path changes. The bridge URL is consistent across all handlers (hardcoded to `http://localhost:8100`).
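Because every handler just POSTs its inputs to the bridge, the same call can be exercised from Python for debugging (a stdlib-only sketch, not part of the plan; assumes the bridge is running on `localhost:8100`):

```python
import json
import urllib.request

BRIDGE_URL = "http://localhost:8100"

def build_tool_request(tool: str, body: dict) -> urllib.request.Request:
    # Mirrors the fetch() call in handler.ts: POST JSON inputs to the bridge
    return urllib.request.Request(
        f"{BRIDGE_URL}/tools/{tool}",
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def call_tool(tool: str, body: dict) -> dict:
    # Send the request and decode the JSON response
    with urllib.request.urlopen(build_tool_request(tool, body)) as res:
        return json.load(res)
```

This is handy for verifying a route in isolation before wiring up the TypeScript side.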
### 3. Start Script (start.sh)

```bash
#!/usr/bin/env bash
# Start the Python bridge and the Hyperterse MCP server together
cd "$(dirname "$0")"

# Start the bridge in the background
python3 bridge/api.py &
BRIDGE_PID=$!

# Kill the bridge whenever this script exits; hyperterse runs in the
# foreground, so a plain kill placed after it would not run on signals
trap 'kill "$BRIDGE_PID" 2>/dev/null' EXIT

# Wait for the bridge to be ready
until curl -sf http://localhost:8100/health > /dev/null 2>&1; do sleep 0.2; done

# Start Hyperterse (foreground)
hyperterse start
```
### 4. Root Config (.hyperterse)

```yaml
name: neurostack
server:
  port: 8080
  log_level: 3
  root: app
tools:
  directory: tools
cache:
  enabled: true
  ttl: 300
```

### 5. Caching Strategy

Hyperterse has built-in caching for DB-backed tools, but since all tools here are script-backed (`handler`), they bypass the built-in cache. The Python bridge handles its own caching for ...

### 6. Environment Variables

Set in ...
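The bridge-side caching from section 5 could be as small as a TTL memo keyed on tool name plus request body. A stdlib-only sketch (hypothetical helper, not part of the plan; the 300 s TTL mirrors the `.hyperterse` cache `ttl`):

```python
import json
import time
from typing import Any, Callable

# Hypothetical in-process cache: (inserted_at, result) keyed by tool + body
_cache: dict[str, tuple[float, Any]] = {}
TTL_SECONDS = 300  # mirrors the .hyperterse cache ttl

def cached_tool(tool: str, body: dict, compute: Callable[[], Any]) -> Any:
    """Return a cached result for (tool, body) if still fresh, else recompute."""
    key = tool + ":" + json.dumps(body, sort_keys=True)
    hit = _cache.get(key)
    now = time.monotonic()
    if hit is not None and now - hit[0] < TTL_SECONDS:
        return hit[1]
    result = compute()
    _cache[key] = (now, result)
    return result
```

Keying on the sorted JSON body means two requests with the same inputs in a different order share one cache entry.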
### 7. Input Schema Mapping

Each Python MCP tool's parameters map 1:1 to Hyperterse `inputs` declarations in `config.terse`.

## Implementation Order

## What Changes in the Original Codebase

Nothing. The bridge imports directly from the existing `neurostack` modules.
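Nothing changes upstream because the bridge only imports and dispatches. As a side note, the per-route boilerplate in `bridge/api.py` could even collapse into a dispatch table (a sketch; the stand-in function below substitutes for the real `neurostack` imports, which are not reproduced here):

```python
# Hypothetical dispatch table: one generic entry point instead of 21
# near-identical route bodies. The stand-in below takes the place of
# the real neurostack functions, which the bridge imports unchanged.

def vault_search(query: str, top_k: int = 5, mode: str = "hybrid") -> dict:
    # Illustrative stand-in for the real vault-search logic
    return {"tool": "vault-search", "query": query, "top_k": top_k, "mode": mode}

TOOLS = {
    "vault-search": vault_search,
    # ... one entry per tool ...
}

def dispatch(tool: str, body: dict) -> dict:
    """Route a /tools/<name> request body to the matching Python function."""
    fn = TOOLS.get(tool)
    if fn is None:
        raise ValueError(f"unknown tool: {tool}")
    return fn(**body)
```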
Description
Based on this thread on Reddit, here's a migration of the current MCP setup to Hyperterse to check for reduced token usage.
The instructions to run are in mcp-hyperterse/README.md.
Feel free to close this PR if it is generating noise, and check out samrith-s/neurostack instead.
This is a brownfield approach that avoids rewriting all the scripts from Python to TypeScript, which supports incremental migration.
Checklist

- [ ] Tests pass (`pytest`)
- [ ] Lint passes (`ruff check`)