
Wire real tool calling: connect Python tools to the AI chat #3

@masonearl

Description


Current state

The Python tools in tools/ are standalone — they are not actually called during AI chat sessions. When the AI estimates a waterline job in the chat, it's making up numbers from training data, not running estimate_project_cost().

The OpenAI function-calling schemas for all tools are already defined in tools/registry.py.

What needs to happen

Wire api/chat.js to use OpenAI function calling with the schemas from tools/registry.py. When the model decides to call a tool:

  1. Parse the function call from the OpenAI response
  2. Call the appropriate Python tool via child_process.exec() or a small Python HTTP server sidecar
  3. Feed the result back into the conversation as a tool result message
  4. Let the model produce a final response using real computed numbers
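The four steps above can be sketched as a two-round loop in api/chat.js. This is a sketch, not the repo's implementation: the response shape follows OpenAI's Chat Completions API, the model name is just an example, and `runPythonTool` is a hypothetical helper standing in for whichever approach (A or B) is chosen below.

```javascript
// Extract tool calls from a Chat Completions response message, if any.
function extractToolCalls(message) {
  return Array.isArray(message.tool_calls) ? message.tool_calls : [];
}

// Build the "tool" role message that feeds a result back to the model.
function toToolMessage(toolCall, result) {
  return {
    role: "tool",
    tool_call_id: toolCall.id,
    content: JSON.stringify(result),
  };
}

async function chatWithTools(openai, messages, tools) {
  // Round 1: let the model decide whether to call a tool.
  let response = await openai.chat.completions.create({
    model: "gpt-4o", // example model name
    messages,
    tools, // the schemas defined in tools/registry.py
  });
  const message = response.choices[0].message;

  const toolCalls = extractToolCalls(message);
  if (toolCalls.length === 0) return message.content;

  // Run each requested tool and append its result to the conversation.
  messages.push(message);
  for (const call of toolCalls) {
    const args = JSON.parse(call.function.arguments);
    const result = await runPythonTool(call.function.name, args); // hypothetical helper
    messages.push(toToolMessage(call, result));
  }

  // Round 2: the model produces a final answer from real computed numbers.
  response = await openai.chat.completions.create({ model: "gpt-4o", messages });
  return response.choices[0].message.content;
}
```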

Approach options

Option A — exec Python directly from Node
In api/chat.js, call a Python script with JSON args via child_process and capture its stdout. Prefer execFileSync() over execSync(): arguments are passed directly rather than through a shell, so JSON payloads containing quotes or spaces are not mangled. Simple, no extra infrastructure.

Option B — Python FastAPI sidecar
Run a small Python server alongside Vercel dev that handles tool calls. More scalable but adds complexity.
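For Option B, api/chat.js would POST each tool call to the sidecar over HTTP. A sketch of the client side, where the sidecar URL, the TOOL_SIDECAR_URL env var, and the /tools/call route are all assumptions about how the sidecar might be set up:

```javascript
// Assumed sidecar address; not an existing config in the repo.
const SIDECAR_URL = process.env.TOOL_SIDECAR_URL || "http://127.0.0.1:8001";

// Build the fetch options for one tool invocation.
function buildToolRequest(toolName, args) {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ tool: toolName, args }),
  };
}

// POST the tool call to the sidecar and return its JSON result.
async function callTool(toolName, args) {
  const res = await fetch(`${SIDECAR_URL}/tools/call`, buildToolRequest(toolName, args));
  if (!res.ok) throw new Error(`tool ${toolName} failed: ${res.status}`);
  return res.json();
}
```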

Option A is probably the right starting point for Vercel serverless: functions there are short-lived and can't host a long-running sidecar process, so a child process per call is the simpler fit.

Files to look at

  • api/chat.js — where tool calling needs to be added
  • tools/registry.py — OpenAI-compatible tool schemas already defined
  • tools/__init__.py — all tools exported here

Why this matters

This is the difference between an AI that talks about construction and one that computes construction estimates. It's the most impactful open engineering task in the repo.


Labels: enhancement (New feature or request), help wanted (Extra attention is needed)
