A multi-agent customer support system for technical support workflows.
- Frontend: Next.js 15 + React 19 (`frontend/`) with Chat UI, Admin Dashboard, and BFF API routes.
- API runtime: FastAPI (`src/api/app.py`) with authn/authz, idempotent run creation, SSE events/replay, a queue backpressure guard, and governance/admin endpoints.
- Agent orchestration: Google ADK runner (`src/services/run_executor.py`) with supervisor routing and tool-call event mapping.
- LLM access: LiteLLM gateway/proxy (`src/services/llm_gateway.py`, `config/litellm-proxy.yaml`) with per-agent model routing.
- Workers: Run worker (`src/workers/run_worker.py`) and Webhook worker (`src/workers/webhook_worker.py`) over Redis Streams.
- Data + queueing: PostgreSQL + pgvector (tenant data, messages, runs, RAG chunks/embeddings) and Redis (jobs, events, replay, dead-letter).
- Observability: structlog, OpenTelemetry, Prometheus, and Sentry (plus optional Langfuse integration).
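The SSE events/replay endpoint speaks the standard Server-Sent Events wire format. As a minimal sketch of how a client consumes it (the `run.status` event name and JSON payloads below are invented for illustration, not the actual API contract):

```python
from typing import Iterator

def parse_sse(stream: Iterator[str]):
    """Parse Server-Sent Events from an iterable of text lines.

    Yields one dict per event with 'id', 'event', and 'data' keys.
    A blank line terminates each event, per the SSE wire format.
    """
    event = {"id": None, "event": "message", "data": []}
    for line in stream:
        line = line.rstrip("\n")
        if not line:  # blank line: dispatch the accumulated event
            if event["data"]:
                yield {**event, "data": "\n".join(event["data"])}
            event = {"id": None, "event": "message", "data": []}
        elif line.startswith("id:"):
            event["id"] = line[3:].strip()
        elif line.startswith("event:"):
            event["event"] = line[6:].strip()
        elif line.startswith("data:"):
            event["data"].append(line[5:].strip())

# Example stream as it might arrive over the wire (payloads are made up).
raw = [
    "id: 1\n", "event: run.status\n", 'data: {"state": "running"}\n', "\n",
    "id: 2\n", "event: run.status\n", 'data: {"state": "done"}\n', "\n",
]
events = list(parse_sse(raw))
```

On reconnect, an SSE client sends the last seen id in the `Last-Event-ID` request header, which is what lets the server replay missed events from that point.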
- `customer_support_agent` (supervisor/router): routes to specialized agents via ADK transfers.
- `database_agent` (SQL): ticket analytics/retrieval via a validated, allowlisted, read-only SQL tool.
- `knowledge_base_agent` (agentic RAG): hybrid retrieval over pgvector plus keyword search, with citations and a sufficiency loop.
- `web_search_agent` (web): external lookup fallback via a DuckDuckGo search tool.
- Guardrail path: input/output guardrails and budget checks enforced in the runtime service layer.
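The routing decision itself can be pictured as a function from a user query to an agent name. The sketch below is illustrative only: the real supervisor delegates via ADK agent transfers, and these keyword heuristics are invented, not the production routing logic.

```python
# Hypothetical keyword-based router; the real system uses ADK transfers.
AGENTS = ("database_agent", "knowledge_base_agent", "web_search_agent")

def route(query: str) -> str:
    """Pick a specialized agent for a user query (keyword heuristic)."""
    q = query.lower()
    if any(k in q for k in ("how many", "count", "report")):
        return "database_agent"          # analytics over ticket data
    if any(k in q for k in ("how do i", "error", "configure")):
        return "knowledge_base_agent"    # RAG over internal docs
    return "web_search_agent"            # external lookup fallback
```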
- Safe-by-default execution: SQL safety validation, RAG/tool controls, guardrails, and encrypted message storage.
- Correct async run semantics: idempotency, durable Redis Streams jobs/events, SSE replay, cancellation, retries, and dead-letter handling.
- Multi-tenant governance: API key/OIDC auth, RBAC, quotas/model-tier controls, audit logs, retention, export/deletion proofs, and RLS policy support.
- Operational visibility: structured logs, request correlation IDs, traces, metrics, and error monitoring.
- Product surface: Next.js chat experience plus admin workflows for escalations, quota management, webhooks, and RLS operations.
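The idempotency guarantee above means a retried "create run" request returns the existing run instead of enqueueing a duplicate job. A minimal sketch of that dedup logic, assuming a per-tenant key scope (in production the run store is PostgreSQL and the job queue is a Redis Stream; a dict stands in here, and the names are ours):

```python
import uuid

# In production the run store lives in PostgreSQL; a dict stands in here.
_runs_by_key: dict[tuple[str, str], str] = {}

def create_run(tenant_id: str, idempotency_key: str) -> str:
    """Create a run, or return the existing run id for a repeated key.

    The key is scoped per tenant so different tenants can reuse keys.
    """
    scope = (tenant_id, idempotency_key)
    if scope in _runs_by_key:
        return _runs_by_key[scope]  # duplicate request: same run, no new job
    run_id = str(uuid.uuid4())
    _runs_by_key[scope] = run_id
    # ...enqueue the job to the Redis Stream here...
    return run_id
```

Scoping the key by tenant is what keeps one tenant's retries from colliding with another tenant's identically named keys.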
- Prerequisites:
- Docker + Docker Compose
- Node.js 20+ and npm (for frontend dev server)
- Configure environment:
- Copy `.env.example` to `.env`
- Set at least `OPENAI_API_KEY`, and update the database/Redis defaults if changed
- Start the backend stack (API + DB + Redis + LiteLLM proxy + workers): `docker compose up --build`
- Verify backend readiness:
  - `http://localhost:8000/healthz`
  - `http://localhost:8000/readyz`
- Start the frontend (separate terminal):
  - `cd frontend`
  - `npm install`
  - `npm run dev`
- Open the app:
  - Chat UI: `http://localhost:3000/chat`
  - Admin dashboard: `http://localhost:3000/dashboard`
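In scripts it can be handy to wait for the readiness check above instead of eyeballing it. A small sketch (the `/readyz` URL is from the step above; the helper names are ours, and this is not part of the repo):

```python
import time
import urllib.request
from typing import Callable

def wait_until_ready(probe: Callable[[], bool],
                     attempts: int = 30, delay: float = 1.0) -> bool:
    """Poll a readiness probe until it succeeds or attempts run out."""
    for _ in range(attempts):
        if probe():
            return True
        time.sleep(delay)
    return False

def http_probe() -> bool:
    """True when GET /readyz returns 200 (assumes the backend stack is up)."""
    try:
        url = "http://localhost:8000/readyz"
        with urllib.request.urlopen(url, timeout=2) as resp:
            return resp.status == 200
    except OSError:
        return False

# Usage: wait_until_ready(http_probe) before running smoke tests.
```

Separating the generic poll loop from the HTTP probe keeps the loop testable without a running server.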
