Vaibtan/Agentic-Customer-Support-System

Agentic Customer Support

A multi-agent customer support system for technical support workflows.

Current Tech Stack

  • Frontend: Next.js 15 + React 19 (frontend/) with Chat UI, Admin Dashboard, and BFF API routes.
  • API runtime: FastAPI (src/api/app.py) with authn/authz, idempotent run creation, SSE events/replay, queue backpressure guard, and governance/admin endpoints.
  • Agent orchestration: Google ADK runner (src/services/run_executor.py) with supervisor routing and tool-call event mapping.
  • LLM access: LiteLLM gateway/proxy (src/services/llm_gateway.py, config/litellm-proxy.yaml) with per-agent model routing.
  • Workers: Run worker (src/workers/run_worker.py) and Webhook worker (src/workers/webhook_worker.py) over Redis Streams.
  • Data + queueing: PostgreSQL + pgvector (tenant data, messages, runs, RAG chunks/embeddings) and Redis (jobs/events/replay/dead-letter).
  • Observability: structlog, OpenTelemetry, Prometheus, Sentry (+ optional Langfuse integration).
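The SSE events/replay noted above are backed by Redis Streams in the actual service. As an illustration only, here is an in-memory sketch of Last-Event-ID replay semantics; the class and method names are hypothetical, not taken from the repo:

```python
from dataclasses import dataclass, field
from typing import Iterator

@dataclass
class RunEventLog:
    """In-memory stand-in for a Redis Stream holding one run's events."""
    events: list[tuple[int, str]] = field(default_factory=list)
    _next_id: int = 1

    def append(self, payload: str) -> int:
        """Record an event under a monotonically increasing id."""
        event_id = self._next_id
        self._next_id += 1
        self.events.append((event_id, payload))
        return event_id

    def replay(self, last_event_id: int = 0) -> Iterator[tuple[int, str]]:
        """Yield only events newer than the client's Last-Event-ID header."""
        for event_id, payload in self.events:
            if event_id > last_event_id:
                yield event_id, payload

log = RunEventLog()
log.append("run.started")
log.append("tool.called")
log.append("run.completed")

# A client that disconnected after event 1 replays only what it missed.
missed = list(log.replay(last_event_id=1))
```

In the real system the event ids would be Redis Stream entry ids and replay would be an XRANGE from the client's last seen id; the monotonic-counter version above just shows the contract an SSE reconnect relies on.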

Agentic Team

  • customer_support_agent (supervisor/router): routes to specialized agents via ADK transfers.
  • database_agent (SQL): ticket analytics/retrieval via validated, allowlisted read-only SQL tool.
  • knowledge_base_agent (Agentic RAG): hybrid retrieval over pgvector + keyword search with citations/sufficiency loop.
  • web_search_agent (web): external lookup fallback via DuckDuckGo search tool.
  • Guardrail path: input/output guardrails and budget checks enforced in runtime service layer.
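A read-only SQL guard of the kind the database_agent uses might look like the sketch below. The table allowlist and regex-based checks are illustrative assumptions; a production validator would parse the SQL rather than pattern-match it:

```python
import re

# Hypothetical allowlist -- the real table set lives in the repo's config.
ALLOWED_TABLES = {"tickets", "messages"}

FORBIDDEN = re.compile(
    r"\b(insert|update|delete|drop|alter|truncate|grant|create)\b",
    re.IGNORECASE,
)

def validate_sql(query: str) -> bool:
    """Accept only single-statement SELECTs that touch allowlisted tables."""
    stripped = query.strip().rstrip(";")
    if ";" in stripped:  # reject multi-statement payloads
        return False
    if not stripped.lower().startswith("select"):
        return False
    if FORBIDDEN.search(stripped):
        return False
    # Naive table extraction: every identifier after FROM/JOIN must be allowlisted.
    tables = re.findall(r"\b(?:from|join)\s+([a-z_][a-z0-9_]*)", stripped, re.IGNORECASE)
    return bool(tables) and all(t.lower() in ALLOWED_TABLES for t in tables)
```

For example, `validate_sql("SELECT id FROM tickets")` passes, while a DELETE, a multi-statement payload, or a SELECT against a non-allowlisted table is rejected.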

System Architecture

(Architecture diagram: Agentic Customer Support System)

Development Focus

  • Safe-by-default execution: SQL safety validation, RAG/tool controls, guardrails, and encrypted message storage.
  • Correct async run semantics: idempotency, durable Redis Streams jobs/events, SSE replay, cancellation, retries, and dead-letter handling.
  • Multi-tenant governance: API key/OIDC auth, RBAC, quotas/model-tier controls, audit logs, retention, export/deletion proofs, and RLS policy support.
  • Operational visibility: structured logs, request correlation IDs, traces, metrics, and error monitoring.
  • Product surface: Next.js chat experience plus admin workflows for escalations, quota management, webhooks, and RLS operations.
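The idempotent run creation called out above can be sketched as a lookup keyed by (tenant_id, idempotency_key). This in-memory stand-in for the PostgreSQL-backed store is an assumption for illustration; only the dedupe contract matters:

```python
import uuid

class RunStore:
    """Minimal sketch: dedupe run creation on (tenant_id, idempotency_key)."""

    def __init__(self) -> None:
        self._runs: dict[tuple[str, str], str] = {}

    def create_run(self, tenant_id: str, idempotency_key: str) -> tuple[str, bool]:
        """Return (run_id, created). A retry with the same key gets the same run."""
        key = (tenant_id, idempotency_key)
        if key in self._runs:
            return self._runs[key], False
        run_id = str(uuid.uuid4())
        self._runs[key] = run_id
        return run_id, True

store = RunStore()
run_a, created_a = store.create_run("tenant-1", "req-123")
run_b, created_b = store.create_run("tenant-1", "req-123")  # client retry
```

Because the retry returns the original run_id with created=False, a client that resubmits after a timeout never enqueues a duplicate job on the Redis Stream.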

How To Run Locally

  1. Prerequisites:
    • Docker + Docker Compose
    • Node.js 20+ and npm (for frontend dev server)
  2. Configure environment:
    • Copy .env.example to .env
  • Set at least OPENAI_API_KEY; adjust the database/Redis connection settings only if you changed the defaults
  3. Start backend stack (API + DB + Redis + LiteLLM proxy + workers):
    • docker compose up --build
  4. Verify backend readiness:
    • http://localhost:8000/healthz
    • http://localhost:8000/readyz
  5. Start frontend (separate terminal):
    • cd frontend
    • npm install
    • npm run dev
  6. Open app:
    • Chat UI: http://localhost:3000/chat
    • Admin dashboard: http://localhost:3000/dashboard

About

Multi-agent customer support platform using Google ADK, FastAPI, PostgreSQL + pgvector, and Redis Streams, featuring an agentic RAG pipeline, multi-tenant isolation, SSE streaming, and a full observability stack (Langfuse, Sentry, PostHog, OpenTelemetry).
