
Fluxa AI is a high-performance, asynchronous backend designed to orchestrate complex, hierarchical generative AI conversations. Built with FastAPI and PostgreSQL, it features a unique parent-child threading architecture that enables users to branch dialogues at any point while ensuring the AI retains full historical context.


FluxaAI-Backend

Backend service for building and managing AI-powered conversation threads. This project is written in Python 3.13+ using FastAPI with an asynchronous SQLAlchemy ORM for database interaction. It integrates with multiple large language model providers (Gemini, Claude, OpenAI) and includes utility support for Supabase storage.


🧱 Architecture & Components

  • app/ – Main application package
    • app.py – FastAPI application factory with async database engine and CORS middleware
    • config.py – Pydantic settings class for environment variables
    • deps.py – Dependency helpers (database session)
    • routes.py – Registers high‑level API routers
    • api/ – Endpoint implementations for threads, messages and model registry
    • database/ – SQLAlchemy models, Pydantic schemas and CRUD helpers
    • utils/ – Helpers for logging, Supabase, AI client registry and message formatting
  • main.py – Uvicorn entry point for development
  • testing.py – Quick script to dump LLM model metadata via litellm
  • migrations/ – Alembic migration scripts for PostgreSQL schema evolution

⭐ Key Features

  1. Threaded Conversations
    • Parent threads initiate a conversation.
    • Child threads branch off at any point to continue the conversation in a new context.
    • Automatic title and system prompt generation via LLMs.
  2. Message Management
    • Add user/assistant messages to threads.
    • Fetch chat history formatted for both API and LLM consumption.
  3. Multi‑Provider AI Support
    • Factory for Gemini, Claude and OpenAI clients using a shared AsyncOpenAI interface.
    • Model registry endpoint exposing capabilities of popular LLMs.
  4. Persistent Storage
    • Async PostgreSQL database through SQLAlchemy with full CRUD support.
    • Supabase utilities for file upload/download/public URLs.
  5. Robust Error Handling & Logging
    • Centralized logger with rotating file and console handlers.
    • Detailed exception handling in each endpoint.
  6. Automatic API Docs – Accessible at /docs (Swagger UI) and /redoc.
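The parent/child threading described above can be sketched in a few lines of plain Python. The class and field names here are illustrative, not the project's actual ORM models — they only show how a child thread can inherit the full history of its ancestors:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Thread:
    """Illustrative stand-in for a conversation thread (not the real model)."""
    title: str
    parent: Optional["Thread"] = None
    messages: list[dict] = field(default_factory=list)

def full_history(thread: Thread) -> list[dict]:
    """Walk up the parent chain so a child thread sees all ancestor messages."""
    history = full_history(thread.parent) if thread.parent else []
    return history + thread.messages

parent = Thread(title="root")
parent.messages.append({"role": "user", "content": "Hello"})
parent.messages.append({"role": "assistant", "content": "Hi!"})

child = Thread(title="branch", parent=parent)
child.messages.append({"role": "user", "content": "Tell me more"})

print([m["content"] for m in full_history(child)])
# → ['Hello', 'Hi!', 'Tell me more']
```

Because history is assembled by walking the parent chain, the LLM receives the complete context even when the user branched several levels deep.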

⚙️ Getting Started

Requirements

  • Python 3.13+ (see pyproject.toml)
  • PostgreSQL database (asyncpg driver)
  • Environment variables (see below)

  1. Database Setup (Docker)

Run the following command to start the PostgreSQL 15 database instance:

docker run -d \
  --name fluxa_db \
  --restart always \
  -e POSTGRES_USER=fluxa_user \
  -e POSTGRES_PASSWORD=password123 \
  -e POSTGRES_DB=fluxa_db \
  -p 5432:5432 \
  -v postgres_data:/var/lib/postgresql/data \
  postgres:15

  2. Installation

uv venv                           # create virtual environment
source .venv/bin/activate         # activate
uv sync                           # install dependencies

  3. Database Migrations

Use Alembic (scripts included under migrations/):

alembic upgrade head

(ensure DATABASE_URL is set)
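For example, using the credentials from the Docker command above (a sketch — adjust the connection string to your own environment):

```shell
# Point Alembic at the database started earlier, then apply all migrations.
export DATABASE_URL="postgresql+asyncpg://fluxa_user:password123@localhost:5432/fluxa_db"
alembic upgrade head
```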


  4. Running the Service

uv run main.py           # development

  5. Running Tests / Utilities

  • There are no formal tests yet; run testing.py for a quick metadata dump:

    python testing.py

  • Future tests can be added under a tests/ directory with pytest.

🧩 API Endpoints Summary

All routes are mounted under /api.

Model Registry

  • GET /api/models/capabilities – list supported LLMs and their features.

Threads

  • POST /api/parent-threads/create – start a new parent thread.

  • GET /api/parent-threads/get/{user_id} – list threads belonging to a user.

  • PATCH /api/parent-threads/update/title – change a parent thread title.

  • DELETE /api/parent-threads/delete/{thread_id} – remove a thread (and children).

  • POST /api/child-threads/create – create a continuation child thread.

  • PATCH /api/child-threads/update/title – rename a child thread.

  • DELETE /api/child-threads/delete/{thread_id} – delete a child thread.

Messages

  • POST /api/messages/add – append a message to a parent or child thread.
  • POST /api/messages/get/{thread_id} – fetch messages (use ?child_thread=true if needed).

User Management (Basic)

  • /add-users, /get-users, /delete-users/{user_id} – simple user CRUD for testing.

The interactive docs show request/response models and examples.


🔐 Security & Authentication

Currently the API has no authentication layer; endpoints accept user IDs directly. Integrate OAuth/JWT or API key logic as needed before production.
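A minimal sketch of the kind of API-key check you might wire in as a FastAPI dependency before going to production. The environment variable name and default value are assumptions, not part of the project:

```python
import hmac
import os

# Hypothetical key source; in production, load from a secrets manager.
EXPECTED_KEY = os.environ.get("FLUXA_API_KEY", "change-me")

def verify_api_key(provided: str) -> bool:
    """Compare keys in constant time to avoid timing side channels."""
    return hmac.compare_digest(provided.encode(), EXPECTED_KEY.encode())

print(verify_api_key(EXPECTED_KEY))
print(verify_api_key("definitely-wrong"))
```

Using `hmac.compare_digest` rather than `==` avoids leaking how many leading characters of the key an attacker has guessed correctly.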


📁 Environment Variables (Detail)

Variable           Description
GEMINI_API_KEY     Gemini LLM API key
ANTHROPIC_API_KEY  Claude API key
OPENAI_API_KEY     OpenAI API key
DEEPSEEK_API_KEY   DeepSeek provider key (appears unused)
DATABASE_URL       Async PostgreSQL connection string
SUPABASE_URL       Supabase project URL
SUPABASE_API_KEY   Service role key for storage
BUCKET_NAME        Supabase storage bucket
LOG_LEVEL          Logging level (DEBUG/INFO/WARNING/ERROR)
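The real config.py uses a Pydantic settings class; the stdlib-only sketch below shows the same idea of loading typed configuration from the environment (field names and defaults here are illustrative):

```python
import os
from dataclasses import dataclass

@dataclass(frozen=True)
class Settings:
    """Stdlib stand-in for the Pydantic settings class in app/config.py."""
    database_url: str
    log_level: str = "INFO"

def load_settings() -> Settings:
    """Read configuration from environment variables, with dev defaults."""
    return Settings(
        database_url=os.environ.get(
            "DATABASE_URL",
            "postgresql+asyncpg://fluxa_user:password123@localhost:5432/fluxa_db",
        ),
        log_level=os.environ.get("LOG_LEVEL", "INFO"),
    )

settings = load_settings()
print(settings.log_level)
```

The Pydantic version adds validation and .env-file loading on top of this, which is why the project uses it instead of a plain dataclass.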

🛠️ Development Notes

  • Code style: follows PEP 8 with type hints.
  • Async SQLAlchemy sessions are passed via FastAPI dependencies.
  • LLM prompts are defined in app/api/prompts.py; adjust them to change AI behaviour.
  • client_registry.py centralizes LLM provider configuration.
  • Logging uses rotating files under logs/.
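The rotating-file-plus-console setup mentioned above can be sketched as follows; the log path, size limit, and format string are illustrative, not the project's actual values:

```python
import logging
import os
import tempfile
from logging.handlers import RotatingFileHandler

def make_logger(log_dir: str, level: str = "INFO") -> logging.Logger:
    """Logger with a size-capped rotating file handler plus console output."""
    logger = logging.getLogger("fluxa")
    logger.setLevel(level)
    fmt = logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s")
    handlers = [
        RotatingFileHandler(os.path.join(log_dir, "fluxa.log"),
                            maxBytes=1_000_000, backupCount=3),
        logging.StreamHandler(),
    ]
    for handler in handlers:
        handler.setFormatter(fmt)
        logger.addHandler(handler)
    return logger

log_dir = tempfile.mkdtemp()  # the project writes under logs/ instead
log = make_logger(log_dir)
log.info("service started")
```

Rotation caps disk usage: once fluxa.log exceeds maxBytes, it is renamed and a fresh file is started, keeping at most backupCount old copies.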

Consider adding:

  • Authentication middleware
  • Rate limiting
  • Unit/integration tests
  • Refinements to the existing Dockerfile / docker-compose deployment setup

📄 License & Contributing

Add your license here (e.g., MIT) and contribution guidelines.
