Cognipeer Console

Open-source, multi-tenant AI gateway for LLM services, agent orchestration, vector stores, RAG pipelines, prompt management, and more.

License

Community edition is available under AGPL-3.0. Commercial licensing, hosted deployments, and support agreements are available separately through Cognipeer.

Features

  • Multi-Tenant Architecture — Complete data isolation with per-tenant databases
  • LLM Gateway — OpenAI-compatible chat completions and embeddings with multi-provider support (OpenAI, Anthropic, AWS Bedrock, Google Vertex AI, vLLM, Ollama, and more)
  • Vector Store Management — Multi-provider vector operations with built-in SQLite vector support
  • RAG Pipeline — Document ingestion, chunking, embedding, and retrieval
  • Agent Tracing — Batch and streaming ingest with thread correlation
  • Guardrails — PII detection, content moderation, prompt shields, and custom LLM-based evaluators
  • Prompt Management — Versioned templates with environment-based deployment (dev/staging/prod)
  • Semantic Memory — Scoped memory stores with vector-based recall
  • File Management — Multi-provider file storage with automatic Markdown conversion
  • Inference Monitoring — Real-time monitoring for self-hosted inference servers
  • Alerts — Rule-based alerting with email notifications
  • Quota & Rate Limiting — Multi-dimensional quota enforcement with plan-level defaults

Quick Start

Prerequisites

  • Node.js 20+
  • npm

Installation

git clone https://github.com/Cognipeer/cognipeer-console.git
cd cognipeer-console
npm install
cp .env.example .env.local
npm run dev

The gateway starts with SQLite by default — no external database required.

Visit http://localhost:3000 to access the dashboard.

Docker

docker compose up -d

Or build and run manually:

docker build -t cognipeer-console .
docker run -p 3000:3000 -v ./data:/app/data -e JWT_SECRET=your-secret-here cognipeer-console

Architecture

┌────────────────────────────────────────────────────┐
│                    Next.js App                     │
├──────────────┬──────────────┬──────────────────────┤
│  Dashboard   │  Client API  │    Dashboard API     │
│   (UI)       │ /client/v1/* │        /api/*        │
├──────────────┴──────────────┴──────────────────────┤
│                     Middleware                     │
│         (JWT Auth + Feature Gates + CORS)          │
├────────────────────────────────────────────────────┤
│                   Service Layer                    │
│   Models │ Vector │ RAG │ Memory │ Tracing │ ...   │
├────────────────────────────────────────────────────┤
│                 Provider Registry                  │
│      Contracts → Runtimes (LLM, Vector, File)      │
├────────────────────────────────────────────────────┤
│                Database Abstraction                │
│             SQLite (default) │ MongoDB             │
├────────────────────────────────────────────────────┤
│                Core Infrastructure                 │
│   Config │ Logger │ Cache │ Resilience │ Health    │
└────────────────────────────────────────────────────┘

Technology Stack

  • Framework: Next.js 15 (App Router) + TypeScript
  • UI: Mantine v8 + Tailwind CSS
  • Database: SQLite (default, zero-dependency) or MongoDB
  • Auth: JWT (jose) + API tokens
  • Cache: None / Memory / Redis
  • Logging: Winston with structured context

Configuration

All configuration is managed through environment variables. See .env.example for the full list.

Key variables:

Variable        Default      Description
DB_PROVIDER     sqlite       Database backend (sqlite or mongodb)
JWT_SECRET      (required)   Secret for JWT signing
MAIN_DB_NAME    cgate_main   Main database name
CACHE_PROVIDER  memory       Cache backend (none, memory, redis)
CORS_ENABLED    false        Enable CORS for client APIs

For the full configuration reference, see the Configuration Guide.
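
Putting those defaults together, a minimal .env.local for local development might look like the following (illustrative values only; generate your own JWT_SECRET, and see .env.example for the full set):

```shell
DB_PROVIDER=sqlite
MAIN_DB_NAME=cgate_main
CACHE_PROVIDER=memory
CORS_ENABLED=false
JWT_SECRET=replace-with-a-long-random-string
```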

Client API

The gateway exposes an OpenAI-compatible API at /api/client/v1/:

# Chat completion
curl -X POST http://localhost:3000/api/client/v1/chat/completions \
  -H "Authorization: Bearer <API_TOKEN>" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4", "messages": [{"role": "user", "content": "Hello"}]}'

# Embeddings
curl -X POST http://localhost:3000/api/client/v1/embeddings \
  -H "Authorization: Bearer <API_TOKEN>" \
  -H "Content-Type: application/json" \
  -d '{"model": "text-embedding-3-small", "input": "Hello world"}'
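
The same chat call can be issued from TypeScript without any SDK; a minimal sketch, assuming a gateway running locally and an API_TOKEN placeholder (the helper name buildChatRequest is illustrative, not part of this repository):

```typescript
// Mirrors the curl example above: assembles the URL, headers, and JSON
// body for a chat completion call to the OpenAI-compatible client API.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface HttpRequest {
  url: string;
  method: "POST";
  headers: Record<string, string>;
  body: string;
}

function buildChatRequest(
  baseUrl: string,
  apiToken: string,
  model: string,
  messages: ChatMessage[],
): HttpRequest {
  return {
    url: `${baseUrl}/api/client/v1/chat/completions`,
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ model, messages }),
  };
}

// Sending the request (requires a running gateway and a valid token):
// const req = buildChatRequest("http://localhost:3000", API_TOKEN, "gpt-4",
//   [{ role: "user", content: "Hello" }]);
// const res = await fetch(req.url, { method: req.method, headers: req.headers, body: req.body });
// const completion = await res.json();
```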

See openapi.yaml for the full API specification.

Official SDK

If you are building an application against Cognipeer Console, prefer the official TypeScript/JavaScript SDK.

Use this repository and its docs for platform setup, deployment, providers, tenancy, auth, and raw HTTP API semantics.

Documentation

Full documentation is available in the docs/ directory.

Build and preview the documentation site:

npm run docs:dev

Contributing

See CONTRIBUTING.md for development guidelines, code style, and PR checklist.

Security

Security reporting guidance is in SECURITY.md. Do not disclose vulnerabilities in public issues.

License

This repository is licensed under the GNU Affero General Public License v3.0. See LICENSE for the full text.

If you want to embed Cognipeer Console in a closed-source product, offer a proprietary hosted derivative without AGPL obligations, or purchase support/SLA coverage, see COMMERCIAL.md.

