AI-powered Python code analysis platform using Crew AI with multiple specialized agents.
The Agent Advisor Platform is a modern static analysis service for Python that uses multiple specialized AI agents (orchestrated with Crew AI) to provide comprehensive analysis of:
- Code Quality: Detection of code smells, anti-patterns, and PEP8 violations
- Performance: Identification of bottlenecks and optimization suggestions
- Security: Analysis of vulnerabilities (OWASP) and insecure practices
- Documentation: Verification of docstrings, type hints, and comments
- Complexity: Cyclomatic complexity metrics and maintainability index
- Synchronous Analysis: Immediate response for small snippets (<100 lines)
- Asynchronous Analysis: Background processing for large files via Celery
- Multiple Agents: 4 specialized agents working in parallel
- AST Analysis: Native Python Abstract Syntax Tree analysis
- Complexity Metrics: Radon for cyclomatic complexity calculation
- Maintainability Index: Maintainability score (0-100)
- Security Scanning: Detection of common vulnerabilities (SQL Injection, XSS, etc.)
- Multi-Layer Cache: In-memory LRU + Redis + PostgreSQL
- Horizontal Scaling: Celery workers can be scaled independently
- Connection Pooling: SQLAlchemy async pool
- Rate Limiting: Configurable per endpoint
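The AST-based analysis above can be sketched with Python's standard `ast` module. This is an illustrative approximation of a cyclomatic-complexity check, not the platform's actual analyzer (the project uses Radon for its metrics):

```python
import ast

def cyclomatic_complexity(source: str) -> int:
    """Rough cyclomatic complexity: 1 + number of branch points in the AST."""
    branch_nodes = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                    ast.BoolOp, ast.IfExp)
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, branch_nodes) for node in ast.walk(tree))

code = """
def classify(n):
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    return "positive"
"""
print(cyclomatic_complexity(code))  # if + elif -> 3
```

Radon applies the same idea with a more complete set of rules, and adds the 0-100 maintainability index on top.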
| Metric | Value | Description |
|---|---|---|
| Overall Coverage | 47.88% | 207+ comprehensive tests |
| Unit Tests | 113 | Fast, isolated tests |
| Integration Tests | 94 | API, database, and cache tests |
| Lines of Code | ~5,000 | Well-structured Python 3.11+ codebase |
| Services Coverage | 90%+ | AST, Complexity, Code Analyzers |
| Repositories Coverage | 85%+ | Database operations with JSONB |
| API Coverage | 85%+ | 39 endpoint tests |
- Docker 24+ and Docker Compose 2.20+
- Python 3.11+ (for local development)
- OpenAI API Key
```bash
# 1. Clone the repository
git clone https://github.com/willianpinho/agent-advisor-platform.git
cd agent-advisor-platform

# 2. Configure environment variables
cp .env.example .env
# Edit .env and add your OPENAI_API_KEY

# 3. Start all services
docker-compose up -d

# 4. Wait for health checks (1-2 minutes)
docker-compose ps

# 5. Test the API
curl http://localhost:8000/health
```

- API: http://localhost:8000
- API Docs: http://localhost:8000/docs
- Flower (Celery): http://localhost:5555
- RabbitMQ: http://localhost:15672 (guest/guest)
- PgAdmin: http://localhost:5050 (admin@admin.com/admin)
```bash
curl -X POST "http://localhost:8000/api/v1/analysis/sync" \
  -H "Content-Type: application/json" \
  -d '{
    "source_code": "def hello():\n print(\"Hello World\")",
    "filename": "example.py",
    "analysis_type": "full"
  }'
```

The platform follows Clean Architecture principles with a Hexagonal (ports-and-adapters) structure:
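The same request can be issued from Python. The endpoint and field names come from the curl example above; the helper function is a hypothetical convenience, not part of the platform's client API:

```python
import json

def build_sync_request(source_code: str, filename: str = "example.py",
                       analysis_type: str = "full") -> dict:
    """Build the JSON body expected by POST /api/v1/analysis/sync."""
    return {
        "source_code": source_code,
        "filename": filename,
        "analysis_type": analysis_type,
    }

payload = build_sync_request('def hello():\n    print("Hello World")')
body = json.dumps(payload)
# Send with any HTTP client, e.g. httpx:
#   httpx.post("http://localhost:8000/api/v1/analysis/sync", json=payload)
```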
```
┌─────────────────────────────────────────────┐
│            API Layer (FastAPI)              │
│    /analysis/sync │ /analysis/async │ etc.  │
└─────────────────┬───────────────────────────┘
                  │
┌─────────────────▼───────────────────────────┐
│               Service Layer                 │
│     CodeAnalyzer │ CrewAI Orchestrator      │
│    AST, Complexity, Security Analyzers      │
└─────────────────┬───────────────────────────┘
                  │
┌─────────────────▼───────────────────────────┐
│              Repository Layer               │
│    AnalysisRepository (CRUD + Queries)      │
└─────────────────┬───────────────────────────┘
                  │
┌─────────────────▼───────────────────────────┐
│            Infrastructure Layer             │
│      PostgreSQL │ Redis │ RabbitMQ          │
└─────────────────────────────────────────────┘
```
| Category | Technologies |
|---|---|
| Backend | FastAPI 0.109+, Python 3.11+, Uvicorn |
| Database | PostgreSQL 16, SQLAlchemy 2.0 (async) |
| Cache & Queue | Redis 7, Celery 5.3, RabbitMQ 3 |
| AI & Analysis | Crew AI 0.11+, OpenAI GPT-4, Radon |
| Validation | Pydantic V2, Type Hints, MyPy |
| Testing | pytest, pytest-asyncio, httpx |
| Infrastructure | Docker, Docker Compose, Poetry |
Complete documentation is available in VitePress format:
- Getting Started Guide - Installation and setup
- Quick Start - 5-minute guide
- Installation Guide - Detailed installation instructions
- System Architecture - Detailed architecture design
- Project Structure - Code organization
- Design Patterns - Patterns used
- Testing Guide - Test suite documentation
- Docker Guide - Docker deployment
- Celery Workers - Background task processing
- Redis Cache - Caching strategies
- Crew AI Integration - AI agent setup
- REST API Reference - Complete API documentation
- Basic Usage Examples - Usage examples
```bash
# Install dependencies
npm install

# Start documentation server
npm run docs:dev

# Build documentation
npm run docs:build

# Access at http://localhost:5173
```

```bash
# Install Poetry
curl -sSL https://install.python-poetry.org | python3 -

# Install dependencies
poetry install

# Start infrastructure
docker-compose up -d postgres redis rabbitmq

# Run migrations
poetry run alembic upgrade head

# Start API server
poetry run uvicorn src.main:app --reload

# Start Celery worker (in another terminal)
poetry run celery -A src.workers.celery_app worker --loglevel=info
```

```bash
# All tests
poetry run pytest

# With coverage
poetry run pytest --cov=src

# Unit tests only
poetry run pytest tests/unit/

# Integration tests only
poetry run pytest tests/integration/
```

```bash
# Linting
poetry run ruff check .

# Formatting
poetry run black .
poetry run isort .

# Type checking
poetry run mypy src/
```

```
agent-advisor-platform/
├── src/
│   ├── api/              # FastAPI routes and dependencies
│   ├── core/             # Configuration and settings
│   ├── domain/           # Models, schemas, enums
│   ├── services/         # Business logic and analyzers
│   ├── repositories/     # Data access layer
│   ├── infrastructure/   # Database, cache, queue
│   ├── workers/          # Celery workers
│   └── main.py           # Application entry point
├── tests/
│   ├── unit/             # Unit tests (113 tests)
│   ├── integration/      # Integration tests (94 tests)
│   └── conftest.py       # Shared fixtures
├── docs/                 # VitePress documentation
├── scripts/              # Utility scripts
├── docker-compose.yml    # Docker orchestration
├── pyproject.toml        # Poetry configuration
└── README.md             # This file
```
We welcome contributions! Please:
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
- Input validation with Pydantic V2
- SQL injection prevention via SQLAlchemy ORM
- Environment-based secrets management
- CORS configuration
- Rate limiting support
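A stdlib-only sketch of the kind of request validation described above (the project itself uses Pydantic V2). The 100-line threshold mirrors the sync-analysis limit mentioned earlier; the class and field names are illustrative:

```python
from dataclasses import dataclass

MAX_SYNC_LINES = 100  # sync endpoint is intended for snippets under 100 lines

@dataclass
class SyncAnalysisRequest:
    source_code: str
    filename: str = "snippet.py"
    analysis_type: str = "full"

    def __post_init__(self) -> None:
        # Reject empty submissions outright
        if not self.source_code.strip():
            raise ValueError("source_code must not be empty")
        # Large files belong on the async (Celery-backed) endpoint
        if len(self.source_code.splitlines()) >= MAX_SYNC_LINES:
            raise ValueError("too large for sync analysis; use the async endpoint")
```

Pydantic V2 expresses the same constraints declaratively (e.g. field validators and `max_length`-style bounds) and returns structured 422 errors through FastAPI.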
This project is licensed under the MIT License - see the LICENSE file for details.
Built with modern technologies:
- FastAPI - Modern web framework
- Crew AI - Multi-agent orchestration
- SQLAlchemy - Python SQL toolkit
- Celery - Distributed task queue
- PostgreSQL - Advanced database
- Redis - In-memory data store
- Documentation: View Docs
- Issues: GitHub Issues
- Discussions: GitHub Discussions
Developed with ❤️ for the Technical Innovation Challenge