A comprehensive semantic analysis platform that combines AST parsing, graph database storage, AI-powered semantic understanding, and natural language requirement processing to provide intelligent code analysis and automated technical solution generation.
- Clone and analyze Go repositories from GitHub URLs
- Advanced AST parsing with semantic understanding
- Extract functions, methods, interfaces, and relationships
- Store structured data in Neo4j graph database
- Real-time progress tracking and web interface
- Natural language requirement processing
- Semantic function classification and domain analysis
- Code-to-requirement mapping with confidence scoring
- Automated technical solution generation
- Context-aware code recommendations
- Automatic feature branch creation from requirements
- Implementation scaffolding generation
- TODO checklist and affected files analysis
- Automated commit workflows
- Interactive dashboard for analysis results
- Real-time progress monitoring
- Demo queries and visualization
- RESTful API endpoints
The project follows Domain-Driven Design (DDD) principles with clean separation of concerns:
```
go-analyzer/
├── 📋 spec/                     # Project specifications
│   └── REQUIREMENTS.md          # System requirements specification
├── ⚙️ config/                   # Configuration files
│   └── config.yaml              # Application configuration
├── 🏗️ domain/                   # Business domains
│   ├── analysis/                # Core code analysis logic
│   ├── mapping/                 # Requirement-to-code mapping
│   ├── solution/                # Technical solution generation
│   ├── requirements/            # Requirement processing orchestration
│   ├── git/                     # Git operations & branch management
│   └── demo/                    # Demo query generation
├── 💾 repo/                     # Data access layer (Repository pattern)
│   ├── interfaces.go            # Repository interfaces
│   ├── neo4j_repository.go      # Neo4j implementation
│   └── repository_factory.go    # Factory pattern for repositories
├── 🌐 api_service/              # HTTP interface layer
│   └── web_server.go            # Web server and API endpoints
├── 🔧 shared/                   # Shared models and configuration
│   ├── config.go                # Configuration management
│   └── models.go                # Common domain models
├── 📊 ast/                      # AST parsing utilities
├── 🤖 prompts/                  # AI prompt templates
├── 🎨 web/                      # Static web assets
└── 📝 scripts/                  # Utility scripts
```
- Domain-Driven Design: Clear domain boundaries and ubiquitous language
- Repository Pattern: Abstract data access with clean interfaces
- Dependency Injection: Loose coupling through interfaces
- Interface Segregation: Focused, single-responsibility interfaces
- Factory Pattern: Centralized object creation and configuration
- Go 1.23+ (the toolchain may be upgraded automatically to satisfy dependencies)
- Neo4j Database (local or remote instance)
- Git command-line tool
- Docker (optional, for Neo4j)
```bash
# Using Docker (recommended)
docker run -d \
  --name neo4j \
  -p 7474:7474 -p 7687:7687 \
  -e NEO4J_AUTH=neo4j/password \
  neo4j:latest

# Access the Neo4j Browser at http://localhost:7474
```

```bash
git clone <repository-url>
cd go-analyzer
go build -o go-analyzer
```

Edit `config/config.yaml` or use command-line options:
```yaml
database:
  neo4j:
    uri: "neo4j://localhost:7687"
    username: "neo4j"
    password: "password"

web:
  port: "8080"
  host: "localhost"

ai:
  analysis:
    enabled: true
    model_name: "llama2"
    endpoint: "http://localhost:11434/api/chat"
```

```bash
# CLI Mode: Analyze a repository
./go-analyzer -mode cli -repo https://github.com/owner/repository

# Web Mode: Start web interface
./go-analyzer -mode web
# Then visit: http://localhost:8080
```

```text
Usage of ./go-analyzer:
  -config string
        Path to configuration file (default "config/config.yaml")
  -mode string
        Mode to run: cli or web (default "cli")
  -port string
        Web server port (overrides config)
  -repo string
        GitHub repository URL to analyze (CLI mode)
```

```bash
# Analyze a repository with the default config
./go-analyzer -mode cli -repo https://github.com/gin-gonic/gin

# Use a custom config and port
./go-analyzer -mode web -config config/prod.yaml -port 9000

# Process natural language requirements
./go-analyzer -mode cli -repo https://github.com/user/project
# Then use the web interface for requirement processing
```

When running in web mode, the following API endpoints are currently available:
- `GET /api/progress` - Get real-time analysis progress and statistics
- `POST /api/analyze` - Start repository analysis (currently returns a migration message)
Note: Additional endpoints for requirements processing, demo queries, and semantic function queries are planned as part of the ongoing DDD architecture implementation.
```cypher
(:Function {
  id: "unique-identifier",
  name: "function-name",
  package: "go-package-path",
  file: "source-file-path",
  signature: "complete-signature",
  complexity: 5,
  is_exported: true,
  semantic_analysis: {
    conceptual_purpose: "business logic description",
    domain_keywords: ["auth", "user", "session"],
    intent_classification: "BUSINESS_LOGIC",
    confidence: 0.85
  }
})
```

Relationship types:

- `CALLS`: Function call relationships
- `IMPLEMENTS`: Interface implementations
- `DEPENDS_ON`: Package dependencies
```cypher
// Find high-complexity functions
MATCH (f:Function)
WHERE f.complexity > 10
RETURN f.name, f.complexity
ORDER BY f.complexity DESC;

// Find business logic functions
MATCH (f:Function)
WHERE f.semantic_analysis.intent_classification = "BUSINESS_LOGIC"
RETURN f.name, f.semantic_analysis.conceptual_purpose;

// Analyze function call patterns
MATCH (a:Function)-[:CALLS]->(b:Function)
RETURN a.package, COUNT(*) AS calls
ORDER BY calls DESC;
```
- `domain/`: Contains business logic organized by domain
  - Each domain has its own service interface and implementation
  - Dependencies are injected through interfaces
  - Domain models are kept separate from data models
- `repo/`: Repository pattern implementation
  - Abstracts data access behind interfaces
  - Neo4j-specific implementation details are hidden
  - Factory pattern for creating repositories
- `shared/`: Cross-cutting concerns
  - Common models used across domains
  - Configuration management
  - Shared utilities and types
- New Domain: Create a folder in `domain/` with a service interface
- New Repository: Add an interface to `repo/interfaces.go`
- New Model: Add to `shared/models.go` if cross-domain
- New API: Add an endpoint to `api_service/web_server.go`
```bash
# Build and test
go build -v
go test ./...

# Run with verbose output
./go-analyzer -mode web -config config/config.yaml
```

The system can process natural language requirements and map them to relevant code:
```text
# Example requirement
"Add user authentication with JWT tokens"

# The system will:
# 1. Analyze the requirement semantically
# 2. Map it to relevant code sections
# 3. Generate an implementation plan
# 4. Create a feature branch with scaffolding
```

Functions are automatically analyzed for:
- Conceptual Purpose: What the function does in business terms
- Domain Keywords: Relevant domain concepts
- Intent Classification: CRUD operations, business logic, utilities, etc.
- Integration Points: External systems and APIs
Pre-built queries for common analysis patterns:
- Function complexity analysis
- Package dependency mapping
- Semantic keyword distribution
- Cross-package integration points
- Neo4j connection failed

  ```bash
  # Check whether Neo4j is running
  docker ps | grep neo4j
  # Verify connection settings in config/config.yaml
  ```

- Build errors

  ```bash
  # Clean and rebuild
  go clean
  go mod tidy
  go build -v
  ```

- Analysis stuck

  ```bash
  # Check progress via the web interface
  curl http://localhost:8080/api/progress
  ```
- Follow DDD principles when adding features
- Use repository interfaces for data access
- Add appropriate tests for new domains
- Update documentation for new APIs
- Ensure configuration remains flexible
[Add your license information here]
- Check `spec/REQUIREMENTS.md` for detailed specifications
- Use the web interface for interactive analysis
- Monitor logs for debugging information
Generated documentation for Go Code Analyzer with Domain-Driven Design architecture