# Allama - AI Model Router
> Allama is an API router that unifies multiple AI model providers behind a single, Ollama-compatible platform.
## Project Overview
Allama is a Go-based API router that provides a unified interface to multiple AI model providers including OpenAI, Anthropic, and Ollama. It acts as a middleware layer that aggregates different AI services under a single RESTful API, with Ollama-compatible endpoints for easy integration with existing tools.
## Key Features
- **Unified API**: Single endpoint for multiple AI providers (OpenAI, Anthropic, Ollama)
- **Provider Management**: Dynamic configuration and management of AI service providers
- **Model Aggregation**: Lists and interacts with models from various providers as if hosted locally
- **Database Integration**: SQLite-based storage for provider and model information
- **Ollama Compatibility**: Supports Ollama-specific endpoints for seamless integration
- **Request Forwarding**: Intelligent routing based on model selection
- **Response Transformation**: Converts provider responses to unified formats
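The response-transformation feature above can be sketched as follows. The struct shapes are illustrative assumptions modeled on the public Ollama and OpenAI chat formats, not Allama's actual internal types:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// chatMessage is shared by both wire formats.
type chatMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

// ollamaChatResponse is a hypothetical Ollama-style chat reply.
type ollamaChatResponse struct {
	Model   string      `json:"model"`
	Message chatMessage `json:"message"`
	Done    bool        `json:"done"`
}

type openAIChoice struct {
	Index        int         `json:"index"`
	Message      chatMessage `json:"message"`
	FinishReason string      `json:"finish_reason"`
}

// openAIChatCompletion mirrors the OpenAI chat-completion shape.
type openAIChatCompletion struct {
	Object  string         `json:"object"`
	Model   string         `json:"model"`
	Choices []openAIChoice `json:"choices"`
}

// toOpenAI converts an Ollama chat response into the unified
// OpenAI-style shape that clients of /api/v1/chat/completions expect.
func toOpenAI(in ollamaChatResponse) openAIChatCompletion {
	finish := ""
	if in.Done {
		finish = "stop"
	}
	return openAIChatCompletion{
		Object: "chat.completion",
		Model:  in.Model,
		Choices: []openAIChoice{{Index: 0, Message: in.Message, FinishReason: finish}},
	}
}

func main() {
	raw := `{"model":"llama3","message":{"role":"assistant","content":"hi"},"done":true}`
	var in ollamaChatResponse
	if err := json.Unmarshal([]byte(raw), &in); err != nil {
		panic(err)
	}
	out, _ := json.Marshal(toOpenAI(in))
	fmt.Println(string(out))
}
```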
## Architecture
### Core Components
1. **Router** (`internal/router/router.go`): Main HTTP request handler and routing logic
2. **Providers** (`internal/provider/`): Individual provider implementations (OpenAI, Anthropic, Ollama)
3. **Storage** (`internal/storage/`): SQLite database management for providers and models
4. **Config** (`internal/config/`): Environment-based configuration management
5. **Models** (`internal/models/`): Data structures for providers and models
### Technology Stack
- **Language**: Go 1.24.3
- **Web Framework**: Gin (HTTP router)
- **Database**: SQLite with go-sqlite3 driver
- **Configuration**: Environment variables with godotenv
- **Containerization**: Docker with multi-stage builds
## API Endpoints
### OpenAI-Compatible Endpoints
- `GET /api/v1/models` - List all available models
- `POST /api/v1/chat/completions` - Chat completions
### Ollama-Compatible Endpoints
- `GET /api/tags` - List model tags (Ollama format)
- `POST /api/show` - Show model information
- `POST /api/generate` - Generate text
- `POST /api/chat` - Chat interface
- `GET /api/version` - API version
### Health Check
- `GET /health` - Service health status
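A minimal Go client for the model-listing endpoint might look like the sketch below. The mock server stands in for a running Allama instance on `localhost:8080`, and its JSON body is a guess at the OpenAI-style list shape, not captured output:

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/http/httptest"
)

// newMockAllama stands in for a running Allama instance so the example
// is self-contained; the response body is an assumed shape.
func newMockAllama() *httptest.Server {
	return httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if r.URL.Path == "/api/v1/models" {
			fmt.Fprint(w, `{"object":"list","data":[{"id":"llama3","object":"model"}]}`)
			return
		}
		http.NotFound(w, r)
	}))
}

// listModels issues the same GET a real client would send to
// http://localhost:8080/api/v1/models.
func listModels(baseURL string) (int, string, error) {
	resp, err := http.Get(baseURL + "/api/v1/models")
	if err != nil {
		return 0, "", err
	}
	defer resp.Body.Close()
	body, err := io.ReadAll(resp.Body)
	return resp.StatusCode, string(body), err
}

func main() {
	srv := newMockAllama()
	defer srv.Close()
	code, body, err := listModels(srv.URL)
	if err != nil {
		panic(err)
	}
	fmt.Println(code, body)
}
```

Against a live instance, replace `srv.URL` with the router's base URL.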
## Configuration
Environment variables (see `src/example.env`):
```env
# Server Configuration
PORT=8080
DATABASE_PATH=./allama.db
# OpenAI Provider
OPENAI_HOST=https://api.openai.com
IS_OPENAI_ACTIVE=false
OPENAI_API_KEY=
# Anthropic Provider
ANTHROPIC_HOST=https://api.anthropic.com
IS_ANTHROPIC_ACTIVE=false
ANTHROPIC_API_KEY=
# Ollama Provider
OLLAMA_HOST=http://localhost:11434
IS_OLLAMA_ACTIVE=true
```
## Development Setup
### Prerequisites
- Go 1.18+ (for building from source)
- Docker and Docker Compose (for containerized setup)
- Git
### Local Development
```bash
# Clone repository
git clone https://github.com/offbeat-studio/allama.git
cd allama
# Setup environment
cp src/example.env src/.env
# Edit src/.env with your API keys
# Run locally
cd src
go mod tidy
go run main.go
```
### Docker Setup
```bash
# Using Docker Compose
docker-compose -f Docker/docker-compose.yml up --build
# Using Makefile
make build
make up
```
## Database Schema
The application uses SQLite with tables for:
- **Providers**: Store AI service provider configurations
- **Models**: Store available models for each provider
The database is automatically initialized on startup with the configured providers.
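A hypothetical DDL for these two tables is sketched below; the actual column names live in `internal/storage/` and may differ:

```go
package main

import (
	"fmt"
	"strings"
)

// schema is an illustrative guess at the SQLite layout described
// above, not the DDL Allama actually executes.
const schema = `
CREATE TABLE IF NOT EXISTS providers (
    id        INTEGER PRIMARY KEY AUTOINCREMENT,
    name      TEXT NOT NULL UNIQUE,
    host      TEXT NOT NULL,
    api_key   TEXT,
    is_active BOOLEAN NOT NULL DEFAULT 0
);

CREATE TABLE IF NOT EXISTS models (
    id          INTEGER PRIMARY KEY AUTOINCREMENT,
    provider_id INTEGER NOT NULL REFERENCES providers(id),
    name        TEXT NOT NULL
);
`

// hasTable reports whether the DDL declares the named table.
func hasTable(ddl, table string) bool {
	return strings.Contains(ddl, "CREATE TABLE IF NOT EXISTS "+table)
}

func main() {
	fmt.Println(hasTable(schema, "providers"), hasTable(schema, "models"))
}
```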
## Request Flow
1. Client sends request to Allama API endpoint
2. Router determines target provider based on model ID
3. Request is either:
- Forwarded directly to Ollama (if Ollama provider)
- Transformed and sent to OpenAI/Anthropic APIs
4. Response is transformed to match expected format
5. Unified response returned to client
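Step 2 above can be sketched with simple model-name prefix rules; this is an illustrative stand-in, since the real router looks the model up in its SQLite tables instead:

```go
package main

import (
	"fmt"
	"strings"
)

// selectProvider maps a requested model ID to a provider name using
// prefix heuristics — a sketch of the routing decision, not Allama's
// actual lookup logic.
func selectProvider(model string) string {
	switch {
	case strings.HasPrefix(model, "gpt-"):
		return "openai"
	case strings.HasPrefix(model, "claude-"):
		return "anthropic"
	default:
		return "ollama" // forwarded directly, no transformation needed
	}
}

func main() {
	for _, m := range []string{"gpt-4o", "claude-3-haiku", "llama3"} {
		fmt.Printf("%s -> %s\n", m, selectProvider(m))
	}
}
```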
## Provider Integration
### Adding New Providers
1. Create provider implementation in `internal/provider/`
2. Add provider configuration to `GetProviderConfigs()`
3. Implement required interfaces (GetModels, Chat methods)
4. Add environment variables for configuration
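The steps above can be illustrated with a toy provider. The `Provider` interface here is a hypothetical version of the one step 3 refers to; the real method set in `internal/provider/` may differ:

```go
package main

import "fmt"

// Provider is a hypothetical sketch of the interface a new provider
// must satisfy before being wired into GetProviderConfigs().
type Provider interface {
	Name() string
	GetModels() ([]string, error)
	Chat(model, prompt string) (string, error)
}

// echoProvider is a toy implementation that simply echoes its input.
type echoProvider struct{}

func (echoProvider) Name() string                 { return "echo" }
func (echoProvider) GetModels() ([]string, error) { return []string{"echo-1"}, nil }
func (echoProvider) Chat(model, prompt string) (string, error) {
	return "echo(" + model + "): " + prompt, nil
}

func main() {
	var p Provider = echoProvider{}
	models, _ := p.GetModels()
	reply, _ := p.Chat(models[0], "hello")
	fmt.Println(p.Name(), reply)
}
```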
### Supported Providers
- **OpenAI**: GPT models via OpenAI API
- **Anthropic**: Claude models via Anthropic API
- **Ollama**: Local models via Ollama server
## Logging and Monitoring
- Request/response logging via custom middleware
- Logs stored in `src/logs/` directory with daily rotation
- Health check endpoint for monitoring
- Panic recovery with error reporting
## Security Considerations
- API keys stored as environment variables
- No inbound API key validation in the current implementation; do not expose the router to untrusted networks without an authenticating proxy
- Database reset on each startup (development mode)
- Request forwarding preserves headers
## Deployment
### Docker Production
```bash
# Build optimized image
docker build -f Docker/Dockerfile -t allama:latest .
# Run with environment file
docker run -p 8080:8080 --env-file .env allama:latest
```
### Port Configuration
- Default: 8080 (configurable via PORT environment variable)
- Docker Compose: host port 11435 (as mapped in Docker/docker-compose.yml)
## Testing
- Unit tests available for transformer functionality
- Router tests for endpoint validation
- Manual testing via curl or HTTP clients
## Contributing
1. Fork the repository
2. Create feature branch
3. Follow Go best practices and conventions
4. Add tests for new functionality
5. Submit pull request with detailed description
## License
Released under the MIT License.
## Contact
- Repository: https://github.com/offbeat-studio/allama
- Issues: GitHub Issues for bug reports and feature requests
- Maintainer: Offbeat Studio
---
*This documentation is generated for LLM consumption and provides comprehensive information about the Allama AI router project structure, functionality, and usage.*