Production-ready FastAPI microservice for controlling Raspberry Pi Camera (libcamera/Picamera2) with H.264 streaming to MediaMTX via RTSP.
Version 2.8.1 - System logs API with real-time streaming for remote debugging!
🔥 New in v2.8.1: System Logs API - Remote log access via `/v1/system/logs` with filtering (lines, level, search) + real-time streaming via Server-Sent Events (SSE) at `/v1/system/logs/stream`. Perfect for remote debugging and monitoring!
🔥 v2.8.0: Advanced Controls Status Tracking - Camera status endpoint now exposes all advanced control settings in real time (EV compensation, noise reduction mode, AE constraint/exposure modes). Perfect for debugging and monitoring camera configuration!
🔥 v2.7.0: Wide-Angle Camera Detection - Automatic detection of Camera Module 3 Wide (120° FOV). Intelligent sensor mode selection preserves the full wide-angle field of view at all resolutions. New API fields expose camera type, sensor modes, and recommended resolutions!
🔥 v2.6.1: Intelligent Bitrate Auto-Selection - Automatic bitrate calculation prevents corrupted macroblock errors. Bitrate now adapts to resolution×framerate (12 Mbps @ 720p/60fps, 25 Mbps @ 1080p/60fps). Visible in the status endpoint!
⚡ v2.6 features: Intelligent Sensor Mode Auto-Selection - Camera Module 3 (IMX708) automatically selects the optimal native sensor mode. 720p achieves 60fps (was 14fps) - 4.2x faster!
📊 v2.5 features: System Monitoring - Real-time Raspberry Pi health metrics including CPU temperature, WiFi signal quality, memory/disk usage, and throttling detection.
ℹ️ v2.4 features: FOV mode selection - Choose between constant field of view (scale) or digital zoom effect (crop) for all resolutions.
ℹ️ v2.3 features: Dynamic framerate control with intelligent clamping. Fixed a critical race condition in concurrent camera reconfiguration.
ℹ️ v2.2 features: Camera capabilities discovery endpoint to query supported resolutions, exposure/gain limits, and available features.
ℹ️ v2.1 features: Exposure value (EV) compensation, noise reduction modes, advanced AE controls, AWB mode presets, autofocus trigger, dynamic resolution change.
ℹ️ v2.0 features: Autofocus control, snapshot capture, manual AWB with NoIR presets, image processing, HDR support, ROI/digital zoom, day/night detection. See docs/upgrade-v2.md.
# Complete installation (see docs/installation.md for details)
./scripts/install-service.sh
# Test everything works
./scripts/test-api-v2.sh
# Access RTSP stream
# VLC: rtsp://<PI_IP>:8554/cam

📖 Complete Documentation: See docs/installation.md for step-by-step installation.
This service runs on the Raspberry Pi, controls the camera (e.g., Raspberry Pi Camera Module 3 Wide NoIR), and exposes an HTTP REST API to:
- ✅ Start/stop RTSP streaming to MediaMTX
- ✅ Enable/disable auto-exposure
- ✅ Set manual exposure (time + gain)
- ✅ Enable/disable auto white balance (AWB)
- ✅ Get current camera status (lux, exposure, gain, color temperature, etc.)
- ✅ API key authentication (optional)
- ✅ Auto-start at boot (systemd)
- ✅ Comprehensive test suite
- Autofocus modes: manual, auto, continuous
- Manual lens position: 0.0 (infinity) to 15.0 (macro ~10cm)
- Autofocus range: normal, macro, full
- Endpoints:
POST /v1/camera/{autofocus_mode,lens_position,autofocus_range}
- Snapshot without stopping stream: Capture JPEG images on-demand
- Configurable resolution: Up to 4608×2592 (12MP)
- Auto-focus trigger: Optional autofocus before capture
- Base64 encoded output: Easy integration with web apps
- Endpoint:
POST /v1/camera/snapshot
- Manual AWB gains: Precise red/blue channel control
- NoIR-optimized presets:
  - `daylight_noir` - Outdoor/daylight with NoIR camera
  - `ir_850nm` - IR illumination at 850nm wavelength
  - `ir_940nm` - IR illumination at 940nm wavelength
  - `indoor_noir` - Indoor lighting with NoIR camera
- Endpoints:
POST /v1/camera/{manual_awb,awb_preset}
- Brightness: -1.0 to 1.0 adjustment
- Contrast: 0.0 to 2.0 (1.0 = no change)
- Saturation: 0.0 to 2.0 (1.0 = no change)
- Sharpness: 0.0 to 16.0 (higher = sharper)
- Endpoint:
POST /v1/camera/image_processing
- Hardware HDR: From Camera Module 3 sensor
- Modes: off, auto, sensor, single-exp
- Endpoint:
POST /v1/camera/hdr
- Region of Interest: Crop and stream specific areas
- Normalized coordinates: 0.0-1.0 for resolution-independent control
- Hardware-accelerated: No performance impact
- Endpoint:
POST /v1/camera/roi
- Exposure limits: Constrain auto-exposure min/max values
- Prevent flicker: Useful for artificial lighting
- Maintain framerate: Limit max exposure time
- Endpoint:
POST /v1/camera/exposure_limits
- Lens correction: Distortion correction for wide-angle cameras (120° FOV)
- Image transform: Horizontal/vertical flip, rotation
- Endpoints:
POST /v1/camera/{lens_correction,transform}
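A minimal sketch of calling both endpoints from Python. The request field names (`enabled`, `hflip`, `vflip`) are assumptions for illustration only; check docs/api-reference.md for the actual schema.

```python
import requests

BASE_URL = "http://raspberrypi:8000"

# Hypothetical payloads - field names are illustrative, see docs/api-reference.md
requests.post(f"{BASE_URL}/v1/camera/lens_correction", json={"enabled": True})
requests.post(f"{BASE_URL}/v1/camera/transform", json={"hflip": True, "vflip": False})
```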
- Automatic scene detection: day, low_light, night
- Configurable threshold: Lux-based switching
- Endpoint:
POST /v1/camera/day_night_mode
- 10 new status fields: autofocus_mode, lens_position, focus_fom, hdr_mode, scene_mode, and more
- Real-time monitoring: All metadata available via
GET /v1/camera/status
- Auto-detection: Tuning files for NoIR cameras
- Configuration variables:
`CAMERA_CAMERA_MODEL`, `CAMERA_IS_NOIR`, `CAMERA_TUNING_FILE`
- Optimized presets: AWB presets specifically for NoIR imaging
- EV adjustment: -8.0 to +8.0 compensation
- Brightness control: Fine-tune auto-exposure target brightness
- Use cases: Backlit scenes, high-contrast situations
- Endpoint:
POST /v1/camera/exposure_value
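For example, brightening a backlit subject by +1 EV. The `exposure_value` field name is an assumption based on the status field of the same name; verify it against docs/api-reference.md.

```python
import requests

# Brighten a backlit subject by +1 EV (accepted range: -8.0 to +8.0).
# The request field name is assumed; see docs/api-reference.md for the exact schema.
requests.post(
    "http://raspberrypi:8000/v1/camera/exposure_value",
    json={"exposure_value": 1.0},
)
```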
- 5 modes: off, fast, high_quality, minimal, zsl
- Quality vs Performance: Balance noise reduction with processing speed
- Perfect for low-light: Compensate for high gain noise
- Endpoint:
POST /v1/camera/noise_reduction
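For instance, switching to `high_quality` for a static low-light scene. The `mode` field name is assumed by analogy with the other mode endpoints; see docs/api-reference.md.

```python
import requests

# Trade processing time for cleaner low-light frames.
# The "mode" field name is an assumption (by analogy with /v1/camera/hdr and similar endpoints).
requests.post(
    "http://raspberrypi:8000/v1/camera/noise_reduction",
    json={"mode": "high_quality"},
)
```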
- AE Constraint Mode: normal, highlight, shadows, custom
- Control how AE handles over/underexposure
- AE Exposure Mode: normal, short, long, custom
- Prioritize exposure time vs gain
- Endpoints:
POST /v1/camera/{ae_constraint_mode,ae_exposure_mode}
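A sketch that protects highlights and prefers short exposures for moving subjects. The `mode` field name is assumed; check docs/api-reference.md for the exact schema.

```python
import requests

BASE_URL = "http://raspberrypi:8000"

# Protect highlights and prefer short exposures (less motion blur, more gain).
# The "mode" field name is an assumption for illustration.
requests.post(f"{BASE_URL}/v1/camera/ae_constraint_mode", json={"mode": "highlight"})
requests.post(f"{BASE_URL}/v1/camera/ae_exposure_mode", json={"mode": "short"})
```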
- 7 preset modes: auto, tungsten, fluorescent, indoor, daylight, cloudy, custom
- Optimized for lighting: Quick white balance adjustment
- Endpoint:
POST /v1/camera/awb_mode
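For example, selecting the tungsten preset for indoor incandescent lighting. The `mode` field name is assumed by analogy with the other mode endpoints.

```python
import requests

# Switch white balance to the tungsten preset (field name assumed; see docs/api-reference.md).
requests.post(
    "http://raspberrypi:8000/v1/camera/awb_mode",
    json={"mode": "tungsten"},
)
```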
- Manual trigger: Initiate autofocus scan on demand
- One-shot AF: Useful for manual/auto focus modes
- Endpoint:
POST /v1/camera/autofocus_trigger
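A quick example of firing a one-shot scan. No request body is documented here, so an empty POST is assumed.

```python
import requests

# Kick off a one-shot autofocus scan (empty POST assumed).
requests.post("http://raspberrypi:8000/v1/camera/autofocus_trigger")
```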
- Change resolution: Adjust streaming resolution without service restart
- Seamless switching: Automatic stop/reconfigure/restart
- Common resolutions: 1920x1080, 1280x720, 640x480, 4K
- Endpoint:
POST /v1/camera/resolution
- Frame duration control: Uses `FrameDurationLimits` (libcamera native)
- Prevent flicker: Constrain min/max frame duration
- Maintains framerate: Limit exposure time for consistent FPS
- Endpoint:
POST /v1/camera/exposure_limits
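For example, capping exposure at 20 ms so the stream can hold 50 fps under artificial light. The field names below are assumptions for illustration; see docs/api-reference.md for the real schema.

```python
import requests

# Constrain auto-exposure to 100 µs - 20 ms to keep framerate stable.
# Field names are hypothetical; consult docs/api-reference.md.
requests.post(
    "http://raspberrypi:8000/v1/camera/exposure_limits",
    json={"min_exposure_us": 100, "max_exposure_us": 20000},
)
```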
- Ready-made scripts: One-command low-light configuration
- 3 modes: Static scenes, moving subjects, normal
- Scripts:
  - `./scripts/set-low-light-mode.sh` - Maximum visibility (static)
  - `./scripts/set-low-light-motion-mode.sh` - Reduced blur (motion)
  - `./scripts/set-normal-mode.sh` - Reset to defaults
- Documentation: See docs/low-light-modes.md
- Hardware discovery: Query camera sensor model, resolution, and supported features
- Exposure/gain limits: Get minimum and maximum values for exposure and gain
- Feature detection: Discover what controls are available (autofocus, HDR, etc.)
- Resolution support: List all supported streaming resolutions
- Essential for clients: Know what the camera supports before configuring
- Endpoint:
GET /v1/camera/capabilities
- Current limits: See active frame duration and exposure constraints
- Effective limits: Understand why certain exposure values may not apply
- Real-time constraints: Monitor hardware and configured limits
- Enhanced field: `current_limits` in `GET /v1/camera/status`
- Independent framerate adjustment: Change framerate without changing resolution
- Smart limit enforcement: Automatically clamps to hardware maximum for current resolution
- User-friendly: No rejected requests - API applies best available framerate
- Detailed feedback: Returns requested vs applied framerate with clamping indicator
- Resolution-aware limits:
- 4K (3840x2160): max 30 fps
- 1440p (2560x1440): max 40 fps
- 1080p (1920x1080): max 50 fps
- 720p (1280x720): max 120 fps
- VGA (640x480): max 120 fps
- Example: Request 500fps at 4K → API applies 30fps (the maximum) and indicates clamping occurred
- Endpoint:
POST /v1/camera/framerate
- Enhanced capabilities: `GET /v1/camera/capabilities` now includes:
  - `current_framerate`: Currently configured framerate
  - `max_framerate_for_current_resolution`: Maximum fps for the active resolution
  - `framerate_limits_by_resolution`: Complete table of max fps for each resolution
- Automatic camera type detection: Detects Camera Module 3 Wide (120° FOV) vs standard cameras (66° FOV)
  - Detection via camera model name (e.g., `imx708_wide_noir`)
  - Graceful fallback to standard camera if detection fails
- Intelligent sensor mode selection: Preserves wide-angle field of view
- Wide cameras: Always use full sensor (Mode 0 or Mode 1) to maintain 120° FOV
- Standard cameras: Can use cropped modes (Mode 2) for higher framerates
- Critical fix: Wide cameras no longer lose FOV at lower resolutions
- New API fields in `GET /v1/camera/status`:
  - `is_wide_camera`: Boolean flag for camera type (true = 120° FOV)
  - `sensor_mode_width`: Current sensor mode width being used
  - `sensor_mode_height`: Current sensor mode height being used
- New API fields in `GET /v1/camera/capabilities`:
  - `is_wide_camera`: Camera type detection result
  - `field_of_view_degrees`: Field of view (120° or 66°)
  - `sensor_modes`: Complete IMX708 sensor mode specifications
  - `recommended_resolutions`: Resolution presets optimized for camera type
    - Wide cameras: Prioritize full FOV preservation
    - Standard cameras: Prioritize framerate optimization
- Perfect for: Surveillance with wide coverage, dynamic client UIs, FOV-aware applications
Example - Query camera type and get recommendations:
const capabilities = await fetch('/v1/camera/capabilities').then(r => r.json());
if (capabilities.is_wide_camera) {
console.log(`Wide-angle camera: ${capabilities.field_of_view_degrees}° FOV`);
// Use recommended resolutions that preserve full 120Β° FOV
capabilities.recommended_resolutions.forEach(res => {
console.log(`${res.label}: ${res.width}x${res.height} @ ${res.max_fps}fps (${res.fov})`);
});
}

Monitor your Raspberry Pi's health in real-time alongside camera operations:
-
CPU Temperature Monitoring
- Real-time CPU/GPU temperature in Celsius
- Status classification (normal/warm/hot/critical)
- Thermal throttling detection
- Critical for long-running video encoding
-
CPU & Memory Usage
- CPU usage percentage
- Load average (1min, 5min, 15min)
- RAM usage (total, used, available)
- Memory percentage
-
WiFi Signal Quality
- Signal strength in dBm
- Quality percentage (0-100%)
- Status classification (excellent/good/fair/weak)
- Active network interface detection
- Perfect for remote camera deployments
-
Network Statistics
- Bytes sent/received
- Packets sent/received
- Network interface info (wlan0/eth0)
- Monitor bandwidth usage
-
Disk Usage
- Total/used/free space in GB
- Usage percentage
- Prevent storage issues during recording
-
System Uptime
- System uptime in seconds/days
- Service uptime tracking
- Monitor stability
-
Raspberry Pi Throttling Detection
- Under-voltage detection
- Frequency capping detection
- Temperature-based throttling
- Historical throttling events
- Essential for power supply diagnostics
Endpoint: GET /v1/system/status
Example Response:
{
"temperature": {"cpu_c": 50.7, "status": "normal"},
"cpu": {"usage_percent": 32.5, "load_average": {...}, "cores": 4},
"memory": {"total_mb": 16219, "percent": 8.4},
"network": {
"wifi": {"signal_dbm": -70, "quality_percent": 60, "status": "fair"},
"interface": "wlan0",
"bytes_sent": 18863322109,
"bytes_received": 11251936223
},
"disk": {"total_gb": 234.6, "percent": 1.7},
"uptime": {"service_seconds": 8.3, "system_days": 0.2},
"throttled": {
"currently_throttled": false,
"under_voltage_detected": false,
"has_occurred": false
}
}

Use Cases:
- Monitor temperature during intensive video encoding
- Detect WiFi signal degradation affecting streaming quality
- Alert on thermal throttling or under-voltage issues
- Track resource usage for optimization
- Verify system stability for 24/7 deployments
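As an example of the alerting use case above, a minimal polling loop built on the documented `/v1/system/status` fields (the 75 °C threshold and 30-second interval are arbitrary choices):

```python
import time
import requests

BASE_URL = "http://raspberrypi:8000"

# Poll system health and warn on heat or power problems.
while True:
    status = requests.get(f"{BASE_URL}/v1/system/status").json()
    temp = status["temperature"]["cpu_c"]
    throttled = status["throttled"]["currently_throttled"]
    under_voltage = status["throttled"]["under_voltage_detected"]
    if temp > 75 or throttled or under_voltage:
        print(f"ALERT: cpu={temp}C throttled={throttled} under_voltage={under_voltage}")
    time.sleep(30)
```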
Remote access to service logs for debugging and monitoring without SSH access:
-
Log Retrieval with Filtering
- Query recent logs (1-10,000 lines, default: 100)
- Filter by log level (INFO, WARNING, ERROR)
- Search by keyword/pattern
- Returns JSON with log lines and metadata
- Endpoint:
GET /v1/system/logs
-
Real-Time Log Streaming (SSE)
- Server-Sent Events for continuous log monitoring
- Same filtering capabilities as retrieval endpoint
- Auto-cleanup on client disconnect
- Perfect for live debugging dashboards
- Endpoint:
GET /v1/system/logs/stream
Example - Get last 50 logs:
curl "http://<PI_IP>:8000/v1/system/logs?lines=50"Example - Filter ERROR logs:
curl "http://<PI_IP>:8000/v1/system/logs?level=ERROR&lines=100"Example - Search for specific events:
curl "http://<PI_IP>:8000/v1/system/logs?search=resolution&lines=200"Example - Real-time streaming (JavaScript):
const eventSource = new EventSource('http://<PI_IP>:8000/v1/system/logs/stream?level=ERROR');
eventSource.onmessage = (event) => {
console.log('Log:', event.data);
// Update UI with new log entries
};

Response Format:
{
"logs": [
"Nov 23 17:41:20 picam pi-camera-service[21852]: 2025-11-23 17:41:20,623 - camera_service.api - INFO - === Pi Camera Service Starting ===",
"Nov 23 17:41:20 picam pi-camera-service[21852]: 2025-11-23 17:41:20,754 - camera_service.streaming_manager - INFO - Starting RTSP streaming to rtsp://127.0.0.1:8554/cam"
],
"total_lines": 2,
"service": "pi-camera-service"
}

Use Cases:
- Remote debugging without SSH access
- Monitor service health and errors in real-time
- Build monitoring dashboards with live log feeds
- Troubleshoot issues from web/mobile apps
- Audit camera operations and configuration changes
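A Python counterpart to the curl/JavaScript examples above. The SSE handling is a simplified sketch that assumes one `data:` line per event:

```python
import requests

BASE_URL = "http://raspberrypi:8000"

# Fetch the last 100 ERROR lines
logs = requests.get(
    f"{BASE_URL}/v1/system/logs",
    params={"level": "ERROR", "lines": 100},
).json()
for line in logs["logs"]:
    print(line)

# Follow the live stream (simplified SSE parsing: one "data:" line per event assumed)
with requests.get(f"{BASE_URL}/v1/system/logs/stream", stream=True) as resp:
    for raw in resp.iter_lines():
        if raw.startswith(b"data:"):
            print(raw[len(b"data:"):].decode().strip())
```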
The video stream is published to MediaMTX, which then serves it via RTSP / WebRTC / HLS.
Pi Camera v3 ──> Picamera2/libcamera ──> H.264 encoder ──> MediaMTX (RTSP, WebRTC, HLS)
                        ▲                                        ▲
                        │                                        │
          Pi Camera Service API (FastAPI)                        │
                        ▲                                        │
          External App (backend, UI) ────────────────────────────┘
Components:
- Pi Camera Service: This project, running on the Pi
- Picamera2: Python library for controlling libcamera
- MediaMTX: Multi-protocol streaming server
- External Application: Consumes stream via MediaMTX and controls camera via HTTP
Technologies:
- FastAPI with modern lifespan context manager
- Pydantic BaseSettings for type-safe configuration
- Threading with RLock for thread-safety
- Structured logging
- pytest tests + integration tests
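To make those patterns concrete, here is a minimal, illustrative sketch of how a lifespan-managed FastAPI app, an RLock-guarded controller, and `CAMERA_`-prefixed settings can fit together (simplified names, not the project's actual code):

```python
import threading
from contextlib import asynccontextmanager

from fastapi import FastAPI
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_prefix="CAMERA_")  # e.g. CAMERA_WIDTH=1280
    width: int = 1920
    height: int = 1080
    framerate: int = 30


class CameraController:
    """Thread-safe stand-in for the real Picamera2-backed controller."""

    def __init__(self) -> None:
        self._lock = threading.RLock()  # serializes concurrent reconfiguration

    def configure(self, settings: Settings) -> None:
        with self._lock:
            pass  # camera setup would happen here

    def close(self) -> None:
        with self._lock:
            pass  # release the camera


settings = Settings()
camera = CameraController()


@asynccontextmanager
async def lifespan(app: FastAPI):
    camera.configure(settings)  # startup
    yield
    camera.close()              # shutdown


app = FastAPI(lifespan=lifespan)
```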
- Raspberry Pi (Pi 4 or Pi 5 recommended for H.264 encoding)
- libcamera-compatible camera (e.g., Raspberry Pi Camera Module 3)
- Raspberry Pi OS (Bookworm or later)
- Python 3.9+
- MediaMTX installed and configured
Follow the complete guide in docs/installation.md:
# 1. Install system dependencies
sudo apt update
sudo apt install -y python3-venv python3-picamera2 python3-libcamera libcamera-apps ffmpeg git
# 2. Clone the project
git clone https://github.com/gmathy2104/pi-camera-service.git ~/pi-camera-service
cd ~/pi-camera-service
# 3. Create virtual environment (IMPORTANT: with --system-site-packages)
python3 -m venv --system-site-packages venv
source venv/bin/activate
# 4. Install dependencies
pip install --upgrade pip
pip install -r requirements.txt
# 5. Install systemd service
./install-service.sh
⚠️ Important: The virtual environment MUST be created with `--system-site-packages` to access picamera2, which is installed via APT.
The service uses environment variables with the CAMERA_ prefix.
Create a .env file (optional):
cp .env.example .env
nano .env

Main variables:
# Video resolution and quality
CAMERA_WIDTH=1920
CAMERA_HEIGHT=1080
CAMERA_FRAMERATE=30
CAMERA_BITRATE=8000000
# API server
CAMERA_HOST=0.0.0.0
CAMERA_PORT=8000
# Authentication (optional)
CAMERA_API_KEY=your-secret-key
# MediaMTX RTSP URL
CAMERA_RTSP_URL=rtsp://127.0.0.1:8554/cam
# Camera hardware (v2.0)
CAMERA_CAMERA_MODEL=imx708 # Camera sensor model
CAMERA_IS_NOIR=false # True for NoIR cameras
# Logging
CAMERA_LOG_LEVEL=INFO

In mediamtx.yml, declare the cam path as publisher:
paths:
cam:
source: publisher
⚠️ DO NOT use `source: rpiCamera` (it conflicts with this service)
cd ~/pi-camera-service
source venv/bin/activate
python main.py

The API will be available at http://0.0.0.0:8000
# Start
sudo systemctl start pi-camera-service
# Stop
sudo systemctl stop pi-camera-service
# Restart
sudo systemctl restart pi-camera-service
# View logs
sudo journalctl -u pi-camera-service -f

📖 See docs/installation.md for complete service documentation.
Base URL: http://<PI_IP>:8000
GET /health
{
"status": "healthy",
"camera_configured": true,
"streaming_active": true,
"version": "2.5.0"
}GET /v1/camera/status
{
"lux": 45.2,
"exposure_us": 12000,
"analogue_gain": 1.5,
"colour_temperature": 4200.0,
"auto_exposure": true,
"streaming": true,
// New v2.0 fields
"autofocus_mode": "continuous",
"lens_position": 2.5,
"focus_fom": 12500,
"hdr_mode": "off",
"lens_correction_enabled": true,
"scene_mode": "day",
"day_night_mode": "auto",
"day_night_threshold_lux": 10.0,
"frame_duration_us": 33321,
"sensor_black_levels": [4096, 4096, 4096, 4096],
// New v2.2 field
"current_limits": {
"frame_duration_us": 33321,
"frame_duration_limits_us": null,
"effective_exposure_limit_us": {
"min": 100,
"max": 1000000
}
},
// New v2.7 fields (wide-angle camera support)
"is_wide_camera": true,
"sensor_mode_width": 2304,
"sensor_mode_height": 1296,
// New v2.8 fields (advanced controls tracking)
"exposure_value": 0.0,
"noise_reduction_mode": "off",
"ae_constraint_mode": "normal",
"ae_exposure_mode": "normal"
}

GET /v1/camera/capabilities
Discover camera hardware capabilities and supported features.
{
"sensor_model": "imx708",
"sensor_resolution": {
"width": 4608,
"height": 2592
},
"supported_resolutions": [
{"width": 640, "height": 480, "label": "VGA"},
{"width": 1280, "height": 720, "label": "720p"},
{"width": 1920, "height": 1080, "label": "1080p"},
{"width": 2560, "height": 1440, "label": "1440p"},
{"width": 3840, "height": 2160, "label": "4K"}
],
"exposure_limits_us": {"min": 100, "max": 1000000},
"gain_limits": {"min": 1.0, "max": 16.0},
"lens_position_limits": {"min": 0.0, "max": 15.0},
"exposure_value_range": {"min": -8.0, "max": 8.0},
"supported_noise_reduction_modes": ["off", "fast", "high_quality", "minimal", "zsl"],
"supported_ae_constraint_modes": ["normal", "highlight", "shadows", "custom"],
"supported_ae_exposure_modes": ["normal", "short", "long", "custom"],
"supported_awb_modes": ["auto", "tungsten", "fluorescent", "indoor", "daylight", "cloudy", "custom"],
"features": [
"auto_exposure", "manual_exposure", "auto_white_balance", "manual_white_balance",
"exposure_value_compensation", "noise_reduction", "ae_constraint_modes",
"ae_exposure_modes", "awb_modes", "image_processing", "roi_digital_zoom",
"exposure_limits", "autofocus", "lens_position_control", "autofocus_trigger"
],
"current_framerate": 30.0,
"max_framerate_for_current_resolution": 50.0,
"framerate_limits_by_resolution": [
{"width": 3840, "height": 2160, "label": "4K", "max_fps": 30.0},
{"width": 2560, "height": 1440, "label": "1440p", "max_fps": 40.0},
{"width": 1920, "height": 1080, "label": "1080p", "max_fps": 50.0},
{"width": 1280, "height": 720, "label": "720p", "max_fps": 120.0},
{"width": 640, "height": 480, "label": "VGA", "max_fps": 120.0}
],
// New v2.7 fields (wide-angle camera support)
"is_wide_camera": true,
"field_of_view_degrees": 120,
"sensor_modes": {
"mode_0": {"width": 4608, "height": 2592, "max_fps": 14.35, "description": "Full sensor, no binning"},
"mode_1": {"width": 2304, "height": 1296, "max_fps": 56.03, "description": "Full sensor with 2x2 binning"},
"mode_2": {"width": 1536, "height": 864, "max_fps": 120.13, "description": "Cropped sensor with 2x2 binning"}
},
"recommended_resolutions": [
{"width": 2304, "height": 1296, "label": "Native Mode 1", "max_fps": 56, "sensor_mode": "mode_1", "fov": "Full 120Β°"},
{"width": 1920, "height": 1080, "label": "1080p (Full FOV)", "max_fps": 56, "sensor_mode": "mode_1", "fov": "Full 120Β°"},
{"width": 1280, "height": 720, "label": "720p (Full FOV)", "max_fps": 56, "sensor_mode": "mode_1", "fov": "Full 120Β°"},
{"width": 4608, "height": 2592, "label": "4K (Full sensor)", "max_fps": 14, "sensor_mode": "mode_0", "fov": "Full 120Β°"}
]
}

Choose between constant field of view or digital zoom effect across all resolutions.
GET /v1/camera/fov_mode
Query current FOV mode.
{
"mode": "scale",
"description": "Full sensor readout with downscaling β Constant field of view"
}POST /v1/camera/fov_mode
Change FOV mode.
{"mode": "scale"} // or "crop"Modes:
- `scale` (default): Constant field of view at all resolutions
  - Reads full sensor area (4608x2592 for IMX708)
- Hardware ISP downscales to target resolution
- Better image quality from downsampling
- Perfect for surveillance, monitoring, consistent framing
- `crop`: Digital zoom effect (sensor crop)
  - Reads only the required sensor area for the target resolution
- FOV reduces at lower resolutions (telephoto effect)
- Lower processing load, faster readout
- Useful for zoom/telephoto applications
Example - Set FOV mode with resolution:
POST /v1/camera/resolution
{
"width": 1280,
"height": 720,
"fov_mode": "crop", // Optional: change mode simultaneously
"restart_streaming": true
}

POST /v1/camera/auto_exposure
{"enabled": true}POST /v1/camera/manual_exposure
{
"exposure_us": 20000,
"gain": 2.0
}

POST /v1/camera/awb
{"enabled": false}POST /v1/camera/manual_awb (v2.0)
{
"red_gain": 1.5,
"blue_gain": 1.8
}

POST /v1/camera/awb_preset (v2.0)
{"preset": "daylight_noir"}POST /v1/camera/autofocus_mode
{"mode": "continuous"} // manual, auto, continuousPOST /v1/camera/lens_position
{"position": 5.0} // 0.0 = infinity, 10.0 = ~10cmPOST /v1/camera/autofocus_range
{"range_mode": "normal"} // normal, macro, fullPOST /v1/camera/snapshot
{
"width": 1920,
"height": 1080,
"autofocus_trigger": true
}

Response:
{
"status": "ok",
"image_base64": "base64_encoded_jpeg_data...",
"width": 1920,
"height": 1080
}

POST /v1/camera/image_processing
{
"brightness": 0.1, // -1.0 to 1.0
"contrast": 1.2, // 0.0 to 2.0
"saturation": 1.0, // 0.0 to 2.0
"sharpness": 8.0 // 0.0 to 16.0
}

POST /v1/camera/hdr
{"mode": "sensor"} // off, auto, sensor, single-expPOST /v1/camera/roi
{
"x": 0.25, // X offset (0.0-1.0)
"y": 0.25, // Y offset (0.0-1.0)
"width": 0.5, // Width (0.0-1.0)
"height": 0.5 // Height (0.0-1.0)
}

POST /v1/camera/framerate
Change camera framerate dynamically with intelligent clamping. The API automatically applies the hardware maximum for your current resolution, ensuring a user-friendly experience without rejected requests.
Request:
{
"framerate": 60.0, // Desired framerate (1-1000 fps)
"restart_streaming": true // Restart streaming after change (default: true)
}

Response:
{
"status": "ok",
"requested_framerate": 60.0,
"applied_framerate": 50.0, // Actual framerate applied (may be clamped)
"max_framerate_for_resolution": 50.0,
"resolution": "1920x1080",
"clamped": true // Indicates if framerate was clamped to max
}

Example - Requesting high framerate at 4K:
# Request 500fps at 4K resolution
curl -X POST http://raspberrypi:8000/v1/camera/framerate \
-H "Content-Type: application/json" \
-d '{"framerate": 500}'
# Response: API automatically clamps to 30fps (4K maximum)
# {
# "status": "ok",
# "requested_framerate": 500.0,
# "applied_framerate": 30.0,
# "max_framerate_for_resolution": 30.0,
# "resolution": "3840x2160",
# "clamped": true
# }

Resolution-based framerate limits:
- 4K (3840x2160): max 30 fps
- 1440p (2560x1440): max 40 fps
- 1080p (1920x1080): max 50 fps
- 720p (1280x720): max 120 fps
- VGA (640x480): max 120 fps
POST /v1/streaming/start
POST /v1/streaming/stop
GET /v1/system/status
Get comprehensive Raspberry Pi health metrics:
{
"temperature": {
"cpu_c": 50.7,
"status": "normal"
},
"cpu": {
"usage_percent": 32.5,
"load_average": {
"1min": 0.88,
"5min": 0.62,
"15min": 0.56
},
"cores": 4
},
"memory": {
"total_mb": 16219.1,
"used_mb": 1364.6,
"available_mb": 14854.5,
"percent": 8.4
},
"network": {
"bytes_sent": 18863322109,
"bytes_received": 11251936223,
"wifi": {
"signal_dbm": -70,
"quality_percent": 60,
"status": "fair"
},
"interface": "wlan0"
},
"disk": {
"total_gb": 234.6,
"used_gb": 3.8,
"free_gb": 218.9,
"percent": 1.7
},
"uptime": {
"service_seconds": 8.3,
"system_seconds": 13949.0,
"system_days": 0.2
},
"throttled": {
"currently_throttled": false,
"under_voltage_detected": false,
"frequency_capped": false,
"has_occurred": false
}
}

📖 Complete API Documentation: See docs/api-reference.md
Test all v2.0 features:
# Service must be running
./test-api-v2.sh

Expected output:
========================================
✅ All v2.0 API tests passed!
========================================
# Basic API test (v1.0 endpoints)
./test-api.sh
# Unit tests
pytest tests/ --ignore=tests/test_api_integration.py
# Integration tests (service must be running)
pytest tests/test_api_integration.py -v
# All tests
pytest tests/ -v

📖 See docs/development.md for complete testing guide.
# Get status with v2.0 metadata
curl http://raspberrypi:8000/v1/camera/status
# Set autofocus to continuous mode
curl -X POST http://raspberrypi:8000/v1/camera/autofocus_mode \
-H "Content-Type: application/json" \
-d '{"mode": "continuous"}'
# Capture a snapshot
curl -X POST http://raspberrypi:8000/v1/camera/snapshot \
-H "Content-Type: application/json" \
-d '{"width": 1920, "height": 1080}' \
| jq -r '.image_base64' | base64 -d > snapshot.jpg
# Set manual white balance (NoIR daylight preset)
curl -X POST http://raspberrypi:8000/v1/camera/awb_preset \
-H "Content-Type: application/json" \
-d '{"preset": "daylight_noir"}'
# Adjust image processing
curl -X POST http://raspberrypi:8000/v1/camera/image_processing \
-H "Content-Type: application/json" \
-d '{"brightness": 0.1, "contrast": 1.2, "sharpness": 10.0}'
# Set ROI (center crop)
curl -X POST http://raspberrypi:8000/v1/camera/roi \
-H "Content-Type: application/json" \
-d '{"x": 0.25, "y": 0.25, "width": 0.5, "height": 0.5}'
# Change framerate (v2.3) - intelligent clamping
curl -X POST http://raspberrypi:8000/v1/camera/framerate \
-H "Content-Type: application/json" \
-d '{"framerate": 60}'
# With authentication (if CAMERA_API_KEY is set)
curl -H "X-API-Key: your-key" \
http://raspberrypi:8000/v1/camera/status

import requests
import base64
from pathlib import Path
BASE_URL = "http://raspberrypi:8000"
HEADERS = {"X-API-Key": "your-key"} # If auth enabled
# Get enhanced status with v2.0 metadata
response = requests.get(f"{BASE_URL}/v1/camera/status", headers=HEADERS)
status = response.json()
print(f"Autofocus: {status['autofocus_mode']}, Scene: {status['scene_mode']}")
print(f"Lux: {status['lux']}, Focus FoM: {status['focus_fom']}")
# Set autofocus mode
requests.post(
f"{BASE_URL}/v1/camera/autofocus_mode",
json={"mode": "continuous"},
headers=HEADERS
)
# Capture snapshot and save to file
response = requests.post(
f"{BASE_URL}/v1/camera/snapshot",
json={"width": 1920, "height": 1080, "autofocus_trigger": True},
headers=HEADERS
)
snapshot_data = response.json()
image_bytes = base64.b64decode(snapshot_data['image_base64'])
Path("snapshot.jpg").write_bytes(image_bytes)
print(f"Snapshot saved: {snapshot_data['width']}x{snapshot_data['height']}")
# Set manual AWB for NoIR camera
requests.post(
f"{BASE_URL}/v1/camera/awb_preset",
json={"preset": "daylight_noir"},
headers=HEADERS
)
# Adjust image processing
requests.post(
f"{BASE_URL}/v1/camera/image_processing",
json={
"brightness": 0.1,
"contrast": 1.2,
"saturation": 1.0,
"sharpness": 8.0
},
headers=HEADERS
)
# Change framerate (v2.3) with intelligent clamping
response = requests.post(
f"{BASE_URL}/v1/camera/framerate",
json={"framerate": 60},
headers=HEADERS
)
result = response.json()
if result['clamped']:
print(f"Framerate clamped: {result['requested_framerate']}fps β {result['applied_framerate']}fps")
print(f"Max for {result['resolution']}: {result['max_framerate_for_resolution']}fps")
else:
print(f"Framerate set to {result['applied_framerate']}fps")const BASE_URL = "http://raspberrypi:8000";
const headers = {
"Content-Type": "application/json",
"X-API-Key": "your-key" // If auth enabled
};
// Get enhanced status
const response = await fetch(`${BASE_URL}/v1/camera/status`, { headers });
const status = await response.json();
console.log(`Autofocus: ${status.autofocus_mode}, Scene: ${status.scene_mode}`);
// Set autofocus mode
await fetch(`${BASE_URL}/v1/camera/autofocus_mode`, {
method: "POST",
headers,
body: JSON.stringify({ mode: "continuous" })
});
// Capture snapshot
const snapshotRes = await fetch(`${BASE_URL}/v1/camera/snapshot`, {
method: "POST",
headers,
body: JSON.stringify({ width: 1920, height: 1080 })
});
const { image_base64 } = await snapshotRes.json();
// Convert base64 to blob for download or display
const blob = await fetch(`data:image/jpeg;base64,${image_base64}`).then(r => r.blob());
// Set manual AWB
await fetch(`${BASE_URL}/v1/camera/manual_awb`, {
method: "POST",
headers,
body: JSON.stringify({ red_gain: 1.5, blue_gain: 1.8 })
});

rpicam-hello --list-cameras

If no camera appears, check the cable and connection.
# View error logs
sudo journalctl -u pi-camera-service -n 50
# Check status
sudo systemctl status pi-camera-service
# Test manually
cd ~/pi-camera-service
source venv/bin/activate
python main.py

Recreate the venv with --system-site-packages:
cd ~/pi-camera-service
rm -rf venv
python3 -m venv --system-site-packages venv
source venv/bin/activate
pip install --upgrade pip
pip install -r requirements.txt

- Check the service is running: `curl http://localhost:8000/health`
- Check MediaMTX: `sudo systemctl status mediamtx`
- View logs: `sudo journalctl -u pi-camera-service -f`
Some libcamera versions don't support ExposureTimeMin/Max controls. This is a platform limitation, not a bug. The endpoint will fail gracefully with a clear error message.
📖 See docs/installation.md for more solutions.
| Document | Description |
|---|---|
| README.md | This file - project overview |
| docs/installation.md | Complete installation and setup guide |
| docs/api-reference.md | Full REST API documentation |
| docs/configuration.md | Configuration options and examples |
| docs/development.md | Development guide and testing |
| docs/upgrade-v2.md | Migration guide to v2.0 |
| CONTRIBUTING.md | How to contribute to the project |
| CHANGELOG.md | Version history and release notes |
| LICENSE | MIT License |
pi-camera-service/
├── camera_service/
│   ├── __init__.py
│   ├── api.py                    # FastAPI app with modern lifespan
│   ├── camera_controller.py      # Thread-safe camera control
│   ├── streaming_manager.py      # H.264 streaming management
│   ├── config.py                 # Pydantic configuration
│   └── exceptions.py             # Custom exceptions
├── tests/
│   ├── test_api.py               # API tests (mocked)
│   ├── test_api_integration.py   # Integration tests (live API)
│   ├── test_camera_controller.py
│   ├── test_config.py
│   └── test_streaming_manager.py
├── main.py                       # Entry point
├── requirements.txt              # Production dependencies
├── requirements-dev.txt          # Development dependencies
├── .env.example                  # Configuration template
├── test-api.sh                   # v1.0 test script
├── test-api-v2.sh                # v2.0 test script (NEW)
├── install-service.sh            # Service installation
├── pi-camera-service.service     # systemd file
├── CHANGELOG.md                  # Version history (NEW)
├── VERSION                       # Version number (NEW)
└── UPGRADE_v2.md                 # v2.0 upgrade guide (NEW)
See CHANGELOG.md for detailed version history.
Major Features:
- ✅ Autofocus control (modes, lens position, range)
- ✅ Snapshot capture (JPEG, base64 encoded)
- ✅ Manual white balance + NoIR presets
- ✅ Image processing (brightness, contrast, saturation, sharpness)
- ✅ HDR support (hardware + software modes)
- ✅ ROI / Digital zoom
- ✅ Exposure limits
- ✅ Lens correction for wide-angle cameras
- ✅ Image transform (flip/rotation)
- ✅ Day/night detection
- ✅ NoIR camera optimization
- ✅ Enhanced metadata (10 new status fields)
14 new endpoints, 1200+ lines of code, 100% backward compatible with v1.0
See UPGRADE_v2.md for complete upgrade guide.
Initial production release:
- FastAPI-based HTTP API
- RTSP streaming to MediaMTX
- Auto/manual exposure control
- Auto white balance control
- Camera status endpoint
- API key authentication
- systemd service support
- Comprehensive test suite
# Set day/night auto-detection
curl -X POST http://raspberrypi:8000/v1/camera/day_night_mode \
-H "Content-Type: application/json" \
-d '{"mode": "auto", "threshold_lux": 10.0}'
# Apply NoIR IR preset
curl -X POST http://raspberrypi:8000/v1/camera/awb_preset \
-H "Content-Type: application/json" \
-d '{"preset": "ir_850nm"}'import requests
import time
for i in range(100):
response = requests.post(
"http://raspberrypi:8000/v1/camera/snapshot",
json={"width": 4608, "height": 2592, "autofocus_trigger": True}
)
# Save snapshot...
time.sleep(60)  # Every minute

# Capture snapshot for processing
snapshot = requests.post(
"http://raspberrypi:8000/v1/camera/snapshot",
json={"width": 640, "height": 480}
).json()
# Decode and process with OpenCV/TensorFlow
image = base64.b64decode(snapshot['image_base64'])
# ... ML processing ...

MIT License - See LICENSE file for details.
See docs/development.md for development guide and CONTRIBUTING.md for contribution guidelines.
To contribute:
- Fork the project
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
If you encounter issues:
- Check docs/development.md
- Run `./scripts/test-api-v2.sh`
- View logs: `sudo journalctl -u pi-camera-service -f`
- Check docs/installation.md - Troubleshooting section
- Open an issue on GitHub
- Raspberry Pi Foundation for Camera Module 3 and libcamera
- FastAPI team for the excellent framework
- MediaMTX for versatile streaming server
- Community contributors
Built with ❤️ for Raspberry Pi
🤖 Enhanced with Claude Code