
Multi-Camera Simulation Engine

Python 3.8+ OpenGL 3.3+ Flask 2.0+ License: MIT

A high-performance, Python-based offscreen 3D camera simulation engine. It renders a virtual scene with a rotating camera using OpenGL and publishes the result as a real-time MJPEG IP camera stream, designed specifically for imaging system simulation experiments and computer vision testing.

Multi-Camera Simulation Demo


πŸ“Œ Overview

The Multi-Camera Simulation Engine provides a lightweight virtual environment for researchers and developers to simulate complex imaging systems. By leveraging thread-safe offscreen rendering and MJPEG streaming, it provides a "virtual" IP camera feed that can be consumed by standard video clients or computer vision pipelines without requiring a physical camera or a visible UI.

🌟 Project Vision & Background

This project was conceived to demonstrate the feasibility of high-fidelity imaging system simulation tools, particularly for consumer hardware products. It aligns with advanced industry workflows involving:

  • System Design & Validation: Defining simulation capabilities and performance goals to validate against real hardware.
  • Multi-Camera R&D: Investigating imaging performance goals and spatial configurations for multi-camera systems (including XR applications).
  • Cross-Functional Collaboration: Bridging the gap between Product Design (PD), Electrical Engineering (EE), Camera Hardware, and Software teams through early-stage virtual prototyping.

This repository represents the beginning of a medium-to-large scale open-source project aimed at providing accessible, professional-grade simulation tools for the computer vision and hardware engineering communities.

Key Applications

  • Computer Vision Development: Validate tracking and detection algorithms against ground-truth controlled virtual environments.
  • Imaging System R&D: Simulate varying camera parameters (FOV, resolution, placement) before physical deployment.
  • Robotics & XR Simulation: Provide low-latency visual feedback for autonomous agent training and augmented reality testing.

πŸš€ Key Features

  • Multi-Camera Support (v1.3.1): Simultaneously render and stream from multiple virtual cameras, each with independent configurations and unique MJPEG stream ports.
  • Real-time 3D Rendering: High-performance rendering via OpenGL 3.3+ with Phong shading (ambient, diffuse, specular) and directional lighting.
  • Thread-Safe Offscreen Pipeline: Uses hidden GLFW windows and custom Framebuffer Objects (FBOs) for background rendering, optimized for multi-threaded Flask environments.
  • MJPEG IP Camera Stream: Efficiently broadcasts rendered frames over HTTP, mimicking a real network-attached camera.
  • Dynamic Orbit Camera: Configurable virtual camera with adjustable FOV, near/far planes, and procedural orbit logic.
  • Procedural Scene Elements: Includes a built-in ground plane with checkerboard textures for spatial reference and a central cube.
  • Modular Architecture: Clean separation between core logic, rendering, streaming, and web interfaces for easy extensibility.

πŸ“ Project Structure

Multi-Camera-Simulation-Engine/
β”œβ”€β”€ config/
β”‚   └── settings.json
β”œβ”€β”€ core/
β”‚   β”œβ”€β”€ app.py
β”‚   └── state.py
β”œβ”€β”€ doc/
β”‚   β”œβ”€β”€ image003_1.png
β”‚   β”œβ”€β”€ image003_2.png
β”‚   └── video003_01.mp4
β”œβ”€β”€ doc.me/
β”œβ”€β”€ effects/
β”‚   └── image_effects.py
β”œβ”€β”€ render/
β”‚   β”œβ”€β”€ camera.py
β”‚   └── renderer.py
β”œβ”€β”€ stream/
β”‚   └── mjpeg_stream.py
β”œβ”€β”€ utils/
β”‚   └── helpers.py
β”œβ”€β”€ web/
β”‚   β”œβ”€β”€ routes.py
β”‚   β”œβ”€β”€ static/
β”‚   └── templates/
β”‚       └── index.html
β”œβ”€β”€ LICENSE
β”œβ”€β”€ main.py
β”œβ”€β”€ README.md
β”œβ”€β”€ repo_info.txt
└── requirements.txt
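The layout above references `config/settings.json`, which `AppState` loads at startup. The actual schema is defined by the repository; as a rough illustration only, a multi-camera configuration could plausibly take a shape like this (all field names here are hypothetical):

```json
{
  "dashboard_port": 5000,
  "cameras": [
    { "name": "cam_front", "port": 5001, "fov": 60, "resolution": [640, 480] },
    { "name": "cam_top",   "port": 5002, "fov": 45, "resolution": [640, 480] }
  ]
}
```

Consult the real `config/settings.json` in the repository for the authoritative keys.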

πŸ› οΈ Installation & Setup

Prerequisites

  • Python 3.8+
  • OpenGL 3.3+ compatible graphics hardware
  • GLFW library (usually handled via glfw python package, but requires system-level OpenGL drivers)

1. Clone the Repository

git clone https://github.com/ahmadrezarazian/Multi-Camera-Simulation-Engine.git
cd Multi-Camera-Simulation-Engine

2. Prepare Environment

# Create and activate virtual environment
python -m venv venv
source venv/bin/activate  # Linux/macOS
# or
venv\Scripts\activate     # Windows

3. Install Dependencies

pip install -r requirements.txt

πŸš€ Quick Start

Launch the engine with a single command:

python main.py

Once running, the engine starts a local Flask server hosting the main web dashboard on port 5000. Each additional camera is assigned its own MJPEG stream port, starting from 5001.

Live Stream Preview

When you access the direct stream or the web dashboard, you will see the real-time offscreen rendered view. Version 1.3.1 introduces multi-camera support, allowing for independent streams from multiple virtual cameras:

Multi-Camera Stream Screenshot (Note: Example of multiple independent camera feeds served via MJPEG over HTTP)
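In practice, a standard client such as OpenCV's `VideoCapture` or a browser handles the MJPEG stream for you. For illustration, here is a simplified, dependency-free sketch of how a client could split raw MJPEG bytes into individual JPEG images; this helper is not part of the repository:

```python
def extract_jpeg_frames(buffer: bytes):
    """Split a raw MJPEG byte stream into individual JPEG images by
    scanning for the JPEG start-of-image (FF D8) and end-of-image
    (FF D9) markers. Simplified: assumes no embedded thumbnails."""
    frames = []
    start = 0
    while True:
        soi = buffer.find(b"\xff\xd8", start)
        if soi == -1:
            break
        eoi = buffer.find(b"\xff\xd9", soi + 2)
        if eoi == -1:
            break  # incomplete trailing frame; wait for more data
        frames.append(buffer[soi:eoi + 2])
        start = eoi + 2
    return frames
```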

🧠 Technical Architecture

The Multi-Camera Simulation Engine is built around a modular architecture to ensure scalability:

  • MultiCamSimApp (core/app.py): The central orchestrator. It initializes the AppState, Renderer, and MjpegStreamer and registers the web routes.
  • AppState (core/state.py): Manages global application state and loads configuration from settings.json.
  • Renderer (render/renderer.py): The heart of the 3D visualization. It handles:
    • GLFW Context Management: Initializes a hidden GLFW window and manages the OpenGL context, crucial for thread-safe rendering in a multi-threaded Flask environment.
    • Shader Program: Compiles GLSL vertex and fragment shaders for Phong shading, enabling realistic lighting (ambient, diffuse, specular) and object coloring (including a checkerboard ground plane).
    • Geometry Buffers: Sets up Vertex Array Objects (VAOs), Vertex Buffer Objects (VBOs), and Element Buffer Objects (EBOs) for rendering 3D objects.
    • Framebuffer Object (FBO): Renders directly to an off-screen framebuffer, allowing the rendered image to be read back into a NumPy array without displaying a window.
  • MjpegStreamer (stream/mjpeg_stream.py): Responsible for encoding rendered frames into MJPEG format and serving them over HTTP using OpenCV and Flask Response.

🚧 Limitations & Future Work

Current Limitations

  • Basic Scene: The 3D scene is currently limited to a cube and a ground plane.
  • Passive Interface: The web dashboard is read-only; no interactive controls for the camera are available yet.

Future Enhancements

  • Model Loading: Support for loading complex 3D models (OBJ, GLTF).
  • Interactive UI: Real-time controls for camera movement, FOV, and lighting parameters via the web dashboard.
  • Post-Processing: Integration of image effects (noise, lens distortion, color correction).
  • Plugin System: Formal plugin system for adding new rendering effects or camera types.
  • REST API: Programmatic control of simulation parameters via a dedicated API.

🀝 Contributing

Contributions are welcome, whether you're adding 3D model loading, improving the UI, or implementing new post-processing effects.

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

πŸ“„ License

Distributed under the MIT License. See LICENSE for more information.

Author

Sayed Ahmadreza Razian, PhD

LinkedIn
https://www.linkedin.com/in/ahmadrezarazian/

Google Scholar
https://scholar.google.com/citations?user=Dh9Iy2YAAAAJ

Email
AhmadrezaRazian@gmail.com

Feel free to contact me for collaboration or questions.


Developed for advanced imaging system simulation and computer vision research.
