A framework for running a model on a Raspberry Pi to detect people and stop a drone when a person is detected, preventing the drone from flying over them. Tested on Arch Linux and Debian/Raspbian, but the Docker version should run on any amd64/arm64 platform that supports Docker.
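The core safety behavior (stop the drone whenever a person is detected) can be sketched as follows. Note that `person_detected` and the detection format are hypothetical illustrations, not the project's actual API:

```python
def person_detected(detections, confidence_threshold=0.5):
    """Return True if any detection is a person above the confidence threshold.

    `detections` is a list of (class_name, confidence) pairs, a hypothetical
    stand-in for the output of a TFLite YOLO inference step.
    """
    return any(cls == "person" and conf >= confidence_threshold
               for cls, conf in detections)

# Example: a person at 90% confidence triggers a stop; a dog alone does not.
print(person_detected([("dog", 0.8), ("person", 0.9)]))  # True
print(person_detected([("dog", 0.8)]))                   # False
```

When this returns `True`, the framework commands the drone to stop over its telemetry connection.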
All Python source code is licensed under the MIT License.
To convert a YOLO model to TFLite for use in the framework, use [https://github.com/NathanCodeGitHub/Updated-Python-3.12-Ultralytics-PyTorch-To-TensorFlow-Lite-Converter].
1. Install Picamera 2 using `apt install libcamera`
2. Connect the desired camera module to the Raspberry Pi
3. Ensure the camera is functioning using `libcamera-hello`
4. Enable UART0 and UART3 using `dtoverlay` (`dtoverlay=uart0-pi5` and `dtoverlay=uart3-pi5`) to have a connection to both the drone's telemetry port and a telemetry transceiver module
5. Modify `/boot/firmware/config.txt` to make these changes permanent.
6. Ensure the connection between the Pi and both the device receiving the telemetry and the drone works (same baud rate, correct connection pins).
   - You may need to create a cable of your own, connecting the TX/RX/Ground pins of the telemetry module and the drone's telemetry port to the respective GPIO pins.
   - GPIO 14 and 15 are TX/RX for UART0/ttyAMA0, and GPIO 9 and 10 are TX/RX for UART3/ttyAMA3. You can use other UARTs as long as you specify the correct tty when running the script.
   - Make sure you do NOT connect the telemetry port to a 5V pin on the Pi, as this might overload one or both devices.
7. Run [prepare-pi.sh] to create a Python virtual environment, install the necessary dependencies, and run the module.
8. Telemetry should be available via MAVLink if you connect the other end of the telemetry module to a computer, for use in Mission Planner and similar ground control software.
9. An object detection live feed should be available at the Pi's IP address on its network connection, on port 8080.
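To make the UART changes permanent, the two `dtoverlay` lines from above are appended to `/boot/firmware/config.txt`:

```
# /boot/firmware/config.txt -- enable UART0 and UART3 on a Pi 5
dtoverlay=uart0-pi5
dtoverlay=uart3-pi5
```

Reboot after editing for the overlays to take effect.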
1. Perform steps 2-5 to prepare the hardware.
2. Run `apt install docker.io` to install Docker.
3. Clone this project and run [docker-prepare-pi.sh]
4. OR, run the following command to use the prebuilt image:

   ```sh
   docker run -t \
     -p 8080:8080 \
     --device /dev/video0:/dev/video0 \
     -e DRONE_PORT=/dev/ttyAMA3 \
     -e DRONE_OUTPUT=/dev/ttyAMA0 \
     ghcr.io/bluejaycoder/model:main
   ```

5. The live feed will also be available on port 8080 on the Pi's IP address.
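The `DRONE_PORT` and `DRONE_OUTPUT` environment variables select the serial devices inside the container. A minimal sketch of how such variables might be read, with defaults matching the `docker run` example above (the project's actual handling may differ):

```python
import os

# Hypothetical defaults matching the docker run example: UART3 talks to the
# drone's telemetry port, UART0 forwards telemetry to the transceiver module.
drone_port = os.environ.get("DRONE_PORT", "/dev/ttyAMA3")
drone_output = os.environ.get("DRONE_OUTPUT", "/dev/ttyAMA0")
print(drone_port, drone_output)
```

Passing different `-e` values at `docker run` time redirects the framework to other UARTs without rebuilding the image.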
1. Run `apt install docker.io` to install Docker (or otherwise install Docker according to your system).
2. Run [docker-prepare.sh] to build and run with sane defaults, OR
3. Run the following commands to build and then run the container (set `DRONE_PORT` and `DRONE_OUTPUT` to the serial devices connected to the drone and the telemetry output):

   ```sh
   docker build . -t bluejaycoder/model:main
   docker run -t \
     -p 8080:8080 \
     --device /dev/video0:/dev/video0 \
     -e DRONE_PORT= \
     -e DRONE_OUTPUT= \
     bluejaycoder/model:main
   ```

4. OR, run this to pull the prebuilt image:

   ```sh
   docker run -t \
     -p 8080:8080 \
     --device /dev/video0:/dev/video0 \
     -e DRONE_PORT= \
     -e DRONE_OUTPUT= \
     ghcr.io/bluejaycoder/model:main
   ```

5. The live feed will be available at http://127.0.0.1:8080.