CS 5023: Intro to Intelligent Robotics
This project simulates a TurtleBot 2 in a Gazebo world (room + hallway) using ROS Melodic.
The robot demonstrates reactive control behaviors and simultaneously performs occupancy grid mapping with slam_gmapping.
Behaviors implemented (priority order):
- Halt – Stop when bumpers detect collision.
- Keyboard teleoperation – Accept manual /cmd_vel inputs when safe.
- Escape – If symmetric obstacles are detected within ~1 ft ahead, back up and turn ~180° (±30°).
- Avoid – Turn away from asymmetric obstacles within 1 ft in front.
- Random turn – After every ~1 ft forward, turn randomly ±15°.
- Drive forward – Default motion.
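The priority ordering above amounts to subsumption-style arbitration: the highest-priority behavior whose trigger condition holds wins the actuators. A minimal sketch, assuming illustrative sensor flags and behavior names (not the project's actual API):

```python
# Sketch of priority-based behavior arbitration (subsumption style).
# The flag names and return strings are illustrative assumptions.

def select_behavior(bumper_hit, teleop_cmd, symmetric_obstacle,
                    asymmetric_obstacle, distance_since_turn_ft):
    """Return the highest-priority behavior whose trigger fires."""
    if bumper_hit:
        return "halt"            # bumper contact stops everything
    if teleop_cmd is not None:
        return "teleop"          # manual /cmd_vel input when safe
    if symmetric_obstacle:
        return "escape"          # back up, turn ~180 deg (+/- 30 deg)
    if asymmetric_obstacle:
        return "avoid"           # turn away from the nearer side
    if distance_since_turn_ft >= 1.0:
        return "random_turn"     # +/- 15 deg heading change
    return "forward"             # default motion

print(select_behavior(True, (0.2, 0.0), True, True, 2.0))  # halt
print(select_behavior(False, None, False, False, 0.4))     # forward
```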
- Room: 15 ft × 10 ft with a 5 ft doorway.
- Hallway: 5 ft wide, L-shaped; 20 ft long branch, 15 ft short branch, doorway 5 ft from the end.
- Robot: TurtleBot 2 with Kobuki base + Kinect sensor.
- Gazebo: World built with the Gazebo Building Editor (project1_world.world).
catkin_wsp1/
├── src/
│ ├── project1_pkg/
│ │ ├── launch/
│ │ │ ├── project1_world.launch
│ │ │ ├── project1_mapping.launch
│ │ ├── rviz/
│ │ │ └── project1_mapping.rviz
│ │ ├── scripts/
│ │ │ └── reactive_controller.py
│ │ ├── worlds/
│ │ │ └── project1_world.world
│ │ ├── models/ (if custom walls/room models created)
│ │ ├── CMakeLists.txt
│ │ └── package.xml
cd ~/catkin_wsp1/src
git clone https://github.com//.git project1_pkg
cd ~/catkin_wsp1
catkin_make
Source Workspace
source devel/setup.bash
Ensure Dependencies Installed
On CNS Linux machines, make sure the following packages are available (most preinstalled):
ROS Melodic
Gazebo 9
TurtleBot 2 packages:
turtlebot_gazebo
turtlebot_description
kobuki_description
depthimage_to_laserscan
slam_gmapping
yocs_controllers, yocs_cmd_vel_mux
If any are missing, copy the missing package into ~/catkin_wsp1/src/ from the default workspace (substitute the package name):
cp -r ~/catkin_ws/src/<package_name> ~/catkin_wsp1/src/
🚀 Running the Project
- Start Simulation + Controller
roslaunch project1_pkg project1_world.launch
Starts Gazebo with the world.
Spawns TurtleBot.
Launches depthimage_to_laserscan.
Runs reactive_controller.py.
- Start Mapping with RViz
roslaunch project1_pkg project1_mapping.launch
Runs slam_gmapping.
Opens RViz with /map, /scan, /odom, and TF preloaded.
- Teleoperation (optional)
rosrun turtlebot_teleop turtlebot_teleop_key
Use arrow keys (or i, j, k, l depending on teleop mode).
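The teleop override only forwards commands that are safe: forward motion is blocked when the nearest obstacle ahead is inside the ~1 ft (0.3 m) threshold. A minimal sketch of such a gate, with illustrative names (not the project's actual implementation):

```python
# Sketch of a teleop safety gate: rotation passes through, but forward
# commands are zeroed when an obstacle is closer than ~1 ft (0.3 m).
# Function and constant names are assumptions for illustration.

SAFE_DISTANCE_M = 0.3  # ~1 ft

def gate_teleop(linear_x, angular_z, min_range_ahead_m):
    """Return a (linear, angular) command with unsafe forward motion blocked."""
    if linear_x > 0.0 and min_range_ahead_m < SAFE_DISTANCE_M:
        linear_x = 0.0  # block driving into the obstacle
    return linear_x, angular_z
```

Backing away from an obstacle (negative linear velocity) is deliberately left unblocked, since it moves the robot out of danger.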
📊 Visualization in RViz
/map → Occupancy grid (mapping).
/scan → 2D laser scan (from Kinect depth camera).
/odom → Robot odometry (blue arrow).
TF tree: map → odom → base_footprint.
🤖 Reactive Controller (reactive_controller.py)
Bumper: Stops robot + triggers escape.
Laser scan: Used for symmetric escape, wall avoidance, and obstacle avoidance.
Odometry: Tracks distance → triggers random turn after ~0.3 m (~1 ft).
Teleop override: Accepts keyboard inputs but blocks unsafe ones.
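The escape-vs-avoid distinction comes from whether the scan shows obstacles on both sides of the forward field of view or only one. A minimal sketch of that classification, assuming the usual LaserScan convention that lower indices are to the robot's right (the threshold and split point are illustrative):

```python
# Sketch of classifying laser-scan obstacles within ~1 ft (0.3 m) as
# symmetric (both sides blocked -> escape) or one-sided (-> avoid).
# The half-and-half split of the scan is an assumption for illustration.

THRESHOLD_M = 0.3  # ~1 ft

def classify_obstacle(ranges):
    """ranges: forward-facing range readings, right-to-left order."""
    mid = len(ranges) // 2
    right_blocked = min(ranges[:mid]) < THRESHOLD_M
    left_blocked = min(ranges[mid:]) < THRESHOLD_M
    if left_blocked and right_blocked:
        return "symmetric"   # triggers escape (~180 deg turn)
    if left_blocked:
        return "left"        # triggers avoid: turn right
    if right_blocked:
        return "right"       # triggers avoid: turn left
    return "clear"
```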
🧪 Testing
Robot should move autonomously through the room + hallway.
Mapping builds incrementally in RViz.
Teleop should override autonomous driving.
The robot should escape symmetric obstacles (~180° turn), avoid asymmetric ones, and execute a random turn roughly every 1 ft of forward travel.
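The two turn behaviors under test only constrain a range, not a distribution. A minimal sketch of picking turn targets, assuming a uniform draw over each spec'd interval (the spec gives only ~180° ± 30° for escape and ± 15° for the random turn):

```python
import random

# Sketch of turn-target selection for escape and random-turn behaviors.
# Uniform sampling is an assumption; the spec only fixes the intervals.

def escape_turn_deg(rng=random):
    """Escape: roughly reverse heading, 180 deg +/- 30 deg."""
    return 180.0 + rng.uniform(-30.0, 30.0)

def random_turn_deg(rng=random):
    """Random turn after ~1 ft of forward travel: +/- 15 deg."""
    return rng.uniform(-15.0, 15.0)
```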
👥 Team Members
Subhash Chandra
Brandon Aviles
References:
[1] University of Oklahoma, School of Computer Science, CS 4023/5023 Intelligent Robotics – Fall 2024 Project 1: Reactive Robotics Using ROS and TurtleBot (Code Base), Norman, OK, USA, 2024.
[2] Open Robotics, Robot Operating System (ROS) Documentation. [Online]. Available: https://wiki.ros.org
[3] Open Robotics, Gazebo Simulation Environment Documentation. [Online]. Available: https://gazebosim.org
[4] R. A. Brooks, “A Robust Layered Control System for a Mobile Robot,” IEEE J. Robotics and Automation, vol. 2, no. 1, pp. 14–23, 1986.
[5] J. L. Jones, A. M. Flynn, and B. A. Seiger, Mobile Robots: Inspiration to Implementation, 2nd ed. Natick, MA, USA: A K Peters, 1999.
[6] R. R. Murphy, Introduction to AI Robotics. Cambridge, MA, USA: MIT Press, 2000.