A ROS2 workspace for the Fastbot mobile robot platform with differential drive, LiDAR sensor integration, and Gazebo simulation support.
This repository contains ROS2 packages for controlling and simulating the Fastbot robot. The project includes robot description files, hardware drivers, simulation environments, and launch configurations.
This project is inspired by The Construct's FastBot - an open-source ROS 2 robot kit for hands-on robotics learning. Want to learn ROS 2 and robotics? Check out The Construct's courses for comprehensive hands-on training.
This repository provides the complete ROS 2 software stack for building and programming your own FastBot robot. 🤖
Robot description package containing URDF/Xacro files and 3D models for the Fastbot platform.
- URDF/Xacro robot models
- 3D meshes and visual models
- OnShape CAD integration
Launch configurations to bring up the Fastbot robot with all sensors and actuators.
- System-level launch files
- Hardware initialization
- Sensor and actuator startup
Gazebo simulation environment for the Fastbot robot.
- Gazebo world files
- Simulation models
- Simulated sensor configurations
Python package for differential drive control via serial communication.
- Serial motor driver node
- Differential drive kinematics
- Motor control interface
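The kinematics layer maps a commanded body twist (linear velocity v, angular velocity ω) to left and right wheel speeds. A minimal sketch of that conversion, using illustrative wheel dimensions (the real values live in the robot description, not here):

```python
# Differential-drive inverse kinematics: body twist -> wheel angular velocities.
# WHEEL_RADIUS and WHEEL_SEPARATION are assumed example values, not the
# actual Fastbot dimensions.
WHEEL_RADIUS = 0.033       # meters (assumed)
WHEEL_SEPARATION = 0.17    # meters (assumed)

def twist_to_wheel_speeds(linear_x: float, angular_z: float) -> tuple[float, float]:
    """Return (left, right) wheel angular velocities in rad/s."""
    v_left = linear_x - angular_z * WHEEL_SEPARATION / 2.0
    v_right = linear_x + angular_z * WHEEL_SEPARATION / 2.0
    return v_left / WHEEL_RADIUS, v_right / WHEEL_RADIUS

# Pure rotation: wheels spin in opposite directions at equal magnitude.
left, right = twist_to_wheel_speeds(0.0, 1.0)
```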
Custom ROS2 message definitions for serial motor communication.
- Motor command messages
- Motor status messages
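If the status messages carry encoder counts (as ros_arduino_bridge-style firmware reports), a node can integrate successive tick deltas into planar odometry. A hedged sketch with assumed encoder resolution and geometry:

```python
import math

# Illustrative constants -- not the actual Fastbot calibration values.
TICKS_PER_REV = 1440
WHEEL_RADIUS = 0.033      # meters (assumed)
WHEEL_SEPARATION = 0.17   # meters (assumed)
METERS_PER_TICK = 2 * math.pi * WHEEL_RADIUS / TICKS_PER_REV

def integrate_odometry(x, y, yaw, d_left_ticks, d_right_ticks):
    """Advance pose (x, y, yaw) by one encoder delta (left, right ticks)."""
    d_left = d_left_ticks * METERS_PER_TICK
    d_right = d_right_ticks * METERS_PER_TICK
    d_center = (d_left + d_right) / 2.0
    d_yaw = (d_right - d_left) / WHEEL_SEPARATION
    # Midpoint integration: advance along the average heading of the step.
    x += d_center * math.cos(yaw + d_yaw / 2.0)
    y += d_center * math.sin(yaw + d_yaw / 2.0)
    return x, y, yaw + d_yaw
```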
ROS2 driver for Leishen LiDAR sensors.
- LiDAR data acquisition
- LiDAR configuration
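Downstream nodes typically consume the driver's output as a standard laser scan: a list of ranges, equally spaced in angle, starting at `angle_min` with step `angle_increment`. Converting a scan to Cartesian points is plain polar-to-Cartesian math; a generic sketch (field names mirror `sensor_msgs/LaserScan`, but no ROS is needed to run it):

```python
import math

def scan_to_points(ranges, angle_min, angle_increment, range_max):
    """Convert LaserScan-style ranges to (x, y) points in the sensor frame."""
    points = []
    for i, r in enumerate(ranges):
        if not (0.0 < r <= range_max):  # drop invalid or out-of-range returns
            continue
        theta = angle_min + i * angle_increment
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points
```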
The docs/ directory contains build guides and hardware documentation:
- Build Guide - Step-by-step robot build evolution
- Hardware - Physical components and fabrication files (3D printing, etc.)
See the documentation README for more details.
Install Ubuntu Server 22.04 LTS on your Raspberry Pi (4 or 5 recommended).
Install ROS2 Humble:
# Add ROS2 repository
sudo apt update && sudo apt install -y software-properties-common
sudo add-apt-repository universe
sudo apt update && sudo apt install -y curl
sudo curl -sSL https://raw.githubusercontent.com/ros/rosdistro/master/ros.key -o /usr/share/keyrings/ros-archive-keyring.gpg
echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/ros-archive-keyring.gpg] http://packages.ros.org/ros2/ubuntu $(. /etc/os-release && echo $UBUNTU_CODENAME) main" | sudo tee /etc/apt/sources.list.d/ros2.list > /dev/null
# Install ROS2 Humble
sudo apt update
sudo apt install -y ros-humble-ros-base ros-humble-demo-nodes-cpp ros-humble-teleop-twist-keyboard ros-humble-rmw-cyclonedds-cpp ros-humble-joint-state-publisher
sudo apt install -y ros-dev-tools git-all
# Add to .bashrc
echo "source /opt/ros/humble/setup.bash" >> ~/.bashrc
source ~/.bashrc
Flash the Arduino Nano with ros_arduino_bridge firmware:
- Download ros_arduino_bridge
- Open `ROSArduinoBridge.ino` in Arduino IDE
- Configure encoder pins and motor driver pins in `encoder_driver.h` and `motor_driver.h`
- Upload to Arduino Nano
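The ros_arduino_bridge firmware speaks a simple single-letter ASCII protocol over serial (commands such as `m` to set motor speeds and `e` to read encoders, terminated by a carriage return). A minimal sketch of framing such commands in Python; the exact command set should be verified against the flashed firmware, and the serial-port code shown in comments is illustrative:

```python
# Frame commands for a ros_arduino_bridge-style ASCII serial protocol.
# 'm' (set motor speeds) and 'e' (read encoders) are the typical commands;
# check the firmware's command table before relying on them.

def frame_motor_command(left: int, right: int) -> bytes:
    """Build a 'set motor speeds' command, terminated by carriage return."""
    return f"m {left} {right}\r".encode("ascii")

def frame_read_encoders() -> bytes:
    """Build a 'read encoders' request."""
    return b"e\r"

# In a real node these bytes would be written to the serial port, e.g.:
#   import serial
#   port = serial.Serial("/dev/arduino_nano", 57600, timeout=0.1)
#   port.write(frame_motor_command(100, 100))
```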
Set up udev rules for consistent device naming:
# Find Arduino identifiers
udevadm info -a -n /dev/ttyUSB0 | grep '{idVendor}\|{idProduct}'
# Create udev rule
sudo nano /etc/udev/rules.d/99-arduino.rules
# Add this line (adjust idVendor and idProduct as needed):
SUBSYSTEM=="tty", ATTRS{idVendor}=="1a86", ATTRS{idProduct}=="7523", SYMLINK+="arduino_nano"
# Reload udev rules
sudo udevadm control --reload-rules
sudo udevadm trigger
Install all required libraries for LiDAR, camera, and motor control:
# LiDAR dependencies
sudo apt install -y \
libboost-all-dev \
libpcl-dev \
libpcap-dev \
ros-humble-diagnostic-updater \
ros-humble-pcl-conversions
# Camera dependencies
sudo apt install -y \
ros-humble-v4l2-camera \
ros-humble-image-transport-plugins \
v4l-utils
Enable legacy camera driver for compatibility with ros2_v4l2_camera:
sudo nano /boot/firmware/config.txt
Set `camera_autodetect=0` and add `start_x=1` at the end:
camera_autodetect=0
# Enable camera (legacy driver)
start_x=1
Reboot after changes:
sudo reboot
Clone this repository into your ROS2 workspace:
# Create ROS2 workspace
mkdir -p ~/ros2_ws/src
cd ~/ros2_ws/src
# Clone the repository
git clone https://github.com/legalaspro/fastbot_ros2.git fastbot_ros2
# Build the workspace
cd ~/ros2_ws
colcon build --symlink-install
source install/setup.bash
Launch the complete system with all sensors and motors:
ros2 launch fastbot_bringup bringup.launch.xml
This will start:
- Serial motor driver (connects to `/dev/arduino_nano`)
- LSlidar N10 driver (connects to `/dev/lslidar`)
- Raspberry Pi camera
- All necessary transforms and configurations
After launching the robot with bringup, you can visualize the robot in RViz:
# First, launch the robot bringup (in one terminal)
ros2 launch fastbot_bringup bringup.launch.xml
# Then, launch RViz for visualization (in another terminal)
ros2 launch fastbot_description display.launch.py
The RViz display will show:
- Robot model with real-time joint states
- TF frames
- LiDAR scan data
- Camera image feed
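TF orientations are quaternions; for a planar robot only yaw matters, so an odometry transform (conventionally odom → base frame) carries a pure-yaw quaternion. A self-contained sketch of that conversion, no ROS required:

```python
import math

def yaw_to_quaternion(yaw: float) -> tuple[float, float, float, float]:
    """Return (x, y, z, w) for a rotation of `yaw` radians about the z-axis."""
    half = yaw / 2.0
    return (0.0, 0.0, math.sin(half), math.cos(half))

# Zero yaw is the identity quaternion (0, 0, 0, 1).
```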
Launch Gazebo simulation:
ros2 launch fastbot_gazebo one_fastbot_warehouse.launch.py
Control the robot with keyboard:
ros2 run teleop_twist_keyboard teleop_twist_keyboard --ros-args --remap cmd_vel:=/fastbot/cmd_vel
Launch components separately if needed:
# Motor control only
ros2 run serial_motor serial_motor_node
# LiDAR only
ros2 launch lslidar_driver lslidar_launch.py
# Camera only
ros2 run v4l2_camera v4l2_camera_node
For detailed step-by-step instructions on building the robot from scratch, see the Build Guide, which walks through the incremental evolution of the robot:
- Basic motor control setup (Arduino + motors + encoders)
- Power module integration
- Raspberry Pi integration
- LiDAR sensor addition
- Camera integration
This project is licensed under the MIT License - see the LICENSE file for details.
Dmitri Manajev

