Bio-Inspired Robot: Triceratops (VSLAM)

To enable safe, autonomous indoor mobility, we enhanced the robot's environmental awareness with an Intel RealSense D435 depth camera. By integrating NVIDIA Isaac ROS VSLAM with AprilTag-based localization, the robot can build real-time maps of its surroundings and accurately estimate its position within them.

Isaac ROS VSLAM

The Triceratops robot employs an Intel RealSense D435 depth camera with Isaac ROS Visual SLAM for real-time localization. This stereo vision system, combined with the robot's IMU sensor, enables precise Visual-Inertial Odometry through GPU-accelerated feature matching. The solution provides reliable odometry and simultaneous mapping, ideal for GPS-limited indoor environments.
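The D435's stereo pair recovers depth by triangulation: a feature seen in both cameras shifts by a disparity inversely proportional to its distance. A minimal sketch of that relation follows; the focal length and baseline are illustrative values, not the D435's actual calibration.

```python
import numpy as np

def disparity_to_depth(disparity, focal_px=640.0, baseline_m=0.05):
    """Convert stereo disparity (pixels) to depth (meters) via Z = f * B / d.

    focal_px and baseline_m are illustrative, not real D435 calibration.
    Zero disparity (no match / infinitely far) maps to inf.
    """
    disparity = np.asarray(disparity, dtype=float)
    with np.errstate(divide="ignore"):
        return np.where(disparity > 0,
                        focal_px * baseline_m / disparity,
                        np.inf)

# A feature matched 32 px apart between the left and right images:
depth = disparity_to_depth(32.0)  # 640 * 0.05 / 32 = 1.0 m
```

Because depth error grows with distance (disparity shrinks), stereo VSLAM is most accurate at the close-to-mid ranges typical of indoor navigation.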


AprilTag Localization System & Nav2

The AprilTag localization system determines the robot's position through a series of coordinate transformations. When the system detects an AprilTag, it uses two key transforms: the base_to_tag transform, obtained from camera detection, which relates the robot's base to the tag, and the map_to_tag transform, loaded from configuration files, which defines the tag's known position in the map. Composing map_to_tag with the inverse of base_to_tag yields the robot's pose in the map frame.
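The transform chain above can be sketched with 4x4 homogeneous matrices. The tag pose and detection values below are illustrative, not the project's actual configuration.

```python
import numpy as np

def make_transform(yaw, tx, ty):
    """Planar pose (yaw, translation) embedded in a 4x4 homogeneous matrix."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    T[0, 3], T[1, 3] = tx, ty
    return T

# map_to_tag: the tag's known pose in the map (would come from config files).
map_to_tag = make_transform(yaw=0.0, tx=2.0, ty=1.0)

# base_to_tag: where the camera currently sees the tag relative to the robot base.
base_to_tag = make_transform(yaw=0.0, tx=0.5, ty=0.0)

# Robot pose in the map frame: map_to_base = map_to_tag @ inv(base_to_tag)
map_to_base = map_to_tag @ np.linalg.inv(base_to_tag)
print(map_to_base[:2, 3])  # robot at (1.5, 1.0): 0.5 m behind the tag
```

In a real ROS 2 system this composition is handled by tf2 once both transforms are published; the sketch just makes the underlying matrix algebra explicit.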

The Triceratops robot integrates ROS 2's Nav2 navigation framework, using the DWB controller (a Dynamic Window Approach-based local planner) for real-time obstacle avoidance, together with a global planner that computes optimal routes across the map.
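As a rough illustration of the Dynamic Window idea behind DWB (not Nav2's actual implementation), candidate velocity commands are forward-simulated over a short horizon and scored; commands whose trajectories pass too close to an obstacle are discarded, and the best-scoring survivor is executed. All names, weights, and sampling ranges here are hypothetical.

```python
import numpy as np

def rollout(x, y, theta, v, w, dt=0.1, steps=10):
    """Forward-simulate a unicycle model for one (v, w) velocity command."""
    poses = []
    for _ in range(steps):
        x += v * np.cos(theta) * dt
        y += v * np.sin(theta) * dt
        theta += w * dt
        poses.append((x, y))
    return np.array(poses)

def score(traj, goal, obstacles, clearance=0.3):
    """Lower is better: distance from trajectory endpoint to the goal.
    Trajectories that violate the clearance radius are infeasible (inf)."""
    for ox, oy in obstacles:
        if np.min(np.hypot(traj[:, 0] - ox, traj[:, 1] - oy)) < clearance:
            return np.inf  # collision risk: discard this command
    end = traj[-1]
    return np.hypot(end[0] - goal[0], end[1] - goal[1])

def pick_command(goal, obstacles):
    """Sample the dynamic window of (v, w) pairs and keep the best command."""
    best, best_cmd = np.inf, (0.0, 0.0)
    for v in np.linspace(0.1, 0.5, 5):
        for w in np.linspace(-1.0, 1.0, 9):
            s = score(rollout(0.0, 0.0, 0.0, v, w), goal, obstacles)
            if s < best:
                best, best_cmd = s, (v, w)
    return best_cmd

v, w = pick_command(goal=(1.0, 0.0), obstacles=[(0.3, 0.1)])
```

The real DWB controller scores trajectories with a configurable set of critics (path alignment, obstacle cost, goal distance) rather than the single cost used here, but the sample-simulate-score loop is the same shape.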

AprilTag Localization Test

Demo Videos

Triceratops Moving Video

Visual Navigation Demo Video

© Copyright 2025 Bruce Lin