Bio-Inspired Robot: Triceratops (VSLAM)

To enable safe, autonomous indoor mobility, we enhanced the robot's environmental awareness with an Intel RealSense D435 depth camera. By integrating Nvidia Isaac ROS VSLAM and AprilTag-based localization, the robot can build environmental maps in real time and accurately determine its own position.


Isaac ROS VSLAM

The Triceratops robot employs an Intel RealSense D435 depth camera with Isaac ROS Visual SLAM for real-time localization. This stereo vision system, combined with the robot's IMU, enables precise Visual-Inertial Odometry through GPU-accelerated feature matching. The solution provides reliable odometry and simultaneous mapping, ideal for GPS-denied indoor environments.
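The depth sensing that underpins this stereo system comes down to triangulation: depth equals focal length times baseline divided by disparity. A minimal sketch of that relationship (the focal length, baseline, and disparity values below are illustrative, not the D435's actual calibration):

```python
# Stereo triangulation: Z = f * B / d
# f: focal length in pixels, B: baseline in meters, d: disparity in pixels.
# The numbers below are made up for illustration, not D435 calibration data.

def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point from its disparity between the left/right images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example: 640-px focal length, 50 mm baseline, 16-px disparity -> 2.0 m
depth = stereo_depth(640.0, 0.05, 16.0)
print(f"{depth:.2f} m")  # 2.00 m
```

Points that are farther away produce smaller disparities, which is why stereo depth error grows with range.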

Isaac ROS VSLAM in action


AprilTag Localization System & Nav2

The AprilTag localization system determines the robot's position through coordinate transformations. When an AprilTag is detected, the system combines the base_to_tag transform (from the camera detection) with the map_to_tag transform (from pre-configured map data) to recover the robot's pose in the map frame.
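The transform chain above can be sketched with homogeneous matrices: the robot's pose in the map frame is the known map-to-tag transform composed with the inverse of the detected base-to-tag transform. A minimal 2D sketch with made-up poses (a real system would use tf2 and full 3D rotations):

```python
import numpy as np

def se2(x: float, y: float, theta: float) -> np.ndarray:
    """3x3 homogeneous transform for a 2D pose (x, y, heading theta)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

# Known from the pre-configured map: where the tag sits in the map frame.
T_map_tag = se2(3.0, 1.0, 0.0)
# Measured by the camera: where the tag sits relative to the robot base.
T_base_tag = se2(1.0, 0.0, 0.0)

# Robot pose in the map: map->tag composed with the inverse of base->tag.
T_map_base = T_map_tag @ np.linalg.inv(T_base_tag)
x, y = T_map_base[0, 2], T_map_base[1, 2]
print(x, y)  # robot at (2.0, 1.0): one meter behind the tag in the map frame
```

The same composition holds in 3D; only the matrices grow to 4x4 and the rotations become full SO(3) rotations.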

The Triceratops robot also integrates ROS 2’s Nav2 navigation framework, utilizing a DWB (Dynamic Window Based) local planner for real-time obstacle avoidance and a global planner to calculate optimal routes.
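The core idea behind a dynamic-window local planner can be sketched in a few lines: sample velocity commands from a reachable window, forward-simulate each for a short horizon, and score the resulting trajectories. This toy version uses a single goal-distance critic; the real DWB controller combines multiple critics such as obstacle cost and path alignment, and the numbers below are illustrative only:

```python
import math

def simulate(x, y, th, v, w, dt=0.1, steps=10):
    """Forward-simulate a constant (v, w) command with a unicycle model."""
    for _ in range(steps):
        x += v * math.cos(th) * dt
        y += v * math.sin(th) * dt
        th += w * dt
    return x, y, th

def best_command(pose, goal, v_window, w_window):
    """Pick the sampled (v, w) whose simulated endpoint is closest to the goal."""
    best, best_cost = None, float("inf")
    for v in v_window:
        for w in w_window:
            x, y, _ = simulate(*pose, v, w)
            cost = math.hypot(goal[0] - x, goal[1] - y)
            if cost < best_cost:
                best, best_cost = (v, w), cost
    return best

# Robot at the origin facing +x, goal straight ahead: driving forward wins.
cmd = best_command((0.0, 0.0, 0.0), (1.0, 0.0),
                   v_window=[0.0, 0.2, 0.4], w_window=[-0.5, 0.0, 0.5])
print(cmd)  # (0.4, 0.0)
```

Because the window is recomputed every control cycle from the robot's current velocity and acceleration limits, the planner reacts to obstacles in real time while the global planner keeps providing the route to follow.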

AprilTag Localization Test


Demo Videos

Triceratops Moving Video
Visual Navigation Demo Video