Bio-Inspired Robot: Triceratops (VSLAM)

To enable safe, autonomous indoor mobility, we enhanced the robot's environmental awareness with an Intel RealSense D435 depth camera. By integrating NVIDIA Isaac ROS VSLAM and AprilTag-based localization, the robot can build real-time environmental maps and accurately determine its own position. Finally, by incorporating the Nav2 navigation stack, the robot can independently plan routes, avoid obstacles, and reach target destinations.


Isaac ROS VSLAM

The Triceratops robot employs an Intel RealSense D435 depth camera with Isaac ROS Visual SLAM for real-time localization. This stereo vision system, combined with the robot's IMU, enables precise visual-inertial odometry through GPU-accelerated feature matching. The result is reliable odometry with simultaneous mapping, well suited to GPS-denied indoor environments.
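
As a rough sketch, the Isaac ROS Visual SLAM node is typically brought up as a composable node and wired to the RealSense infrared streams. The launch file below illustrates the idea only; parameter and topic names vary between Isaac ROS and RealSense driver releases, so the ones shown here (enable_imu_fusion, the infra image remappings) are assumptions to be checked against the installed package documentation.

```python
# Hedged sketch of an Isaac ROS Visual SLAM launch file for the D435.
# Plugin, parameter, and topic names are assumptions based on typical
# Isaac ROS releases; verify against your isaac_ros_visual_slam docs.
from launch import LaunchDescription
from launch_ros.actions import ComposableNodeContainer
from launch_ros.descriptions import ComposableNode


def generate_launch_description():
    visual_slam_node = ComposableNode(
        package='isaac_ros_visual_slam',
        plugin='nvidia::isaac_ros::visual_slam::VisualSlamNode',
        name='visual_slam_node',
        parameters=[{
            'enable_imu_fusion': True,   # fuse the RealSense IMU for VIO
            'rectified_images': True,    # D435 infra streams come rectified
            'map_frame': 'map',
            'odom_frame': 'odom',
            'base_frame': 'base_link',
        }],
        # Map the node's stereo inputs to the RealSense infrared streams
        # (topic names depend on the realsense2_camera configuration).
        remappings=[
            ('stereo_camera/left/image', '/camera/infra1/image_rect_raw'),
            ('stereo_camera/right/image', '/camera/infra2/image_rect_raw'),
            ('stereo_camera/left/camera_info', '/camera/infra1/camera_info'),
            ('stereo_camera/right/camera_info', '/camera/infra2/camera_info'),
        ],
    )
    container = ComposableNodeContainer(
        name='visual_slam_launch_container',
        namespace='',
        package='rclcpp_components',
        executable='component_container',
        composable_node_descriptions=[visual_slam_node],
        output='screen',
    )
    return LaunchDescription([container])
```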


AprilTag Localization System & Nav2

The AprilTag localization system determines the robot's position through a chain of coordinate transformations. When the system detects an AprilTag, it uses two key transforms: base_to_tag, obtained from the camera detection, which describes the tag's pose relative to the robot's base; and map_to_tag, loaded from a configuration file, which defines the tag's known pose in the map. Inverting base_to_tag and composing the result with map_to_tag yields map_to_base: the AprilTag serves as a fixed reference point that bridges the robot's local coordinate system with the global map, giving the robot's precise pose in the map frame.
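
A minimal NumPy sketch of this transform chain is shown below; the numeric poses are purely illustrative, not real calibration data.

```python
import numpy as np


def invert_transform(T):
    """Invert a 4x4 homogeneous transform."""
    R, t = T[:3, :3], T[:3, 3]
    T_inv = np.eye(4)
    T_inv[:3, :3] = R.T
    T_inv[:3, 3] = -R.T @ t
    return T_inv


# map_to_tag: tag pose in the map frame (loaded from a config file).
# base_to_tag: tag pose in the robot base frame (from camera detection).
# Placeholder values with identity rotations, for illustration only.
map_to_tag = np.eye(4)
map_to_tag[:3, 3] = [3.0, 2.0, 0.5]

base_to_tag = np.eye(4)
base_to_tag[:3, 3] = [1.0, 0.0, 0.5]

# Chain: map_to_base = map_to_tag * tag_to_base = map_to_tag * inv(base_to_tag)
map_to_base = map_to_tag @ invert_transform(base_to_tag)
print(map_to_base[:3, 3])  # robot position in the map frame -> [2. 2. 0.]
```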

The Triceratops robot integrates ROS 2's Nav2 navigation framework, using the DWB controller (a successor to the Dynamic Window Approach) as its local planner for real-time obstacle avoidance, alongside a global planner that computes optimal routes to the goal.
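
For illustration, a goal can be sent to Nav2 programmatically. The sketch below uses the nav2_simple_commander helper package and assumes a running, localized Nav2 stack; the goal coordinates are placeholders.

```python
# Hedged sketch: send a navigation goal through Nav2's Python helper.
# Assumes nav2_simple_commander is installed and Nav2 is active;
# the goal pose below is a placeholder, not a real waypoint.
import rclpy
from geometry_msgs.msg import PoseStamped
from nav2_simple_commander.robot_navigator import BasicNavigator, TaskResult


def main():
    rclpy.init()
    navigator = BasicNavigator()
    navigator.waitUntilNav2Active()  # block until Nav2 lifecycle nodes are up

    # Build a goal pose in the map frame (placeholder coordinates).
    goal = PoseStamped()
    goal.header.frame_id = 'map'
    goal.header.stamp = navigator.get_clock().now().to_msg()
    goal.pose.position.x = 2.0
    goal.pose.position.y = 1.0
    goal.pose.orientation.w = 1.0  # facing along +x

    navigator.goToPose(goal)  # global plan + DWB local control
    while not navigator.isTaskComplete():
        pass  # progress available via navigator.getFeedback()

    if navigator.getResult() == TaskResult.SUCCEEDED:
        print('Goal reached')
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```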


Triceratops Moving Video

Visual Navigation Demo Video

© 2024 Bruce Lin