
Welcome to my Master's Thesis

The Master's thesis was conducted in collaboration with IPH Hannover gGmbH. It focuses on developing a fusion of visual and point-cloud SLAM algorithms that enables an autonomous drone to explore a production environment safely and robustly. The drone generates a 3D scan of the factory, which serves as the foundation for a digital twin of the smart factory.

Motivation

IPH Hannover gGmbH conducts research within the Artificial Intelligence in Manufacturing leading to Sustainability and Industry 5.0 (AIMS 5.0) project. As part of this project, an autonomous drone is to be developed for creating a digital twin of the smart factory. The drone therefore needs a SLAM system that enables accurate and robust localization and mapping.

 

Most SLAM systems fuse only LiDAR measurements with IMU measurements, or only camera measurements with IMU measurements, which prevents fast, consistent, and robust mapping of the environment. ORB-SLAM3 fuses camera measurements with IMU measurements and can perform short-term, mid-term, and long-term data association. However, it cannot create a dense 3D scan map, and its tracking becomes inaccurate under very aggressive motion, as shown in the picture below.

ORBMAPPING.png

ORB-SLAM3 mapping result: a sparse point map, incapable of handling very aggressive motion.

FAST_LIO_Mapping.png

FAST-LIO2 can generate a dense 3D scan map and delivers precise local mapping results. Thanks to the efficiency of its iterated EKF, it processes data at very high speed, which allows it to handle very aggressive motion. However, it fails in degenerate environments such as a featureless long corridor, and it lacks long-term data association such as loop closure.
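The corridor failure mode can be made concrete with a small sketch. The check below is not from the thesis; it illustrates the standard eigenvalue test for LiDAR degeneracy (the function name and threshold are illustrative assumptions): in a featureless corridor, the point-to-plane registration problem loses constraint along the corridor axis, so the smallest eigenvalue of the Gauss-Newton Hessian collapses.

```python
import numpy as np

def is_degenerate(J, threshold=100.0):
    """J: (N, 6) stacked Jacobians of point-to-plane residuals.

    Returns True when the weakest direction of the 6x6 Hessian
    J^T J is under-constrained (illustrative threshold).
    """
    H = J.T @ J                      # 6x6 Gauss-Newton Hessian
    eigvals = np.linalg.eigvalsh(H)  # eigenvalues in ascending order
    return eigvals[0] < threshold    # smallest eigenvalue = weakest direction

# Toy example: a corridor along x yields residuals that are
# insensitive to translation along x, so that column is zero.
rng = np.random.default_rng(0)
J_corridor = rng.normal(size=(500, 6))
J_corridor[:, 0] = 0.0               # no constraint on x -> degenerate
print(is_degenerate(J_corridor))     # True
```

A well-conditioned scan (all six columns populated) passes the same test, which is why open, structured areas pose no problem for FAST-LIO2 while long corridors do.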

Concept

The hardware of the autonomous drone is depicted in the following pictures. It is equipped with two fisheye cameras at the front. At its tail, a lightweight LiDAR is mounted to gain a large sensor field of view.

topview.jpeg
sideview.jpeg

The master’s thesis focuses on developing a hybrid SLAM system that integrates ORB-SLAM3 and FAST-LIO2 to enable robust state estimation and accurate mapping for autonomous drones. By combining the strengths of LiDAR-Inertial Odometry (FAST-LIO2) and Visual-Inertial SLAM (ORB-SLAM3), the system ensures reliable navigation and 3D mapping in production environments.

Key Features:

  • System-Level Fusion: Instead of fusing raw sensor data, the system integrates the estimation results of FAST-LIO2 and ORB-SLAM3 for better accuracy and consistency.

  • Mutual Assistance Strategy: FAST-LIO2 assists ORB-SLAM3 when LiDAR data is reliable, while ORB-SLAM3 supports FAST-LIO2 in degenerate LiDAR environments.

  • Data Association Optimization: ORB-SLAM3’s loop closing is leveraged for long-term data correction, complementing FAST-LIO2’s short-term accuracy.
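The mutual-assistance strategy above can be sketched as a simple switching rule at the fusion layer. All names here (`Estimate`, `fuse_states`, the `healthy` flag) are illustrative assumptions, not the thesis code — they only show the system-level idea of exchanging subsystem outputs rather than raw sensor data.

```python
from dataclasses import dataclass

@dataclass
class Estimate:
    pose: tuple     # (x, y, z, roll, pitch, yaw) in the world frame
    healthy: bool   # e.g. LiDAR not degenerate / visual tracking not lost

def fuse_states(lio: Estimate, vio: Estimate) -> tuple:
    """System-level fusion: select which subsystem's estimate to trust."""
    if lio.healthy and vio.healthy:
        # Both fine: prefer the locally precise LiDAR-inertial pose;
        # it could also be fed to the visual subsystem as a motion prior.
        return lio.pose
    if lio.healthy:
        return lio.pose   # visual tracking lost: re-seed it from LIO
    if vio.healthy:
        return vio.pose   # LiDAR degenerate: bridge the gap with VIO
    raise RuntimeError("both subsystems unhealthy")
```

For example, when the LiDAR estimate reports degeneracy in a corridor, `fuse_states` falls through to the visual-inertial pose, keeping the state estimate alive until LiDAR constraints return.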

imags.jpg

System overview of the developed system.

Final Result

The developed SLAM system provides precise and consistent mapping results by combining the strengths of ORB-SLAM3 and FAST-LIO2, even in degenerate environments such as a long corridor, as shown in the picture below.

GoodMapping.png

Precise and consistent mapping result of the developed system.

Thanks to the loop closing feature of ORB-SLAM3, the developed system can also perform loop closure inside the LiDAR scan map, which is essential for autonomous exploration and digital twin creation.

loopclosinggood.png

After loop closure, the LiDAR subsystem tracks within the old LiDAR map.

newloop.png

Loop closure after a very long journey traversing various environments.
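The map-correction step behind these results can be sketched as follows. This is an illustrative assumption, not the thesis implementation: when the visual subsystem closes a loop, it yields a corrective transform for the current pose, and left-multiplying that transform onto the recent LiDAR keyframe poses re-aligns the dense scan map with the old map so the LiDAR subsystem can continue tracking in it.

```python
import numpy as np

def apply_loop_correction(keyframe_poses, T_corr, start_idx):
    """Re-align LiDAR keyframes after a visual loop closure.

    keyframe_poses: list of 4x4 SE(3) world poses of LiDAR keyframes.
    T_corr: 4x4 corrective transform from the loop closure.
    start_idx: first keyframe affected by the accumulated drift.
    """
    corrected = list(keyframe_poses)
    for i in range(start_idx, len(corrected)):
        # Left-multiplication applies the fix in the world frame,
        # shifting the drifted keyframes back onto the old map.
        corrected[i] = T_corr @ corrected[i]
    return corrected

# Toy example: three keyframes at the origin; the loop closure says the
# recent ones (from index 1 on) drifted 0.5 m along -x, so shift them +0.5 m.
poses = [np.eye(4) for _ in range(3)]
T_corr = np.eye(4)
T_corr[0, 3] = 0.5
out = apply_loop_correction(poses, T_corr, 1)
```

In practice such a correction would be distributed over a pose graph rather than applied as a single rigid shift, but the sketch shows why a loop detected by the camera can repair the LiDAR map.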

Thesis

Final Grade

1.3 (German scale, where 1.0 is best; equivalent to 96/100)

Contact Information

Hannover, Germany

+0049 017648990630

  • LinkedIn


©2023 by Yumao Liu.
