
Sensor Fusion for Visual-Inertial Simultaneous Localisation and Mapping

Bekkers, Sam (2024) Sensor Fusion for Visual-Inertial Simultaneous Localisation and Mapping. Master's thesis, Delft University of Technology.

PDF, 11 MB

Abstract

Generating a 3D map of an unseen environment by solving the SLAM problem is currently a popular topic in robotics. The Lunar Rover Mini (LRM) at the German Aerospace Center solves this problem with an RGB-D camera system, which is favourable in space applications due to its light weight and energy efficiency. SLAM from camera images relies on visual odometry: estimating the rover's trajectory through a sequence of images. However, depending on a single sensor for mapping and navigation threatens the reliability of the system. To increase the reliability and robustness of the SLAM algorithm, an inertial measurement unit (IMU) is incorporated into the robot hardware. This thesis describes the design of a visual-inertial SLAM algorithm that combines visual and inertial measurements through tightly coupled sensor fusion, which also estimates and corrects for IMU biases. The solution is based on a non-linear factor graph, a graphical model representing the relationships between the rover's measurements and the unknown variables that are optimised for, implemented using the open-source GTSAM framework. Using experimental data, the robustness of the novel visual-inertial SLAM algorithm is demonstrated by simulating specific sensor failures. Moreover, the algorithm can attach a degree of certainty to specific areas of the generated map, closely resembling how a human being would map an unknown area. An additional benefit of tightly coupled sensor fusion is increased accuracy of the estimated trajectory: assuming Gaussian noise models for both sensors, averaging the two measurements can yield a higher accuracy than either sensor could achieve by itself. This hypothesis was tested in a further experiment.
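The accuracy claim above follows from a standard property of Gaussian estimation: fusing two independent measurements of the same quantity by inverse-variance weighting yields a variance lower than either input. A minimal sketch (not code from the thesis; the numbers are illustrative):

```python
def fuse_gaussian(mu1, var1, mu2, var2):
    """Inverse-variance weighted fusion of two scalar Gaussian estimates."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    mu = (w1 * mu1 + w2 * mu2) / (w1 + w2)   # weighted mean
    var = 1.0 / (w1 + w2)                    # fused variance
    return mu, var

# Illustrative values: a visual estimate (variance 0.04) and an
# inertial estimate (variance 0.09) of the same displacement.
mu, var = fuse_gaussian(1.02, 0.04, 0.95, 0.09)
print(var)  # smaller than both 0.04 and 0.09
```

The fused variance 1/(1/var1 + 1/var2) is always below min(var1, var2), which is the basis of the hypothesis tested in the experiment.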
Since the main mechanism behind bias estimation is reducing the error between visual and inertial measurements, the bias estimate is quickly affected by drifting visual odometry, which in turn degrades the accuracy of the visual-inertial odometry module. This observation shows that the estimated bias is not tied to the underlying physical process, but is merely a numerical value in the optimisation that reduces the residual error. It raises the question of whether this strategy of tightly coupled sensor fusion can actually be used to increase the accuracy of a visual odometry algorithm.
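The effect described above can be seen in a toy model (hypothetical, not the thesis code): if a constant bias b is estimated by least squares from the residuals between inertial increments a_i and visual increments v_i, the optimum is b* = mean(a_i - v_i), so any systematic drift in the visual odometry is absorbed into the bias estimate even when the true bias is zero:

```python
def estimate_bias(imu_increments, visual_increments):
    """Least-squares bias minimising sum((a_i - b - v_i)^2),
    i.e. the mean residual between the two odometry sources."""
    residuals = [a - v for a, v in zip(imu_increments, visual_increments)]
    return sum(residuals) / len(residuals)

true_motion = [1.0] * 5
imu = true_motion                         # bias-free IMU, for illustration
visual = [m - 0.1 for m in true_motion]   # visual odometry drifting by -0.1

print(estimate_bias(imu, visual))  # close to 0.1: drift masquerades as bias
```

The estimator cannot distinguish a genuine IMU bias from visual-odometry drift, which is exactly why the estimated bias decouples from the physical process once the visual odometry drifts.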

elib URL of this entry: https://elib.dlr.de/212023/
Document type: University thesis (Master's thesis)
Title: Sensor Fusion for Visual-Inertial Simultaneous Localisation and Mapping
Authors:
Author | Institution or e-mail address | Author ORCID iD | ORCID Put Code
Bekkers, Sam | DLR | NOT SPECIFIED | NOT SPECIFIED
Date: 2024
Open Access: Yes
Number of pages: 68
Status: published
Keywords: mapping, 3D, SLAM, LRM, Lunar Rover Mini, lightweight, sensor fusion
Institution: Delft University of Technology
Department: Faculty of Mechanical, Maritime and Materials Engineering (3mE)
HGF research field: Aeronautics, Space and Transport
HGF programme: Space
HGF programme theme: Robotics
DLR focus area: Space
DLR research area: R RO - Robotics
DLR subfield (project): R - Autonomous, learning robots [RO]
Location: Oberpfaffenhofen
Institutes & facilities: Institute of Robotics and Mechatronics (from 2013)
Deposited by: Geyer, Günther
Deposited on: 21 Jan 2025 10:08
Last modified: 21 Jan 2025 10:08

