elib

Multi-Sensor and Multi-Modal Localization in Indoor Environments on Robotic Platforms

Sewtz, Marco (2025) Multi-Sensor and Multi-Modal Localization in Indoor Environments on Robotic Platforms. Dissertation, Karlsruher Institut für Technologie (KIT).

Full text: PDF (4 MB)

Abstract

This dissertation presents a detailed exploration of multi-sensor visual odometry systems, with a focus on enhancing robustness in indoor environments. The core contribution lies in the development of an advanced ego-state estimation framework, wherein visual odometry is bolstered by multiple visual sensors to overcome common challenges such as occlusions and textureless surfaces. By addressing frequent loss-of-tracking (LoT) events, the system ensures continuous, reliable localization in complex, cluttered indoor settings, such as households and elderly care facilities.

To further augment situational awareness, sound source localization (SSL) is integrated as a complementary modality. Its fusion with visual data significantly enhances the robot's perception of the environment, enabling the detection and identification of objects and events that may be visually occluded or otherwise undetectable. This multi-modal fusion provides a more holistic understanding of the robot's surroundings, contributing to improved operational reliability in dynamic, human-centered environments.

A key feature of this research is the introduction of IndoorMCD, a novel multi-sensor benchmark specifically designed to evaluate localization performance in indoor environments. Additionally, this work introduces URSim, an online real-time visual simulation framework that enables rigorous testing of multi-sensor localization systems under various conditions. Extensive experimental validation, using both real-world scenarios and simulated environments, demonstrates the robustness and fault tolerance of the proposed system. This research advances the state of the art in robotic perception and indoor localization by providing a multi-modal, fault-tolerant approach to localization, offering valuable contributions to both theoretical understanding and practical application in robotics.

elib URL of this record: https://elib.dlr.de/215664/
Document type: Thesis (Dissertation)
Title: Multi-Sensor and Multi-Modal Localization in Indoor Environments on Robotic Platforms
Authors:
  Author: Sewtz, Marco
  Institution or e-mail address: Marco.Sewtz (at) dlr.de
  Author ORCID iD: https://orcid.org/0000-0003-1662-534X
  ORCID Put Code: NOT SPECIFIED
Date: 12 February 2025
Open Access: Yes
Status: published
Keywords: Robotics, SLAM, VO, Visual Odometry, Perception, Audio, Proprioception, Audio Localization, Localization
Institution: Karlsruher Institut für Technologie (KIT)
Department: KIT Department of Informatics
HGF research field: Aeronautics, Space and Transport
HGF programme: Space
HGF programme topic: Robotics
DLR focus area: Space
DLR research area: R RO - Robotics
DLR sub-area (project): R - Multisensory World Modelling (RM) [RO]
Location: Oberpfaffenhofen
Institutes & facilities: Institut für Robotik und Mechatronik (ab 2013) > Perzeption und Kognition
Deposited by: Sewtz, Marco
Deposited on: 01 Aug 2025 14:01
Last modified: 01 Aug 2025 14:01

