
Collaborative Localization and Mapping for Autonomous Planetary Exploration: Distributed Stereo Vision-Based 6D SLAM in GNSS-Denied Environments

Schuster, Martin J. (2019) Collaborative Localization and Mapping for Autonomous Planetary Exploration: Distributed Stereo Vision-Based 6D SLAM in GNSS-Denied Environments. Dissertation, University of Bremen.

Full text: PDF, 32 MB

Official URL: http://nbn-resolving.de/urn:nbn:de:gbv:46-00107650-19

Abstract

Mobile robots are a crucial element of present and future scientific missions to explore the surfaces of foreign celestial bodies such as the Moon and Mars. The deployment of teams of robots makes it possible to improve efficiency and robustness in such challenging environments. As long communication round-trip times to Earth render the teleoperation of robotic systems inefficient or even impossible, on-board autonomy is a key to success. The robots operate in Global Navigation Satellite System (GNSS)-denied environments and thus have to rely on space-suitable on-board sensors such as stereo camera systems. They need to localize themselves online, model their surroundings, and share information about the environment and their position therein. These capabilities constitute the basis for the local autonomy of each system as well as for any coordinated joint action within the team, such as collaborative autonomous exploration.

In this thesis, we present a novel approach for stereo vision-based on-board and online Simultaneous Localization and Mapping (SLAM) for multi-robot teams given the challenges imposed by planetary exploration missions. We combine distributed local and decentralized global estimation methods to get the best of both worlds: a local reference filter on each robot provides real-time local state estimates required for robot control and fast reactive behaviors. We designed a novel graph topology to incorporate these state estimates into an online incremental graph optimization that computes global pose and map estimates, which serve as input to higher-level autonomy functions.

In order to model the 3D geometry of the environment, we generate dense 3D point cloud and probabilistic voxel-grid maps from noisy stereo data. We distribute the computational load and reduce the required communication bandwidth between robots by locally aggregating high-bandwidth vision data into partial maps that are then exchanged between robots and composed into global models of the environment. We developed methods for intra- and inter-robot map matching to recognize previously visited locations in semi- and unstructured environments based on their estimated local geometry, which is largely invariant to lighting conditions as well as to different sensors and viewpoints in heterogeneous multi-robot teams. A decoupling of observable and unobservable states in the local filter allows us to introduce a novel optimization: by enforcing all submaps to be gravity-aligned, we can reduce the dimensionality of the map matching from 6D to 4D. In addition to map matches, the robots use visual fiducial markers to detect each other. In this context, we present a novel method for modeling the errors of the loop closure transformations that are estimated from these detections.

We demonstrate the robustness of our methods by integrating them on a total of five different ground-based and aerial mobile robots that were deployed in 31 real-world experiments for quantitative evaluations in semi- and unstructured indoor and outdoor settings. In addition, we validated our SLAM framework through several different demonstrations at four public events in Moon- and Mars-like environments. These include, among others, autonomous multi-robot exploration tests at a Moon-analogue site on top of the volcano Mt. Etna, Italy, as well as the collaborative mapping of a Mars-like environment with a heterogeneous robotic team of flying and driving robots in more than 35 public demonstration runs.
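The reduction of map matching from 6D to 4D mentioned in the abstract can be illustrated with a minimal sketch: once two submaps are gravity-aligned, their relative pose has only four free parameters (x, y, z, yaw), so a closed-form least-squares fit over matched keypoints is sufficient. The sketch below is written in Python with NumPy purely for illustration; the function name fit_yaw_translation, the correspondence-based formulation, and the synthetic example are assumptions and are not taken from the thesis itself.

import numpy as np

def fit_yaw_translation(src, dst):
    """Least-squares 4-DoF alignment (x, y, z, yaw) of matched 3D points.

    src, dst: (N, 3) arrays of corresponding keypoints from two
    gravity-aligned submaps. Because roll and pitch are already fixed
    by the gravity direction, only a rotation about the z axis and a
    3D translation remain to be estimated.
    """
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    p, q = src - src_mean, dst - dst_mean
    # The optimal yaw maximizes cos(yaw)*A + sin(yaw)*B over the
    # horizontal point components, giving a closed-form atan2 solution.
    A = np.sum(p[:, 0] * q[:, 0] + p[:, 1] * q[:, 1])
    B = np.sum(p[:, 0] * q[:, 1] - p[:, 1] * q[:, 0])
    yaw = np.arctan2(B, A)
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    t = dst_mean - R @ src_mean
    return R, t, yaw

# Example: recover a known yaw and offset from synthetic correspondences.
rng = np.random.default_rng(0)
src = rng.uniform(-5.0, 5.0, size=(50, 3))
yaw_true = np.deg2rad(30.0)
c, s = np.cos(yaw_true), np.sin(yaw_true)
R_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
dst = src @ R_true.T + np.array([1.0, -2.0, 0.5])
R_est, t_est, yaw_est = fit_yaw_translation(src, dst)
assert np.isclose(yaw_est, yaw_true)

The closed-form yaw estimate only works because roll and pitch are pinned by the gravity estimate of the local filter; for a full 6D match, an iterative registration over all three rotation axes would be required.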

Item URL in elib: https://elib.dlr.de/132830/
Document Type: Thesis (Dissertation)
Title: Collaborative Localization and Mapping for Autonomous Planetary Exploration: Distributed Stereo Vision-Based 6D SLAM in GNSS-Denied Environments
Authors: Schuster, Martin J. (ORCID iD: https://orcid.org/0000-0002-6983-3719)
Date: 21 August 2019
Refereed publication: Yes
Open Access: Yes
Number of Pages: 254
Status: Published
Keywords: SLAM, collaborative SLAM, distributed SLAM, localization, mapping, stereo-vision, multi-robot, planetary exploration, autonomous robots
Institution: University of Bremen
Department: Institute for Artificial Intelligence
HGF - Research field: Aeronautics, Space and Transport
HGF - Program: Space
HGF - Program Themes: Space System Technology
DLR - Research area: Space (Raumfahrt)
DLR - Program: R SY - Space System Technology
DLR - Research theme (Project): R - Project MOREX [SY]
Location: Oberpfaffenhofen
Institutes and Institutions: Institute of Robotics and Mechatronics (since 2013) > Perception and Cognition
Deposited By: Schuster, Martin
Deposited On: 07 Jan 2020 09:08
Last Modified: 07 Jan 2020 09:08

