Stelzer, Annett (2016) Approaches to efficient visual homing of mobile robots in rough terrain. Dissertation, University of Freiburg.
PDF (11MB)
Official URL: https://freidok.uni-freiburg.de/data/11500
Abstract
Mobile robots, such as vacuum cleaning robots and robotic lawn mowers, have become part of our daily lives. While they work fully autonomously in well-defined environments, the demand for mobile robots in unstructured and unforeseeable areas, such as search-and-rescue scenarios or planetary exploration, is growing. These robots are not required to have full autonomy, but should rather be tools which support researchers or rescue workers by providing information about a remote environment. For this, they should offer basic autonomy functions, for example obstacle avoidance or autonomous returning. Robots for such tasks often have to be small and agile, which prevents them from carrying heavy sensors and batteries, and thus also limits their computational resources. Therefore, the implemented autonomous capabilities have to work efficiently.

This thesis focuses on the task of robot homing, which is the ability of a robot to return to its starting position after moving away. Since the method should work in unknown, unstructured terrain, it is divided into a local navigation task, which aims at detecting and avoiding obstacles, and a global navigation task, which uses only bearing angles to landmarks to memorize and retrace a path. The method is applicable to ground-based robots equipped with an inertial measurement unit (IMU), a stereo camera, an omnidirectional camera and odometry sensors.

Local obstacle avoidance is accomplished by creating a moving geometric grid map of the immediate surroundings of the robot. For this, the robot computes disparity images from the stereo image pairs and combines them into a dense grid map using the robust and accurate pose estimates obtained by fusing IMU data, visual odometry measurements and robot odometry data. From that, the robot estimates the traversability of the terrain and computes a cost map, which it uses to plan safe paths in a given direction.

In contrast, no metric distance information is required for the global path learning and homing task. Instead, the robot only records landmark bearing angle configurations at certain locations along its path. The landmark observations are stored hierarchically by their degree of translation invariance in an efficient and scalable, novel data structure called Trail-Map (Translation Invariance Level Map). For retracing this path back to the home position, the robot computes homing vectors by comparing the current landmark configuration with the stored reference configuration.

By combining the local and global navigation approaches, a visual homing method for unstructured terrain is achieved that has very low memory requirements and whose runtime is constant with respect to the length of the traversed path.
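As an illustration of the bearing-only homing principle described in the abstract, the sketch below implements the classic average landmark vector (ALV) heuristic rather than the Trail-Map method developed in the thesis: a homing direction is derived purely from landmark bearing angles by comparing the current landmark configuration with a stored reference configuration. All names and bearing values are hypothetical, and a common orientation reference for both configurations (e.g. from the IMU or a compass) is assumed.

```python
import math

def average_landmark_vector(bearings):
    """Mean of the unit vectors pointing toward each landmark bearing (radians)."""
    n = len(bearings)
    x = sum(math.cos(b) for b in bearings) / n
    y = sum(math.sin(b) for b in bearings) / n
    return x, y

def homing_vector(current_bearings, reference_bearings):
    """Bearing-only estimate of the direction from the current location toward
    the reference (home) location. Both lists must refer to the same matched
    landmarks and share a common orientation reference."""
    cx, cy = average_landmark_vector(current_bearings)
    rx, ry = average_landmark_vector(reference_bearings)
    # In the ALV heuristic, the difference between the current and the
    # reference average landmark vector points approximately toward home.
    return cx - rx, cy - ry

# Hypothetical example: three landmarks whose bearings (radians) have shifted
# slightly relative to the snapshot stored at the reference location.
ref = [0.0, 2.1, 4.2]
cur = [0.3, 2.0, 4.4]
dx, dy = homing_vector(cur, ref)
print("homing direction: %.2f rad" % math.atan2(dy, dx))
```

Steering along this difference vector makes the current landmark configuration converge toward the reference one. The Trail-Map approach follows the same bearing-only idea but stores the landmark observations hierarchically by their degree of translation invariance, which, according to the abstract, keeps memory requirements low and runtime constant in the path length.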
Item URL in elib: https://elib.dlr.de/112692/
Document Type: Thesis (Dissertation)
Title: Approaches to efficient visual homing of mobile robots in rough terrain
Authors: Stelzer, Annett
Date: 2016
Refereed publication: Yes
Open Access: Yes
Number of Pages: 182
Status: Published
Keywords: navigation; robots; camera
Institution: University of Freiburg
HGF - Research field: Aeronautics, Space and Transport
HGF - Program: Space
HGF - Program Themes: Space System Technology
DLR - Research area: Raumfahrt (Space)
DLR - Program: R SY - Space System Technology
DLR - Research theme (Project): R - Vorhaben Multisensorielle Weltmodellierung (old)
Location: Oberpfaffenhofen
Institutes and Institutions: Institute of Robotics and Mechatronics (since 2013) > Perception and Cognition
Deposited By: Strobl, Dr. Klaus H.
Deposited On: 19 Jun 2017 11:41
Last Modified: 31 Jul 2019 20:10