Lingenauber, Martin and Strobl, Klaus and Oumer, Nassir W. and Kriegel, Simon (2017) Benefits of plenoptic cameras for robot vision during close range on-orbit servicing maneuvers. In: IEEE Aerospace Conference Proceedings, pp. 1-18. Institute of Electrical and Electronics Engineers (IEEE). IEEE Aerospace Conference 2017, 4-11 Mar 2017, Big Sky, MT, USA. doi: 10.1109/AERO.2017.7943666.
PDF (5MB)
Official URL: http://ieeexplore.ieee.org/abstract/document/7943666/?reload=true
Abstract
This paper discusses the potential benefits of plenoptic cameras for robot vision during on-orbit servicing missions. Robot vision is essential for the accurate and reliable positioning of a robotic arm with millimeter accuracy during tasks such as grasping, inspection or repair that are performed in close range to a client satellite. Our discussion of plenoptic camera technology provides an overview of its conceptual advantages for robot vision with regard to the conditions during an on-orbit servicing mission. A plenoptic camera, also known as a light field camera, is essentially a conventional camera system equipped with an additional array of lenslets, the micro lens array, placed a few micrometers in front of the camera sensor. The micro lens array makes it possible to record not only the incidence location of a light ray on the sensor but also its incidence direction, resulting in a 4-D data set known as a light field. From the 4-D light field, regular 2-D intensity images can be derived with a significantly extended depth of field compared to a conventional camera. This yields a set of advantages, such as software-based refocusing or increased image quality in low-light conditions, since images can be recorded with an optimal aperture while maintaining an extended depth of field. Additionally, the parallax between corresponding lenslets allows 3-D depth images to be derived from the same light field, so that a single plenoptic camera can substitute for a stereo vision system. Given these conceptual advantages, we investigate what can be expected from plenoptic cameras during close-range robotic operations in the course of an on-orbit servicing mission. This includes topics such as image quality, extension of the depth of field, 3-D depth map generation and low-light capabilities. Our discussion is backed by image sequences for an on-orbit servicing scenario that were recorded in a representative laboratory environment with simulated in-orbit illumination conditions.
We mounted a plenoptic camera on a robot arm and performed an approach trajectory from up to 2 m towards a full-scale satellite mockup. Using these images, we investigated how the light field processing performs, e.g. in terms of depth of field extension, image quality and depth estimation. We were also able to show the applicability of images derived from light fields for vision-based pose estimation of a target point.
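The software-based refocusing described in the abstract is commonly realized by shifting and averaging the sub-aperture views extracted from the 4-D light field, with the shift proportional to each view's offset from the central lenslet. The sketch below illustrates this shift-and-sum principle on a synthetic light field; it is a generic, simplified illustration, not the processing pipeline used in the paper, and the function name `refocus` and the `(U, V, H, W)` view layout are assumptions for the example.

```python
import numpy as np

def refocus(light_field, alpha):
    """Synthetic refocusing by shift-and-sum over sub-aperture views.

    light_field: array of shape (U, V, H, W), i.e. a U x V grid of
                 grayscale sub-aperture views, each H x W pixels.
    alpha: focus parameter -- pixel shift per unit of lens offset.
           Varying alpha moves the synthetic focal plane in depth.
    """
    U, V, H, W = light_field.shape
    cu, cv = (U - 1) / 2.0, (V - 1) / 2.0  # central view index
    out = np.zeros((H, W), dtype=np.float64)
    for u in range(U):
        for v in range(V):
            # Shift each view toward the chosen focal plane
            # (integer shifts via np.roll keep the example simple;
            # real pipelines interpolate sub-pixel shifts).
            dy = int(round(alpha * (u - cu)))
            dx = int(round(alpha * (v - cv)))
            out += np.roll(light_field[u, v], (dy, dx), axis=(0, 1))
    # Averaging aligned views sharpens points on the focal plane
    # and blurs points at other depths.
    return out / (U * V)
```

A scene point at the depth selected by `alpha` lands on the same pixel in every shifted view and is reinforced by the average, while points at other depths are spread out; sweeping `alpha` therefore emulates a focus ring entirely in software.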
Item URL in elib: | https://elib.dlr.de/112140/
---|---
Document Type: | Conference or Workshop Item (Speech)
Title: | Benefits of plenoptic cameras for robot vision during close range on-orbit servicing maneuvers
Authors: | Lingenauber, Martin; Strobl, Klaus; Oumer, Nassir W.; Kriegel, Simon
Date: | March 2017
Journal or Publication Title: | IEEE Aerospace Conference Proceedings
Refereed publication: | Yes
Open Access: | Yes
Gold Open Access: | No
In SCOPUS: | Yes
In ISI Web of Science: | No
DOI: | 10.1109/AERO.2017.7943666
Page Range: | pp. 1-18
Publisher: | Institute of Electrical and Electronics Engineers (IEEE)
Status: | Published
Keywords: | plenoptic camera, light field, on-orbit servicing, robot vision
Event Title: | IEEE Aerospace Conference 2017
Event Location: | Big Sky, MT, USA
Event Type: | International Conference
Event Dates: | 4-11 Mar 2017
Organizer: | IEEE
HGF - Research field: | Aeronautics, Space and Transport
HGF - Program: | Space
HGF - Program Themes: | Space System Technology
DLR - Research area: | Raumfahrt (Space)
DLR - Program: | R SY - Space System Technology
DLR - Research theme (Project): | R - Vorhaben Multisensorielle Weltmodellierung (old)
Location: | Oberpfaffenhofen
Institutes and Institutions: | Institute of Robotics and Mechatronics (since 2013) > Perception and Cognition
Deposited By: | Lingenauber, Martin
Deposited On: | 19 Jun 2017 11:36
Last Modified: | 31 Jul 2019 20:09