Lingenauber, Martin and Strobl, Klaus and Oumer, Nassir W. and Kriegel, Simon (2017) Benefits of plenoptic cameras for robot vision during close range on-orbit servicing maneuvers. In: IEEE Aerospace Conference Proceedings, pp. 1-18. Institute of Electrical and Electronics Engineers (IEEE). IEEE Aerospace Conference 2017, 2017-03-04 - 2017-03-11, Big Sky, MT, USA. doi: 10.1109/AERO.2017.7943666.
Official URL: http://ieeexplore.ieee.org/abstract/document/7943666/?reload=true
Abstract
This paper discusses the potential benefits of plenoptic cameras for robot vision during on-orbit servicing missions. Robot vision is essential for the accurate and reliable positioning of a robotic arm with millimeter accuracy during tasks such as grasping, inspection, or repair that are performed in close range to a client satellite. Our discussion of plenoptic camera technology provides an overview of its conceptual advantages for robot vision with regard to the conditions during an on-orbit servicing mission. A plenoptic camera, also known as a light field camera, is essentially a conventional camera system equipped with an additional array of lenslets, the micro lens array, placed a few micrometers in front of the camera sensor. The micro lens array makes it possible to record not only the incidence location of a light ray on the sensor but also its incidence direction, resulting in a 4-D data set known as a light field. From the 4-D light field, regular 2-D intensity images can be derived with a significantly extended depth of field compared to a conventional camera. This yields a set of advantages, such as software-based refocusing, or increased image quality in low-light conditions due to recording with an optimal aperture while maintaining an extended depth of field. Additionally, the parallax between corresponding lenslets makes it possible to derive 3-D depth images from the same light field, so a single plenoptic camera can substitute for a stereo vision system. Given these conceptual advantages, we investigate what can be expected from plenoptic cameras during close-range robotic operations in the course of an on-orbit servicing mission. This includes topics such as image quality, extension of the depth of field, 3-D depth map generation, and low-light capabilities. Our discussion is backed by image sequences for an on-orbit servicing scenario that were recorded in a representative laboratory environment with simulated in-orbit illumination conditions.
We mounted a plenoptic camera on a robot arm and performed an approach trajectory from up to 2 m towards a full-scale satellite mockup. Using these images, we investigated how the light field processing performs, e.g., in terms of depth of field extension, image quality, and depth estimation. We were also able to show the applicability of images derived from light fields for the purpose of vision-based pose estimation of a target point.
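The software-based refocusing mentioned in the abstract can be illustrated with a minimal shift-and-sum sketch. This is not the authors' implementation; the 4-D array layout `(u, v, s, t)` of sub-aperture views and the `slope` parameter are assumptions for this example. Each angular view is shifted in proportion to its angular offset and the views are averaged, so scene points whose disparity matches the chosen slope come into focus while others are blurred:

```python
import numpy as np

def refocus(light_field, slope):
    """Synthetic refocusing of a 4-D light field by shift-and-sum.

    light_field: array of shape (U, V, S, T) holding the sub-aperture
        views, indexed by angular position (u, v) and pixel (s, t).
    slope: disparity in pixels per angular step of the plane to bring
        into focus (an assumed parameterization for this sketch).
    """
    U, V, S, T = light_field.shape
    uc, vc = (U - 1) / 2.0, (V - 1) / 2.0  # central view
    out = np.zeros((S, T))
    for u in range(U):
        for v in range(V):
            # Shift each view proportionally to its angular offset
            # (integer shifts with wraparound, for simplicity).
            ds = int(round(slope * (u - uc)))
            dt = int(round(slope * (v - vc)))
            out += np.roll(light_field[u, v], (ds, dt), axis=(0, 1))
    return out / (U * V)  # average of the aligned views
```

A real pipeline would use sub-pixel interpolation instead of integer `np.roll` shifts, but the principle is the same: the same shift-alignment of corresponding lenslet views that enables refocusing also exposes the parallax used for depth estimation.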
| Field | Value |
|---|---|
| elib URL of this record: | https://elib.dlr.de/112140/ |
| Document type: | Conference contribution (talk) |
| Title: | Benefits of plenoptic cameras for robot vision during close range on-orbit servicing maneuvers |
| Authors: | Lingenauber, Martin; Strobl, Klaus; Oumer, Nassir W.; Kriegel, Simon |
| Date: | March 2017 |
| Published in: | IEEE Aerospace Conference Proceedings |
| Refereed publication: | Yes |
| Open Access: | Yes |
| Gold Open Access: | No |
| In SCOPUS: | Yes |
| In ISI Web of Science: | No |
| DOI: | 10.1109/AERO.2017.7943666 |
| Page range: | pp. 1-18 |
| Publisher: | Institute of Electrical and Electronics Engineers (IEEE) |
| Status: | published |
| Keywords: | plenoptic camera, light field, on-orbit servicing, robot vision |
| Event title: | IEEE Aerospace Conference 2017 |
| Event location: | Big Sky, MT, USA |
| Event type: | international conference |
| Event start: | 4 March 2017 |
| Event end: | 11 March 2017 |
| Organizer: | IEEE |
| HGF research field: | Aeronautics, Space and Transport |
| HGF program: | Space |
| HGF program topic: | Space Systems Technology |
| DLR focus area: | Space |
| DLR research area: | R SY - Space Systems Technology |
| DLR subarea (project): | R - Multisensory World Modelling project (old) |
| Location: | Oberpfaffenhofen |
| Institutes & facilities: | Institute of Robotics and Mechatronics (from 2013) > Perception and Cognition |
| Deposited by: | Lingenauber, Martin |
| Deposited on: | 19 Jun 2017 11:36 |
| Last modified: | 24 Apr 2024 20:16 |