Lasheras Hernandez, Blanca; Strobl, Klaus H.; Izquierdo, Sergio; Bodenmüller, Tim; Triebel, Rudolph; Civera, Javier (2025) Single-Shot Metric Depth from Focused Plenoptic Cameras. In: 2025 IEEE International Conference on Robotics and Automation, ICRA 2025, pp. 9566-9573. IEEE. 2025 IEEE International Conference on Robotics and Automation, 2025-05-19 - 2025-05-23, Atlanta, GA, USA. doi: 10.1109/ICRA55743.2025.11128276. ISBN 979-833154139-2. ISSN 1050-4729.
PDF, 3 MB
Official URL: https://ieeexplore.ieee.org/document/11128276
Abstract
Metric depth estimation from visual sensors is crucial for robots to perceive, navigate, and interact with their environment. Traditional range imaging setups, such as stereo or structured light cameras, face hassles including calibration, occlusions, and hardware demands, with accuracy limited by the baseline between cameras. Single- and multi-view monocular depth offers a more compact alternative, but is constrained by the unobservability of the metric scale. Light field imaging provides a promising solution for estimating metric depth by using a unique lens configuration through a single device. However, its application to single-view dense metric depth is under-addressed mainly due to the technology's high cost, the lack of public benchmarks, and proprietary geometrical models and software. Our work explores the potential of focused plenoptic cameras for dense metric depth. We propose a novel pipeline that predicts metric depth from a single plenoptic camera shot by first generating a sparse metric point cloud using a neural network, which is then used to scale and align a dense relative depth map regressed by a foundation depth model, resulting in a dense metric depth. To validate it, we curated the Light Field & Stereo Image Dataset (LFS) of real-world light field images with stereo depth labels (dataset available at https://zenodo.org/records/14224205), filling a current gap in existing resources. Experimental results show that our pipeline produces accurate metric depth predictions, laying a solid groundwork for future research in this field. Work partially supported by the DLR Impulse Project SaiNSOR.
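The scale-and-align step described in the abstract — fitting a dense relative depth map to sparse metric samples — can be sketched as a least-squares fit of a global scale and shift. This is a minimal illustrative sketch, not the authors' released code; the function name, the use of NumPy, and the simple global affine model are assumptions for illustration.

```python
import numpy as np

def align_relative_to_metric(rel_depth, sparse_metric, mask):
    """Fit a global scale s and shift t so that s * rel_depth + t best
    matches the sparse metric samples (ordinary least squares), then
    apply the fitted transform to the whole dense relative map.

    rel_depth:     (H, W) dense relative depth from a foundation model
    sparse_metric: (H, W) metric depth values, valid only where mask is True
    mask:          (H, W) boolean mask of sparse metric samples
    """
    r = rel_depth[mask]
    m = sparse_metric[mask]
    # Design matrix [r, 1] for the affine model m ≈ s*r + t
    A = np.stack([r, np.ones_like(r)], axis=1)
    (s, t), *_ = np.linalg.lstsq(A, m, rcond=None)
    return s * rel_depth + t

# Synthetic check: a relative map whose true metric counterpart is 3*r + 0.5,
# observed only at a sparse grid of points.
rel = np.linspace(0.1, 1.0, 100).reshape(10, 10)
metric_gt = 3.0 * rel + 0.5
mask = np.zeros((10, 10), dtype=bool)
mask[::3, ::3] = True
dense_metric = align_relative_to_metric(rel, metric_gt, mask)
```

In practice a robust variant (e.g. RANSAC or a trimmed fit) would be preferable, since sparse point clouds and relative depth maps both contain outliers; the global affine model above is the simplest common choice.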
| elib URL of the entry: | https://elib.dlr.de/217313/ |
|---|---|
| Document type: | Conference contribution (talk) |
| Title: | Single-Shot Metric Depth from Focused Plenoptic Cameras |
| Authors: | Lasheras Hernandez, Blanca; Strobl, Klaus H.; Izquierdo, Sergio; Bodenmüller, Tim; Triebel, Rudolph; Civera, Javier |
| Date: | 2 September 2025 |
| Published in: | 2025 IEEE International Conference on Robotics and Automation, ICRA 2025 |
| Refereed publication: | Yes |
| Open Access: | Yes |
| Gold Open Access: | No |
| In SCOPUS: | Yes |
| In ISI Web of Science: | No |
| DOI: | 10.1109/ICRA55743.2025.11128276 |
| Page range: | pp. 9566-9573 |
| Publisher: | IEEE |
| ISSN: | 1050-4729 |
| ISBN: | 979-833154139-2 |
| Status: | published |
| Keywords: | plenoptic cameras; light-field cameras; monocular depth |
| Event title: | 2025 IEEE International Conference on Robotics and Automation |
| Event location: | Atlanta, GA, USA |
| Event type: | international conference |
| Event start: | 19 May 2025 |
| Event end: | 23 May 2025 |
| Organizer: | IEEE |
| HGF research field: | Aeronautics, Space and Transport |
| HGF program: | Space |
| HGF program topic: | Robotics |
| DLR focus area: | Space |
| DLR research area: | R RO - Robotics |
| DLR subarea (project): | R - Impulse Project SaiNSOR [RO], R - Multisensory World Modelling (RM) [RO] |
| Location: | Oberpfaffenhofen |
| Institutes & facilities: | Institute of Robotics and Mechatronics (from 2013) > Perception and Cognition; Institute of Robotics and Mechatronics (from 2013) |
| Deposited by: | Strobl, Dr.-Ing. Klaus H. |
| Deposited on: | 20 Oct 2025 06:49 |
| Last modified: | 20 Oct 2025 06:49 |