
Single-View Depth from Focused Plenoptic Cameras

Lasheras Hernández, Blanca (2024) Single-View Depth from Focused Plenoptic Cameras. Master's thesis, Universidad de Zaragoza.

PDF - DLR-internal access only (40 MB)

Abstract

In recent years, research progress in computer vision has boosted the capability of machines to interpret visual data, expanding the complexity and range of tasks that robots can perform in fields such as autonomous driving, medicine, and industrial automation. A principal facet of computer vision is depth estimation, which is crucial for enabling robots to perceive, navigate, and interact with their environment effectively and safely. Traditional setups, such as stereo or multi-camera rigs, face challenges including calibration intricacies and computational and hardware complexity; moreover, their accuracy is limited by the baseline between the cameras. Monocular depth estimation, which uses a single camera, offers a more compact alternative but is limited by the unobservability of scale. Light field imaging technologies represent a promising solution to these issues: they capture both the intensity and the direction of light rays, not only through the main lens but also through a large number of microlenses placed inside the camera. By these means, depth in front of the camera can be measured owing to depth-dependent refraction at the main lens.

Despite their potential, few studies have explored the application of these cameras to single-view dense depth estimation. This scarcity can be attributed to several factors. The technology remains relatively costly and inaccessible, which hinders widespread adoption and leads to a lack of datasets suitable for training deep neural networks. As a consequence, few projects have used light field imaging for depth estimation, and existing efforts often rely on outdated iterations of the technology. Furthermore, the lack of an open-source geometric model impedes the development of model-based estimation.

This thesis explores the potential of focused plenoptic cameras for single-view depth estimation using learning-based methods.
The proposed approach integrates image processing, deep learning, and scale alignment based on foundation models and robust statistics to generate dense metric depth maps. To support this approach, a novel real-world dataset of light field images with stereo depth labels was created, addressing a gap in existing resources. Experimental results demonstrate that the developed pipeline reliably produces accurate metric depth predictions, laying a foundation for further research in this domain.
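The scale-alignment step described in the abstract can be illustrated with a minimal sketch: a relative (up-to-scale) depth prediction, such as one from a depth foundation model, is aligned to sparse metric labels using a robust statistic. The function name, the median-of-ratios estimator, and all variable names below are illustrative assumptions, not details taken from the thesis.

```python
import numpy as np

def align_scale(relative_depth, metric_depth, valid_mask):
    """Align an up-to-scale depth prediction to sparse metric labels.

    relative_depth : (H, W) prediction, correct only up to a scale factor
    metric_depth   : (H, W) sparse metric depth (e.g. from stereo labels)
    valid_mask     : (H, W) bool, True where a metric label exists
    Returns the metrically scaled depth map and the estimated scale.
    """
    # Per-pixel ratio between metric and predicted depth at labelled pixels.
    ratios = metric_depth[valid_mask] / relative_depth[valid_mask]
    # The median is robust to outliers (corrupted labels, prediction errors),
    # unlike a least-squares fit, which a single bad label can skew.
    scale = np.median(ratios)
    return scale * relative_depth, scale

# Toy example: ground truth is 2x the prediction, with one corrupted label.
pred = np.array([[1.0, 2.0], [3.0, 4.0]])
gt = 2.0 * pred
gt[0, 0] = 50.0                          # outlier label
mask = np.ones_like(pred, dtype=bool)
aligned, s = align_scale(pred, gt, mask)
print(s)  # ratios are [50, 2, 2, 2]; the median recovers 2.0
```

A least-squares scale estimate on the same data would be pulled far above 2.0 by the single outlier, which is why robust statistics are a natural choice when the metric labels come from real stereo depth.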

elib URL of this entry: https://elib.dlr.de/205263/
Document type: Thesis (Master's thesis)
Title: Single-View Depth from Focused Plenoptic Cameras
Authors: Lasheras Hernández, Blanca (Universidad de Zaragoza; ORCID iD not specified)
Date: 2024
Open Access: No
Number of pages: 78
Status: published
Keywords: plenoptic cameras; depth estimation
Institution: Universidad de Zaragoza
Department: Escuela de Ingeniería y Arquitectura
HGF research field: Aeronautics, Space and Transport
HGF programme: Space
HGF programme topic: Robotics
DLR focus area: Space
DLR research area: R RO - Robotics
DLR subtopic (project): R - Impulsprojekt SaiNSOR [RO], R - Multisensorielle Weltmodellierung (RM) [RO]
Location: Oberpfaffenhofen
Institutes & facilities: Institute of Robotics and Mechatronics (since 2013) > Perception and Cognition
Deposited by: Strobl, Dr. Klaus H.
Deposited on: 15 Jul 2024 09:17
Last modified: 15 Jul 2024 09:17

