Stoiber, Manuel and Sundermeyer, Martin and Triebel, Rudolph (2022) Iterative Corresponding Geometry: Fusing Region and Depth for Highly Efficient 3D Tracking of Textureless Objects. In: 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2022, pages 6845-6855. IEEE. 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022-06-18 to 2022-06-24, New Orleans, LA, USA. doi: 10.1109/CVPR52688.2022.00673. ISBN 978-1-6654-6946-3. ISSN 1063-6919.
Official URL: https://ieeexplore.ieee.org/document/9879565
Abstract
Tracking objects in 3D space and predicting their 6DoF pose is an essential task in computer vision. State-of-the-art approaches often rely on object texture to tackle this problem. However, while they achieve impressive results, many objects do not contain sufficient texture, violating the main underlying assumption. In the following, we thus propose ICG, a novel probabilistic tracker that fuses region and depth information and only requires the object geometry. Our method deploys correspondence lines and points to iteratively refine the pose. We also implement robust occlusion handling to improve performance in real-world settings. Experiments on the YCB-Video, OPT, and Choi datasets demonstrate that, even for textured objects, our approach outperforms the current state of the art with respect to accuracy and robustness. At the same time, ICG shows fast convergence and outstanding efficiency, requiring only 1.3 ms per frame on a single CPU core. Finally, we analyze the influence of individual components and discuss our performance compared to deep learning-based methods. The source code of our tracker is publicly available.
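The iterative fusion of region and depth information described in the abstract can be pictured as Newton-style updates on a 6-DoF pose, where each modality contributes a gradient and Hessian derived from its sparse correspondences. The following minimal Python sketch illustrates that idea under assumed interfaces; `refine_pose`, `linearize`, and the toy `PointModality` are hypothetical names introduced for illustration and do not reproduce the authors' published implementation.

```python
import numpy as np

def skew(v):
    """Return the 3x3 skew-symmetric matrix of a 3-vector."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def apply_pose_variation(T, theta):
    """Update a 4x4 pose matrix with a small 6-DoF variation
    theta = (rotation, translation), using a first-order update."""
    dT = np.eye(4)
    dT[:3, :3] += skew(theta[:3])  # small-angle rotation
    dT[:3, 3] = theta[3:]
    return dT @ T

def refine_pose(T, modalities, n_iterations=7, damping=1e-9):
    """Fuse modalities by summing their gradient and Hessian
    contributions, then apply damped Newton updates to the pose."""
    for _ in range(n_iterations):
        g, H = np.zeros(6), np.zeros((6, 6))
        for m in modalities:
            g_m, H_m = m.linearize(T)  # assumed per-modality interface
            g += g_m
            H += H_m
        theta = np.linalg.solve(H + damping * np.eye(6), g)
        T = apply_pose_variation(T, theta)
    return T

class PointModality:
    """Toy stand-in for a depth modality: rewards poses whose
    translation is close to the origin (illustration only)."""
    def linearize(self, T):
        g = np.zeros(6)
        g[3:] = -T[:3, 3]        # gradient of -0.5 * ||t||^2
        H = np.zeros((6, 6))
        H[3:, 3:] = np.eye(3)    # corresponding (negative) Hessian
        return g, H

if __name__ == "__main__":
    T0 = np.eye(4)
    T0[:3, 3] = [0.1, -0.2, 0.3]
    T = refine_pose(T0, [PointModality(), PointModality()])
    print(T[:3, 3])  # translation is pulled toward the origin
```

This is only a structural sketch: in ICG, the region modality derives its contributions from correspondence lines in the color image and the depth modality from correspondence points, both modeled probabilistically, as stated in the abstract.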
| elib URL of the record: | https://elib.dlr.de/189883/ |
|---|---|
| Document type: | Conference contribution (poster) |
| Title: | Iterative Corresponding Geometry: Fusing Region and Depth for Highly Efficient 3D Tracking of Textureless Objects |
| Authors: | Stoiber, Manuel; Sundermeyer, Martin; Triebel, Rudolph |
| Date: | June 2022 |
| Published in: | 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2022 |
| Refereed publication: | Yes |
| Open Access: | Yes |
| Gold Open Access: | No |
| In SCOPUS: | Yes |
| In ISI Web of Science: | Yes |
| DOI: | 10.1109/CVPR52688.2022.00673 |
| Page range: | pages 6845-6855 |
| Publisher: | IEEE |
| ISSN: | 1063-6919 |
| ISBN: | 978-1-6654-6946-3 |
| Status: | published |
| Keywords: | 3D Object Tracking, 6DoF Pose Estimation, Region, Depth, Textureless, Sparse, Real-time, Probabilistic |
| Event title: | 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) |
| Event location: | New Orleans, LA, USA |
| Event type: | international conference |
| Event start: | 18 June 2022 |
| Event end: | 24 June 2022 |
| Organizer: | The Computer Vision Foundation |
| HGF research field: | Aeronautics, Space and Transport |
| HGF program: | Space |
| HGF program topic: | Robotics |
| DLR focus area: | Space |
| DLR research area: | R RO - Robotics |
| DLR subarea (project): | R - Multisensory World Modelling (RM) [RO], R - E3D: Algorithms and Application (RM) [RO] |
| Location: | Oberpfaffenhofen |
| Institutes & facilities: | Institute of Robotics and Mechatronics (from 2013) > Perception and Cognition |
| Deposited by: | Stoiber, Manuel |
| Deposited on: | 09 Nov 2022 11:18 |
| Last modified: | 24 Apr 2024 20:51 |