
Combining Shape Completion and Grasp Prediction for Fast and Versatile Grasping with a Multi-Fingered Hand

Humt, Matthias and Winkelbauer, Dominik and Hillenbrand, Ulrich and Bäuml, Berthold (2024) Combining Shape Completion and Grasp Prediction for Fast and Versatile Grasping with a Multi-Fingered Hand. In: 2023 IEEE-RAS 22nd International Conference on Humanoid Robots (Humanoids), 2023-12-12 to 2023-12-14, Austin, TX, USA. IEEE. doi: 10.1109/Humanoids57100.2023.10375210. ISBN 979-835030327-8. ISSN 2164-0572.

PDF, 5 MB

Official URL: https://ieeexplore.ieee.org/abstract/document/10375210

Abstract

Grasping objects with limited or no prior knowledge about them is a highly relevant skill in assistive robotics. Still, in this general setting it has remained an open problem, especially when it comes to partial observability and versatile grasping with multi-fingered hands. We present a novel, fast, and high-fidelity deep learning pipeline consisting of a shape completion module operating on a single depth image, followed by a grasp predictor operating on the predicted object shape. The shape completion network is based on VQDIF and predicts spatial occupancy values at arbitrary query points. As grasp predictor, we use our two-stage architecture that first generates hand poses with an autoregressive model and then regresses finger joint configurations per pose. Critical factors turn out to be sufficient data realism and augmentation, as well as special attention to difficult cases during training. Experiments on a physical robot platform demonstrate successful grasping of a wide range of household objects from a depth image taken from a single viewpoint. The whole pipeline is fast, taking only about 1 s to complete the object's shape (0.7 s) and generate 1000 grasps (0.3 s). Project page: https://dlr-alr.github.io/2023-humanoids-completion
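The two-stage interface described in the abstract (a shape completion network queried for occupancy at arbitrary points, followed by a grasp predictor that first generates hand poses and then joint configurations per pose) can be sketched in a few lines. This is a hypothetical illustration only: the functions below are placeholder stand-ins that mimic the interfaces, not the authors' actual VQDIF-based or autoregressive networks, and all names and shapes are assumptions.

```python
import numpy as np

def predict_occupancy(query_points, center=np.zeros(3), radius=0.05):
    """Stand-in shape completion: occupancy of a sphere.
    A VQDIF-style network would instead map a depth-image latent plus
    a query point to an occupancy probability."""
    d = np.linalg.norm(query_points - center, axis=-1)
    return (d < radius).astype(np.float32)

def generate_grasps(occupied_points, n_grasps=1000, n_joints=12, rng=None):
    """Stand-in two-stage grasp predictor: sample hand poses near the
    completed shape (stage 1), then produce per-pose finger joint
    configurations (stage 2). Here both stages are random placeholders."""
    rng = rng or np.random.default_rng(0)
    idx = rng.integers(0, len(occupied_points), size=n_grasps)
    positions = occupied_points[idx] + rng.normal(0.0, 0.01, (n_grasps, 3))
    orientations = rng.normal(size=(n_grasps, 4))
    orientations /= np.linalg.norm(orientations, axis=1, keepdims=True)
    joints = rng.uniform(-1.0, 1.0, size=(n_grasps, n_joints))
    return positions, orientations, joints

# Query a coarse 3D grid, keep occupied points, then sample 1000 grasps.
axes = [np.linspace(-0.1, 0.1, 20)] * 3
grid = np.stack(np.meshgrid(*axes), axis=-1).reshape(-1, 3)
occ = predict_occupancy(grid)
surface = grid[occ > 0.5]
pos, quat, joints = generate_grasps(surface)
print(pos.shape, quat.shape, joints.shape)
```

The point of the sketch is the decoupling: because occupancy can be queried at arbitrary points, the grasp predictor never needs the raw depth image, only the completed shape.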

elib URL of this entry: https://elib.dlr.de/202263/
Document type: Conference contribution (Poster)
Title: Combining Shape Completion and Grasp Prediction for Fast and Versatile Grasping with a Multi-Fingered Hand
Authors (name | institution or e-mail address | ORCID iD | ORCID Put Code):
Humt, Matthias | Matthias.Humt (at) dlr.de | https://orcid.org/0000-0002-1523-9335 | not specified
Winkelbauer, Dominik | Dominik.Winkelbauer (at) dlr.de | https://orcid.org/0000-0001-7443-1071 | not specified
Hillenbrand, Ulrich | Ulrich.Hillenbrand (at) dlr.de | not specified | not specified
Bäuml, Berthold | Berthold.Baeuml (at) dlr.de | https://orcid.org/0000-0002-4545-4765 | not specified
Date: 1 January 2024
Published in: 22nd IEEE-RAS International Conference on Humanoid Robots, Humanoids 2023
Refereed publication: Yes
Open Access: Yes
Gold Open Access: No
In SCOPUS: Yes
In ISI Web of Science: No
DOI: 10.1109/Humanoids57100.2023.10375210
Publisher: IEEE
ISSN: 2164-0572
ISBN: 979-835030327-8
Status: published
Keywords: Shape Completion, Robotics, Grasping, Computer Vision, Deep Learning, Artificial Intelligence
Event title: 2023 IEEE-RAS 22nd International Conference on Humanoid Robots (Humanoids)
Event venue: Austin, TX, USA
Event type: international conference
Event start date: 12 December 2023
Event end date: 14 December 2023
Organizer: IEEE/RSJ
HGF Research Field: Aeronautics, Space and Transport
HGF Program: Space
HGF Program Topic: Robotics
DLR Focus Area: Space
DLR Research Area: R RO - Robotics
DLR Subtopic (project): R - Autonomy & Dexterity [RO]
Location: Oberpfaffenhofen
Institutes & Facilities: Institute of Robotics and Mechatronics (from 2013) > Perception and Cognition
Deposited by: Humt, Matthias
Deposited on: 02 Feb 2024 14:43
Last modified: 24 Apr 2024 21:02

