Bäuml, Berthold (2019) Bringing a Humanoid Robot Closer to Human Versatility: Hard Realtime Software Architecture and Deep Learning Based Tactile Sensing. Dissertation, Universität Bremen.
This archive cannot provide the full text.
Official URL: https://media.suub.uni-bremen.de/handle/elib/1558
Abstract
For centuries, it has been a vision of mankind to create humanoid robots, i.e., machines that not only resemble the shape of the human body but also have similar capabilities, especially in dextrously manipulating their environment. But only in recent years has it become possible to build actual humanoid robots with many degrees of freedom (DOF) and equipped with torque controlled joints, which are a prerequisite for sensitively acting in the world. In this thesis, we extend DLR's advanced mobile torque controlled humanoid robot Agile Justin in two important directions to get closer to human versatility. First, we enable Agile Justin, which was originally built as a research platform for dextrous mobile manipulation, to also execute complex dynamic manipulation tasks. We demonstrate this with the challenging task of catching up to two simultaneously thrown balls with its hands. Second, we equip Agile Justin with highly developed and deep learning based tactile sensing capabilities that are critical for dextrous fine manipulation. We demonstrate these tactile capabilities with the delicate task of identifying an object's material simply by gently sweeping a fingertip over its surface.

Key for the realization of complex dynamic manipulation tasks is a software framework that allows for a component based system architecture to cope with the complexity and the parallel and distributed computational demands of deep sensor-perception-planning-action loops under tight timing constraints. This thesis presents the communication layer of our aRDx (agile robot development next generation) software framework, which provides hard realtime determinism and optimal transport of data packets with zero-copy for intra- and inter-process communication and copy-once for distributed communication. In the implementation of the challenging ball catching application on Agile Justin, we take full advantage of aRDx's performance and advanced features like channel synchronization. Besides the challenging visual ball tracking, which uses only onboard sensing while everything is moving, and the automatic and self-contained calibration procedure that provides the necessary precision, the major contribution is the unified generation of the reaching motion for the arms. The catch point selection, motion planning and joint interpolation steps are subsumed in one nonlinear constrained optimization problem, which is solved in realtime and allows for the realization of different catch behaviors.

For the highly sensitive task of tactile material classification with a flexible pressure-sensitive skin on Agile Justin's fingertip, we present our deep convolutional network architecture TactNet II. Its input is the raw 16000-dimensional, complex and noisy spatiotemporal tactile signal generated when sweeping over an object's surface. For comparison, we perform a thorough human performance experiment with 15 subjects, which shows that Agile Justin reaches superhuman performance in the high-level material classification task (What material is it?) as well as in the low-level material differentiation task (Are two materials the same?). To increase the sample efficiency of TactNet II, we adapt state-of-the-art deep end-to-end transfer learning to tactile material classification, leading to an up to 15-fold reduction in the number of training samples needed.

The presented methods led to six publication awards and award finalist nominations as well as international media coverage, and they also worked robustly at many trade fairs and lab demos.
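The abstract does not disclose the actual TactNet II architecture, only that it maps a raw 16000-dimensional spatiotemporal tactile sweep to a material class. Purely as an illustration of that kind of end-to-end classifier, the following is a minimal sketch of a 1D convolutional network over such a signal; the taxel/time split of the 16000 inputs, all layer sizes, and the number of material classes are assumptions for the example, not values from the thesis.

```python
# Illustrative sketch only: a small 1D CNN over a spatiotemporal tactile
# sweep, NOT the TactNet II architecture from the thesis.
# Assumed layout: 16 taxels x 1000 time steps = 16000 raw values.
import torch
import torch.nn as nn

NUM_TAXELS = 16        # assumed number of skin taxels (input channels)
NUM_TIMESTEPS = 1000   # assumed sweep length, so 16 * 1000 = 16000 inputs
NUM_MATERIALS = 36     # hypothetical number of material classes

class TactileCNN(nn.Module):
    """Toy convolutional classifier for raw, noisy tactile sweeps."""
    def __init__(self) -> None:
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(NUM_TAXELS, 32, kernel_size=7, stride=2, padding=3),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, stride=2, padding=2),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # pool over time -> fixed-size feature
        )
        self.classifier = nn.Linear(64, NUM_MATERIALS)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, NUM_TAXELS, NUM_TIMESTEPS) raw spatiotemporal signal
        return self.classifier(self.features(x).squeeze(-1))

if __name__ == "__main__":
    model = TactileCNN()
    sweep = torch.randn(1, NUM_TAXELS, NUM_TIMESTEPS)  # one simulated sweep
    print(model(sweep).shape)  # -> torch.Size([1, NUM_MATERIALS])
```

A transfer-learning variant, as described in the abstract, would pretrain such a feature extractor on one tactile dataset and fine-tune only the last layers on a few samples of new materials; the details of how the thesis does this are not given here.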
elib URL of this record: https://elib.dlr.de/185791/
Document type: Thesis (Dissertation)
Title: Bringing a Humanoid Robot Closer to Human Versatility: Hard Realtime Software Architecture and Deep Learning Based Tactile Sensing
Authors: Bäuml, Berthold
Date: January 2019
Refereed publication: Yes
Open Access: No
Status: published
Keywords: humanoid robots, versatility, Realtime Software Architecture, Deep Learning, Tactile Sensing, Justin
Institution: Universität Bremen
Department: Mathematik und Informatik
HGF research field: Luftfahrt, Raumfahrt und Verkehr
HGF programme: Raumfahrt
HGF programme topic: Robotik
DLR focus area: Raumfahrt
DLR research area: R RO - Robotik
DLR subarea (project): R - Intuitive Mensch-Roboter Schnittstelle [RO]
Location: Oberpfaffenhofen
Institutes & facilities: Institut für Robotik und Mechatronik (ab 2013)
Deposited by: Geyer, Günther
Deposited on: 24 Mar 2022 14:15
Last modified: 24 Mar 2022 14:15