Nissler, Christian; Mouriki, Nikoleta; Castellini, Claudio (2016) Optical Myography: Detecting Finger Movements by Looking at the Forearm. Frontiers in Neurorobotics. Frontiers Media S.A. doi: 10.3389/fnbot.2016.00003. ISSN 1662-5218.
PDF (1 MB)
Official URL: http://journal.frontiersin.org/article/10.3389/fnbot.2016.00003/full
Abstract
One of the crucial problems found in the scientific community of assistive/rehabilitation robotics nowadays is that of automatically detecting what a disabled subject (for instance, a hand amputee) wants to do, exactly when she wants to do it, and strictly for the time she wants to do it. This problem, commonly called “intent detection,” has traditionally been tackled using surface electromyography, a technique which suffers from a number of drawbacks, including the changes in the signal induced by sweat and muscle fatigue. With the advent of realistic, physically plausible augmented- and virtual-reality environments for rehabilitation, this approach does not suffice anymore. In this paper, we explore a novel method to solve the problem, which we call Optical Myography (OMG). The idea is to visually inspect the human forearm (or stump) to reconstruct what fingers are moving and to what extent. In a psychophysical experiment involving ten intact subjects, we used visual fiducial markers (AprilTags) and a standard web camera to visualize the deformations of the surface of the forearm, which then were mapped to the intended finger motions. As ground truth, a visual stimulus was used, avoiding the need for finger sensors (force/position sensors, datagloves, etc.). Two machine-learning approaches, a linear and a non-linear one, were comparatively tested in settings of increasing realism. The results indicate an average error in the range of 0.05–0.22 (root mean square error normalized over the signal range), in line with similar results obtained with more mature techniques such as electromyography. If further successfully tested in the large, this approach could lead to vision-based intent detection of amputees, with the main application of letting such disabled persons dexterously and reliably interact in an augmented-/virtual-reality setup.
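The regression step sketched in the abstract (mapping marker deformations on the forearm to intended finger motions, evaluated with a root mean square error normalized over the signal range) can be illustrated with a short Python sketch. The concrete models below (ridge regression as the linear method, kernel ridge regression as the non-linear one) and the synthetic data are assumptions made for illustration; the paper's actual features come from AprilTag detections in webcam images.

```python
# Minimal sketch of the OMG regression step: map per-frame fiducial-marker
# coordinates on the forearm to finger activations, then score with a
# normalized RMSE. The choice of ridge / kernel ridge models and the
# synthetic data are illustrative assumptions, not the authors' exact pipeline.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)

# Stand-in features: (x, y) image coordinates of 10 markers over 2000 frames.
n_frames, n_markers, n_fingers = 2000, 10, 5
X = rng.normal(size=(n_frames, 2 * n_markers))

# Stand-in ground truth: finger activations in [0, 1] driven by the visual
# stimulus (no data glove needed), here simulated as a smooth function of X.
W = rng.normal(size=(2 * n_markers, n_fingers))
y = 0.5 * (np.tanh(X @ W) + 1.0)

# Train/test split in time, as one would for streaming camera data.
split = int(0.7 * n_frames)
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

def nrmse(y_true, y_pred):
    """Root mean square error normalized over the signal range, per finger."""
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2, axis=0))
    span = y_true.max(axis=0) - y_true.min(axis=0)
    return rmse / span

linear = Ridge(alpha=1.0).fit(X_tr, y_tr)
nonlinear = KernelRidge(alpha=1.0, kernel="rbf", gamma=0.05).fit(X_tr, y_tr)

print("linear NRMSE per finger:    ", nrmse(y_te, linear.predict(X_te)))
print("non-linear NRMSE per finger:", nrmse(y_te, nonlinear.predict(X_te)))
```

In this sketch the normalized RMSE plays the role of the 0.05 to 0.22 figure reported in the abstract; with real marker trajectories the features would be the tracked tag positions rather than random draws.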
elib URL of this record: https://elib.dlr.de/105316/
Document type: Journal article
Title: Optical Myography: Detecting Finger Movements by Looking at the Forearm
Authors: Nissler, Christian; Mouriki, Nikoleta; Castellini, Claudio
Date: 11 April 2016
Published in: Frontiers in Neurorobotics
Refereed publication: Yes
Open Access: Yes
Gold Open Access: Yes
In SCOPUS: Yes
In ISI Web of Science: Yes
DOI: 10.3389/fnbot.2016.00003
Editors:
Publisher: Frontiers Media S.A.
ISSN: 1662-5218
Status: published
Keywords: rehabilitation robotics, human–machine interface, hand prostheses, computer vision, myography
HGF Research Field: Aeronautics, Space and Transport
HGF Programme: Space
HGF Programme Topic: Technology for Space Systems
DLR Focus Area: Space
DLR Research Area: R SY - Technology for Space Systems
DLR Sub-Area (Project): R - Vorhaben Multisensorielle Weltmodellierung (alt)
Location: Oberpfaffenhofen
Institutes and Facilities: Institut für Robotik und Mechatronik (ab 2013) > Perzeption und Kognition; Institut für Robotik und Mechatronik (ab 2013) > Kognitive Robotik
Deposited by: Nissler, Christian
Deposited on: 19 Jul 2016 09:21
Last modified: 03 Nov 2023 07:51