
Learning and designing shared control skills from demonstrations for assistive robots

Quere, Gabriel (2024) Learning and designing shared control skills from demonstrations for assistive robots. Dissertation, Institut Polytechnique de Paris.

The full text of this item is not available from this archive.

Abstract

Assistive robots, such as wheelchair-mounted robotic arms, hold great potential for supporting individuals with limited physical abilities by helping them interact with their environment. These robots can enhance users' autonomy by assisting with daily activities such as eating, drinking, or opening doors. However, controlling such robotic arms through accessible interfaces (such as joysticks, sip-and-puff devices, or buttons) can be challenging, as these interfaces are lower-dimensional than the control space of the robot. Although autonomous task execution is an active area of research, involving humans in the control loop increases users' agency, leverages their situational awareness, and improves the robustness of the system. To operate these systems more easily, the development of intuitive controls and effective user interfaces is therefore essential. This thesis introduces a new framework called "Shared Control Templates", which provides task-specific assistance through shared control. It explores methods for designing shared control skills that can reliably assist users in completing tasks successfully, while ensuring ease of use and user control of key motions. For instance, the user controls the quantity of liquid poured when preparing a drink, or how far a drawer is pulled open. To achieve this, task-specific skills -- acting in task space -- are represented as finite-state machines, with transitions triggered by factors such as distances, wrenches generated by environmental contact, or user triggers. These skills comprise two key components: input mappings and active constraints. Input mappings define how user commands translate into robot motions, while active constraints enforce geometric limits on the robot's end-effector task space, guiding the user and maintaining safe operation.
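The skill structure described above -- a finite-state machine whose states pair an input mapping with an active constraint, and whose transitions fire on distances or user triggers -- can be illustrated with a minimal sketch. All names, axes, and thresholds here are illustrative assumptions for a one-state "approach" skill, not taken from the dissertation:

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

# Hypothetical sketch of one state of a shared control skill: the user's
# low-dimensional command is mapped to a task-space velocity, the active
# constraint clamps it geometrically, and a transition condition decides
# when to hand over to the next state.

@dataclass
class SkillState:
    name: str
    input_mapping: Callable[[List[float]], List[float]]          # user cmd -> velocity
    active_constraint: Callable[[List[float], List[float]], List[float]]
    transition: Callable[[List[float], List[float]], Optional[str]]

def approach_mapping(cmd):
    # 1-DoF interface: a single axis commands motion along x only
    return [cmd[0], 0.0, 0.0]

def approach_constraint(vel, pos):
    # Active constraint (assumed): forbid forward motion beyond x = 0.3 m
    if pos[0] >= 0.3 and vel[0] > 0.0:
        return [0.0, vel[1], vel[2]]
    return vel

def approach_transition(pos, cmd):
    # Transition on a user trigger (cmd[1]) once the goal plane is reached
    return "grasp" if pos[0] >= 0.3 and cmd[1] > 0.5 else None

states = {"approach": SkillState("approach", approach_mapping,
                                 approach_constraint, approach_transition)}

def step(state_name, pos, cmd, dt=0.1):
    s = states[state_name]
    vel = s.active_constraint(s.input_mapping(cmd), pos)
    pos = [p + v * dt for p, v in zip(pos, vel)]
    nxt = s.transition(pos, cmd)
    return (nxt or state_name), pos

pos, name = [0.0, 0.0, 0.0], "approach"
for _ in range(50):                      # user pushes forward; constraint stops at 0.3 m
    name, pos = step(name, pos, [1.0, 0.0])
name, pos = step(name, pos, [0.0, 1.0])  # user trigger fires the transition
print(name, round(pos[0], 2))            # grasp 0.3
```

Even in this toy version, the division of labor is visible: the mapping decides what the user's input means, the constraint guarantees the motion stays admissible no matter how long the user pushes, and the transition encodes when the skill advances.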
For example, when pouring liquid from a bottle, the robot prevents spilling by controlling the bottle's position and partial orientation, while the user determines how much liquid is poured by controlling the tilt angle. To facilitate the design of such shared control skills, this research explores semi-automatic learning of these skills from demonstrated end-effector trajectories. Geometric shapes in Euclidean space are used as constraints in tasks such as opening drawers or cabinet doors. A library of pre-existing skills is then leveraged to accelerate the skill designer's creation of new skills. A probabilistic model -- Kernelized Movement Primitives -- is investigated to enable the derivation of input mappings and active constraints. This model additionally allows the adaptation of skills based on user input, enhancing both the design and execution phases. This shared control method is integrated with an assistive robot featuring a world model, user interfaces, and whole-body coordination of the entire system. It enables able-bodied and motor-impaired people to accomplish sequences of activities of daily living in various settings, such as a DLR-RMC laboratory, participants' homes, and the 2023 CYBATHLON Challenges and 2024 CYBATHLON.
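The pouring example above can likewise be sketched in a few lines: the skill holds the bottle's position and two orientation axes fixed, while a scalar user input drives only the tilt, clamped to a safe range. The pose layout, gain, and tilt limit are assumptions for illustration, not values from the thesis:

```python
import math

# Hypothetical sketch of the pouring skill: position (x, y, z) and the
# roll/yaw angles are locked by the skill; the user's 1-D command changes
# only the pitch (tilt), and an active constraint clamps it to a no-spill
# range. TILT_MAX and the 0.05 rad/step gain are illustrative.

TILT_MAX = math.radians(120.0)

def pour_mapping(user_cmd, pose):
    """Map a scalar user command to a change in tilt only."""
    x, y, z, roll, pitch, yaw = pose
    pitch = min(max(pitch + 0.05 * user_cmd, 0.0), TILT_MAX)  # active constraint
    return (x, y, z, roll, pitch, yaw)  # everything else stays fixed

pose = (0.4, 0.0, 0.3, 0.0, 0.0, 0.0)
for _ in range(100):            # user holds the "tilt" input the whole time
    pose = pour_mapping(1.0, pose)
print(round(math.degrees(pose[4]), 1))  # 120.0: clamped despite continued input
```

The point of the sketch is the asymmetry the abstract describes: the user retains authority over the task-relevant quantity (how much is poured), while the skill guarantees the safety-relevant invariants (no spilling, no dropped bottle) regardless of the input stream.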

elib URL of this entry: https://elib.dlr.de/213091/
Document type: Thesis (Dissertation)
Title: Learning and designing shared control skills from demonstrations for assistive robots
Authors: Quere, Gabriel (Gabriel.Quere (at) dlr.de, ORCID: https://orcid.org/0000-0002-1788-3685, ORCID Put Code: not specified)
Date: December 2024
Published in: Learning and designing shared control skills from demonstrations for assistive robots
Open Access: No
Number of pages: 135
Status: In press
Keywords: Human-robot interaction, assistive robots, shared control, learning from demonstrations
Institution: Institut Polytechnique de Paris
HGF research area: Aeronautics, Space and Transport
HGF programme: Space
HGF programme topic: Robotics
DLR focus area: Space
DLR research field: R RO - Robotics
DLR subfield (project): R - Terrestrial Assistance Robotics
Location: Oberpfaffenhofen
Institutes & facilities: Institut für Robotik und Mechatronik (from 2013) > Kognitive Robotik
Deposited by: Quere, Gabriel
Deposited on: 06 Mar 2025 10:36
Last modified: 06 Mar 2025 10:36
