
Integrating Eye- and Mouse-Tracking with Assistant Based Speech Recognition for Interaction at Controller Working Positions

Ohneiser, Oliver and Adamala, Jyothsna and Salomea, Ioan-Teodor (2021) Integrating Eye- and Mouse-Tracking with Assistant Based Speech Recognition for Interaction at Controller Working Positions. Aerospace, 8 (9), 245. Multidisciplinary Digital Publishing Institute (MDPI). doi: 10.3390/aerospace8090245. ISSN 2226-4310.

Full text not available from this repository.

Official URL: https://www.mdpi.com/2226-4310/8/9/245


Assistant based speech recognition (ABSR) prototypes for air traffic controllers have been shown to reduce controller workload and, as a result, aircraft flight times. However, two aspects of ABSR could be improved to enhance these benefits: (1) the controller commands predicted by the speech recognition engine could be more accurate, and (2) the controller's confirmation of ABSR recognition output, such as callsigns, command types, and values, could be less intrusive. Both tasks can be supported by unobtrusive eye- and mouse-tracking, using operators' gaze and interaction data. First, the probabilities of predicted commands should take the controller's visual focus on the situation data display into account. Controllers are more likely to issue commands to aircraft they are focusing on, or whose labels they have recently interacted with via the mouse. Furthermore, they are more likely to issue certain command types depending on the characteristics of the aircraft being scanned. Second, eye-tracking can determine, instead of additional mouse clicks, whether the displayed ABSR output has been checked by the controller and remains uncorrected for a certain amount of time. In that case, the output is assumed to be correct and can be used by other air traffic control systems, e.g., short-term conflict alert. If the ABSR output remains unchecked, an attention guidance functionality triggers different escalation levels of visual cues. In a one-shot experimental case study with two controllers using the two implemented techniques, (1) command prediction probabilities improved by a factor of four, (2) prediction error rates, based on an accuracy metric for the three most probable aircraft, decreased by a factor of 25 when combining eye- and mouse-tracking data, and (3) visual confirmation of ABSR output promises to be an alternative to manual confirmation.
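The reweighting idea described above can be illustrated with a minimal sketch. The paper does not publish its formulas, so the structure below is a hypothetical interpretation: each aircraft's prior command probability from the prediction engine is boosted by recent gaze dwell time and by mouse interaction, then renormalized. The names (`AircraftContext`, `reweight`) and the tuning constants `GAZE_WEIGHT` and `CLICK_BOOST` are illustrative assumptions, not values from the study.

```python
from dataclasses import dataclass

@dataclass
class AircraftContext:
    callsign: str
    base_prob: float      # prior from the command prediction engine
    gaze_dwell_s: float   # recent gaze dwell time on this aircraft's label (seconds)
    mouse_clicked: bool   # recent mouse interaction with the label

# Assumed tuning constants (not from the paper).
GAZE_WEIGHT = 0.5
CLICK_BOOST = 2.0

def reweight(contexts):
    """Boost each aircraft's prior by gaze dwell and mouse clicks, then renormalize."""
    scores = {}
    for c in contexts:
        score = c.base_prob * (1.0 + GAZE_WEIGHT * c.gaze_dwell_s)
        if c.mouse_clicked:
            score *= CLICK_BOOST
        scores[c.callsign] = score
    total = sum(scores.values())
    return {cs: s / total for cs, s in scores.items()}

# Example: equal priors, but one aircraft was gazed at and clicked.
probs = reweight([
    AircraftContext("DLH123", 0.2, 2.0, True),
    AircraftContext("BAW45", 0.2, 0.0, False),
])
```

Under this sketch, the focused aircraft ends up with a much higher predicted probability, mirroring the paper's observation that controllers are more likely to address aircraft they visually attend to or interact with.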

Item URL in elib: https://elib.dlr.de/143892/
Document Type: Article
Title: Integrating Eye- and Mouse-Tracking with Assistant Based Speech Recognition for Interaction at Controller Working Positions
Authors: Ohneiser, Oliver (ORCID: https://orcid.org/0000-0002-5411-691X)
Date: 3 September 2021
Journal or Publication Title: Aerospace
Refereed publication: Yes
Open Access: Yes
Gold Open Access: Yes
In ISI Web of Science: Yes
Publisher: Multidisciplinary Digital Publishing Institute (MDPI)
Series Name: MDPI
Keywords: air traffic controller; human machine interaction; multimodality; eye-tracking; mouse-tracking; automatic speech recognition; controller command prediction; attention guidance
HGF - Research field: Aeronautics, Space and Transport
HGF - Program: Aeronautics
HGF - Program Themes: Air Transportation and Impact
DLR - Research area: Aeronautics
DLR - Program: L AI - Air Transportation and Impact
DLR - Research theme (Project): L - Human Factors
Location: Braunschweig
Institutes and Institutions: Institute of Flight Guidance > Controller Assistance
Deposited By: Ohneiser, Oliver
Deposited On: 14 Sep 2021 10:38
Last Modified: 14 Sep 2021 10:38

