
Operationalizing AI explainability using interpretability cues in the cockpit: Insights from User-Centered Development of the Intelligent Pilot Advisory System

Würfel, Jakob and Papenfuß, Anne and Wies, Matthias (2024) Operationalizing AI explainability using interpretability cues in the cockpit: Insights from User-Centered Development of the Intelligent Pilot Advisory System. In: 26th International Conference on Human-Computer Interaction, 2024-06-29 - 2024-07-04, Washington, USA. Springer, Cham. doi: 10.1007/978-3-031-60606-9_17. ISBN 978-3-031-60606-9.

PDF (504 kB) - Only accessible within DLR

Official URL: https://link.springer.com/chapter/10.1007/978-3-031-60606-9_17#citeas

Abstract

This paper presents a concept for operationalizing Artificial Intelligence (AI) explainability for the Intelligent Pilot Advisory System (IPAS), as called for in the European Aviation Safety Agency’s AI Roadmap 2.0 to meet the requirement of Trustworthy AI. The IPAS is currently being developed to provide AI-based decision support in commercial aircraft, assisting the flight crew especially in emergency situations. The development of the IPAS follows a user-centred and exploratory design approach, with airline pilots actively involved in the early stages of development to iteratively tailor the system to their requirements. The concept presented in this paper aims to provide interpretability cues that achieve “operational explainability of AI”, which should enable commercial aircraft pilots to understand and adequately trust AI-generated recommendations when making decisions in emergencies. The focus of the research was to identify initial interpretability requirements and to answer the question of which interpretability cues pilots need from the AI-based system. Based on a user study with airline pilots, four requirements for interpretability cues were formulated. These results will form the basis for the next iteration of the IPAS, in which the requirements will be implemented.

elib URL of the entry: https://elib.dlr.de/201930/
Document type: Conference contribution (lecture)
Title: Operationalizing AI explainability using interpretability cues in the cockpit: Insights from User-Centered Development of the Intelligent Pilot Advisory System
Authors:
Würfel, Jakob | Jakob.Wuerfel (at) dlr.de | ORCID: https://orcid.org/0009-0009-0231-1092 | ORCID Put Code: 161423161
Papenfuß, Anne | Anne.Papenfuss (at) dlr.de | ORCID: https://orcid.org/0000-0002-0686-7006 | ORCID Put Code: 161423162
Wies, Matthias | Matthias.Wies (at) dlr.de | ORCID: https://orcid.org/0000-0001-6514-3211 | ORCID Put Code: 161423163
Date: 1 June 2024
Refereed publication: Yes
Open Access: No
Gold Open Access: No
In SCOPUS: No
In ISI Web of Science: No
Volume: 14734
DOI: 10.1007/978-3-031-60606-9_17
Page range: pp. 297-313
Publisher: Springer, Cham
Series name: Lecture Notes in Computer Science
ISBN: 978-3-031-60606-9
Status: published
Keywords: Ethical and trustworthy AI, Human-Centered AI, Human-AI Teaming, Explainable AI, Interpretable AI
Event title: 26th International Conference on Human-Computer Interaction
Event location: Washington, USA
Event type: international conference
Event start: 29 June 2024
Event end: 4 July 2024
HGF - Research area: Aeronautics, Space and Transport
HGF - Program: Aeronautics
HGF - Program theme: Air Transport and Impact
DLR - Focus area: Aeronautics
DLR - Research area: L AI - Air Transport and Impact
DLR - Sub-area (project, task): L - Human Factors
Location: Braunschweig
Institutes & facilities: Institut für Flugführung > Systemergonomie
Deposited by: Würfel, Jakob
Deposited on: 12 Jun 2024 08:30
Last modified: 14 Jun 2024 10:53
