
Towards Explainable AI: Interpreting Soil Organic Carbon Prediction Models Using a Learning-Based Explanation Method

Kakhani, Nafiseh and Taghizadeh-Mehrjardi, Ruhollah and Omarzadeh, Davoud and Ryo, Masahiro and Heiden, Uta and Scholten, Thomas (2025) Towards Explainable AI: Interpreting Soil Organic Carbon Prediction Models Using a Learning-Based Explanation Method. European Journal of Soil Science, 76, pp. 1-18. Wiley. doi: 10.1111/ejss.70071. ISSN 1351-0754.

PDF - Publisher's version (published version)
4MB

Official URL: https://bsssjournals.onlinelibrary.wiley.com/doi/full/10.1111/ejss.70071

Abstract

An understanding of the key factors and processes influencing the variability of soil organic carbon (SOC) is essential for the development of effective policies aimed at enhancing carbon storage in soils to mitigate climate change. In recent years, complex computational approaches from the field of machine learning (ML) have been developed for modelling and mapping SOC in various ecosystems and over large areas. However, in order to understand the processes that account for SOC variability from ML models and to serve as a basis for new scientific discoveries, the predictions made by these data-driven models must be accurately explained and interpreted. In this research, we introduce a novel explanation approach applicable to any ML model and investigate the significance of environmental features to explain SOC variability across Germany. The methodology employed in this study involves training multiple ML models using SOC content measurements from the LUCAS dataset and incorporating environmental features derived from Google Earth Engine (GEE) as explanatory variables. Thereafter, an explanation model is applied to elucidate what the ML models have learned about the relationship between environmental features and SOC content in a supervised manner. In our approach, a post hoc model is trained to estimate the contribution of specific inputs to the outputs of the trained ML models. The results of this study indicate that different classes of ML models rely on interpretable but distinct environmental features to explain SOC variability. Decision tree-based models, such as random forest (RF) and gradient boosting, highlight the importance of topographic features. Conversely, soil chemical information, particularly pH, is crucial for the performance of neural networks and linear regression models. Therefore, interpreting data-driven studies requires a carefully structured approach, guided by expert knowledge and a deep understanding of the models being analysed.
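
To make the post hoc idea concrete in generic terms, the sketch below trains a black-box regressor on synthetic stand-in data and then fits a simple interpretable surrogate to the black box's predictions, reading per-feature contribution estimates from the surrogate's coefficients. This is a minimal illustration under stated assumptions, not the paper's learning-based explanation method; the feature names, data, and model choices are hypothetical placeholders.

# Minimal sketch (hypothetical data and features, not the paper's implementation):
# train a black-box SOC predictor, then train a post hoc surrogate on its
# predictions to approximate each input's contribution to the output.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
features = ["elevation", "slope", "ndvi", "soil_ph", "precipitation"]  # hypothetical covariates
X = rng.normal(size=(500, len(features)))
y = 2.0 * X[:, 3] + 0.5 * X[:, 0] + rng.normal(scale=0.1, size=500)    # synthetic SOC proxy

# 1) Train the black-box predictor (here a random forest, one of the model classes studied).
black_box = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# 2) Train a post hoc explanation model on the black box's predictions; with standardized
#    inputs, its coefficients serve as rough global per-feature contribution estimates.
X_std = StandardScaler().fit_transform(X)
surrogate = LinearRegression().fit(X_std, black_box.predict(X))

for name, coef in sorted(zip(features, surrogate.coef_), key=lambda t: -abs(t[1])):
    print(f"{name:>15}: {coef:+.3f}")

A linear surrogate is only one possible choice of explanation model; the key point is that the explanation is learned from the trained predictor's outputs rather than from the ground-truth labels.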

elib URL of this record: https://elib.dlr.de/214333/
Document type: Journal article
Title: Towards Explainable AI: Interpreting Soil Organic Carbon Prediction Models Using a Learning-Based Explanation Method
Authors:
Author | Institution or Email Address | Author ORCID iD | ORCID Put Code
Kakhani, Nafiseh | nafiseh.kakhani (at) uni-tuebingen.de | NOT SPECIFIED | NOT SPECIFIED
Taghizadeh-Mehrjardi, Ruhollah | University of Tübingen | NOT SPECIFIED | NOT SPECIFIED
Omarzadeh, Davoud | Universitat Oberta de Catalunya, Barcelona | NOT SPECIFIED | NOT SPECIFIED
Ryo, Masahiro | Leibniz Centre for Agricultural Landscape Research (ZALF), Müncheberg, Germany | NOT SPECIFIED | NOT SPECIFIED
Heiden, Uta | uta.heiden (at) dlr.de | https://orcid.org/0000-0002-3865-1912 | NOT SPECIFIED
Scholten, Thomas | University of Tübingen | NOT SPECIFIED | NOT SPECIFIED
Date: January 2025
Published in: European Journal of Soil Science
Refereed publication: Yes
Open Access: Yes
Gold Open Access: No
In SCOPUS: Yes
In ISI Web of Science: Yes
Volume: 76
DOI: 10.1111/ejss.70071
Page range: pp. 1-18
Publisher: Wiley
ISSN: 1351-0754
Status: published
Keywords: explainable AI, Germany, Google Earth Engine, HLS product, remote sensing, soil organic carbon
HGF Research Area: Aeronautics, Space and Transport
HGF Program: Space
HGF Program Theme: Earth Observation
DLR Focus Area: Space
DLR Research Area: R EO - Earth Observation
DLR Subtopic (Project): R - Optical Remote Sensing
Location: Oberpfaffenhofen
Institutes & Facilities: Institut für Methodik der Fernerkundung > Abbildende Spektroskopie
Deposited by: Heiden, Dr. rer. nat. Uta
Deposited on: 06 Jun 2025 11:21
Last modified: 07 Aug 2025 14:28

