
A Newly Developed Algorithm for Cloud Shadow Detection - TIP Method

Zekoll, Viktoria and de los Reyes, Raquel and Richter, Rudolf (2022) A Newly Developed Algorithm for Cloud Shadow Detection - TIP Method. Remote Sensing, 14 (2922), pages 1-15. Multidisciplinary Digital Publishing Institute (MDPI). doi: 10.3390/rs14122922. ISSN 2072-4292.

PDF - Publisher's version (published edition), 7MB

Official URL: https://mdpi-res.com/d_attachment/remotesensing/remotesensing-14-02922/article_deploy/remotesensing-14-02922-v2.pdf?version=1655716583

Abstract

The masking of cloud shadows in optical satellite imagery is an important step in automated processing chains. A new method for cloud shadow detection in multispectral satellite images, the TIP method, is presented and compared to current methods. The TIP method is based on the evaluation of thresholds, indices and projections. Most state-of-the-art methods rely solely on one of these evaluation steps or on a complex working mechanism. Instead, the new method incorporates all three basic evaluation steps into one algorithm for easy and accurate cloud shadow detection. Furthermore, the performance of the masking algorithms provided by the software packages ATCOR ("Atmospheric Correction") and PACO ("Python-based Atmospheric Correction") is compared with that of the newly implemented TIP method on a set of 20 Sentinel-2 scenes distributed over the globe, covering a wide variety of environments and climates. Each masking package includes a cloud shadow class, but employs different rules and class-specific thresholds. Classification results are compared to the assessment of an expert human interpreter, whose class assignment is taken as the reference, or "truth". The overall accuracies for the cloud shadow class of ATCOR and PACO (including TIP) over the difference areas of the selected scenes are 70.4% and 76.6%, respectively. The difference area encompasses the parts of the classification image where the classification maps disagree. User and producer accuracies for the cloud shadow class are strongly scene-dependent, typically varying between 45% and 95%. The experimental results show that the proposed TIP method, based on thresholds, indices and projections, achieves improved cloud shadow detection performance.
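
To make the three evaluation steps and the reported accuracy measures concrete, the sketch below shows how a threshold, an index and a solar projection could be combined into a shadow mask, and how user/producer accuracy is computed against an interpreter's "truth" mask. This is a minimal illustration, not the published TIP algorithm: the band choices, thresholds, fixed cloud height, and geometry conventions are all placeholder assumptions.

```python
import numpy as np

def tip_style_shadow_mask(nir, green, cloud_mask, sun_azimuth_deg,
                          sun_zenith_deg, pixel_size_m=20.0,
                          cloud_height_m=1500.0, nir_thresh=0.12,
                          ndwi_thresh=0.1):
    """Toy threshold + index + projection shadow mask. Thresholds and the
    fixed cloud height are illustrative placeholders, not TIP's values."""
    # Threshold step: cloud shadows are dark in the NIR reflectance.
    dark = nir < nir_thresh
    # Index step: NDWI separates water, which is also dark in the NIR.
    ndwi = (green - nir) / (green + nir + 1e-6)
    not_water = ndwi < ndwi_thresh
    # Projection step: shift the cloud mask along the anti-solar direction
    # by the ground distance a cloud at cloud_height_m would cast.
    dist_px = cloud_height_m * np.tan(np.radians(sun_zenith_deg)) / pixel_size_m
    drow = int(round(dist_px * np.cos(np.radians(sun_azimuth_deg))))
    dcol = int(round(-dist_px * np.sin(np.radians(sun_azimuth_deg))))
    projected = np.roll(cloud_mask, (drow, dcol), axis=(0, 1))  # wraps at borders
    # Flag a pixel only where all three evaluation steps agree.
    return dark & not_water & projected

def user_producer_accuracy(reference, predicted):
    """User accuracy (precision) and producer accuracy (recall) for the
    cloud shadow class, against an interpreter's reference mask."""
    tp = np.sum(reference & predicted)
    user = tp / max(np.sum(predicted), 1)      # correct among predicted shadow
    producer = tp / max(np.sum(reference), 1)  # found among reference shadow
    return user, producer
```

In the comparison described above, such accuracies would be evaluated only over the difference area, i.e. the pixels where the ATCOR and PACO classification maps disagree.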

elib URL of the entry: https://elib.dlr.de/192992/
Document Type: Journal Article
Title: A Newly Developed Algorithm for Cloud Shadow Detection - TIP Method
Authors:
  Zekoll, Viktoria (Viktoria.Zekoll (at) dlr.de), ORCID: not specified, ORCID Put Code: not specified
  de los Reyes, Raquel (Raquel.delosReyes (at) dlr.de), ORCID: https://orcid.org/0000-0003-0485-9552, ORCID Put Code: not specified
  Richter, Rudolf (Rudolf.Richter (at) dlr.de), ORCID: not specified, ORCID Put Code: not specified
Date: June 2022
Published in: Remote Sensing
Refereed publication: Yes
Open Access: Yes
Gold Open Access: Yes
In SCOPUS: Yes
In ISI Web of Science: Yes
Volume: 14
DOI: 10.3390/rs14122922
Page Range: pages 1-15
Publisher: Multidisciplinary Digital Publishing Institute (MDPI)
ISSN: 2072-4292
Status: published
Keywords: Sentinel-2; cloud shadow masking; TIP method; PACO; ATCOR
HGF - Research Area: Aeronautics, Space and Transport
HGF - Program: Space
HGF - Program Theme: Earth Observation
DLR - Research Area: Space
DLR - Research Field: R EO - Earth Observation
DLR - Sub-area (Project, Initiative): R - Optical Remote Sensing
Location: Oberpfaffenhofen
Institutes & Facilities: Institut für Methodik der Fernerkundung > Photogrammetrie und Bildanalyse
Deposited By: Knickl, Sabine
Deposited On: 12 Jan 2023 17:29
Last Modified: 19 Oct 2023 12:51
