
Learning Spectral-Spatial-Temporal Features via a Recurrent Convolutional Neural Network for Change Detection in Multispectral Imagery

Mou, Lichao and Bruzzone, Lorenzo and Zhu, Xiao Xiang (2019) Learning Spectral-Spatial-Temporal Features via a Recurrent Convolutional Neural Network for Change Detection in Multispectral Imagery. IEEE Transactions on Geoscience and Remote Sensing, 57 (2), pages 924-935. IEEE - Institute of Electrical and Electronics Engineers. doi: 10.1109/TGRS.2018.2863224. ISSN 0196-2892.

This archive cannot provide the full text of this item.

Official URL: https://ieeexplore.ieee.org/document/8541102

Abstract

Change detection is one of the central problems in earth observation and has been extensively investigated over recent decades. In this paper, we propose a novel recurrent convolutional neural network (ReCNN) architecture that is trained to learn a joint spectral-spatial-temporal feature representation in a unified framework for change detection in multispectral images. To this end, we bring together a convolutional neural network (CNN) and a recurrent neural network (RNN) into one end-to-end network. The former generates rich spectral-spatial feature representations, while the latter effectively analyzes the temporal dependency in bi-temporal images. In comparison with previous approaches to change detection, the proposed network architecture possesses three distinctive properties: 1) it is end-to-end trainable, in contrast to most existing methods whose components are trained or computed separately; 2) it naturally harnesses spatial information, which has been proven beneficial to the change detection task; and 3) it is capable of adaptively learning the temporal dependency between multitemporal images, unlike most algorithms that use fairly simple operations such as image differencing or stacking. To the best of our knowledge, this is the first time that a recurrent convolutional network architecture has been proposed for multitemporal remote sensing image analysis. The proposed network is validated on real multispectral data sets. Both visual and quantitative analyses of the experimental results demonstrate the competitive performance of the proposed model.
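The record contains no code, but as an illustration of the architecture sketched in the abstract, the following is a minimal patch-wise CNN+LSTM change detector written in PyTorch (a framework assumption; the paper does not prescribe one). A shared convolutional encoder extracts spectral-spatial features from each acquisition date, and an LSTM models the temporal dependency across the two dates before a change/no-change classification. All names (ReCNNSketch, in_bands, feat_dim) and layer sizes are illustrative and do not reproduce the authors' published configuration.

# Minimal sketch of a recurrent convolutional change detector (assumed PyTorch setup).
import torch
import torch.nn as nn


class ReCNNSketch(nn.Module):
    """Shared CNN encoder per date + LSTM over the two time steps (illustrative)."""

    def __init__(self, in_bands=4, feat_dim=128, hidden_dim=128, num_classes=2):
        super().__init__()
        # Convolutional sub-network: spectral-spatial features from one image patch.
        self.cnn = nn.Sequential(
            nn.Conv2d(in_bands, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),        # pool spatial dimensions to 1x1
            nn.Flatten(),
            nn.Linear(64, feat_dim), nn.ReLU(),
        )
        # Recurrent sub-network: temporal dependency between the two acquisitions.
        self.rnn = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, patch_t1, patch_t2):
        # patch_t1, patch_t2: (batch, bands, H, W) patches centered on the same pixel.
        f1 = self.cnn(patch_t1)             # (batch, feat_dim)
        f2 = self.cnn(patch_t2)
        seq = torch.stack([f1, f2], dim=1)  # (batch, 2, feat_dim) temporal sequence
        out, _ = self.rnn(seq)
        return self.classifier(out[:, -1])  # logits for change / no-change


if __name__ == "__main__":
    model = ReCNNSketch(in_bands=4)
    x1 = torch.randn(8, 4, 15, 15)          # bi-temporal patches for 8 pixels
    x2 = torch.randn(8, 4, 15, 15)
    print(model(x1, x2).shape)              # torch.Size([8, 2])

Training such a model end-to-end on labeled bi-temporal patches corresponds to the joint spectral-spatial-temporal feature learning described in the abstract; the authors' exact layer configuration and training details should be taken from the published article.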

elib URL of the record: https://elib.dlr.de/120596/
Document type: Journal article
Title: Learning Spectral-Spatial-Temporal Features via a Recurrent Convolutional Neural Network for Change Detection in Multispectral Imagery
Authors:
Author | Institution or e-mail address | Author ORCID iD | ORCID Put Code
Mou, Lichao | lichao.mou (at) dlr.de | UNSPECIFIED | UNSPECIFIED
Bruzzone, Lorenzo | University of Trento | UNSPECIFIED | UNSPECIFIED
Zhu, Xiao Xiang | DLR-IMF/TUM-LMF | UNSPECIFIED | UNSPECIFIED
Date: February 2019
Published in: IEEE Transactions on Geoscience and Remote Sensing
Refereed publication: Yes
Open Access: No
Gold Open Access: No
In SCOPUS: Yes
In ISI Web of Science: Yes
Volume: 57
DOI: 10.1109/TGRS.2018.2863224
Page range: pages 924-935
Publisher: IEEE - Institute of Electrical and Electronics Engineers
ISSN: 0196-2892
Status: published
Keywords: Change detection, multitemporal image analysis, recurrent convolutional neural network (ReCNN), long short-term memory (LSTM)
HGF - Research field: Aeronautics, Space and Transport
HGF - Programme: Space
HGF - Programme topic: Earth Observation
DLR - Focus area: Space
DLR - Research area: R EO - Earth Observation
DLR - Subtopic (project): R - Project high-resolution remote sensing methods (old)
Location: Oberpfaffenhofen
Institutes & Institutions: Institut für Methodik der Fernerkundung > EO Data Science
Deposited by: Mou, LiChao
Deposited on: 22 Jun 2018 12:24
Last modified: 08 Nov 2023 10:18

