Datcu, Mihai and Andrei, Vlad and Dumitru, Corneliu Octavian and Huang, Zhongling and Schwarz, Gottfried and Zhao, Juanping (2019) Explainable Deep Learning for SAR Data. Φ-week, 2019-09-09 - 2019-09-13, Frascati, Italy.
The full text of this contribution is not available in this archive.
Official URL: https://phiweek.esa.int/NikalWebsitePortal/esa-eo-phi-week-2019/phiweek/Speaker
Abstract
When single-polarization SAR images are interpreted with deep learning, texture features are usually learned automatically from the intensity alone. As products of active microwave imaging, however, complex Synthetic Aperture Radar (SAR) images contain not only the amplitude but also the phase, which is important and useful for interpretation. Time-frequency analysis (TFA) provides a physical understanding of the backscattering properties of each pixel in complex SAR images. We therefore propose a novel end-to-end deep learning framework that makes the best use of both the physical properties of the objects and the spatial texture of the images. We start with a convolutional auto-encoder to learn frequency features from each sub-spectrogram obtained by TFA, and then align them spatially. Next, the spatially aligned frequency-domain features and the low-level spatial texture features obtained from a pre-trained SAR-specific network are concatenated and fed into a post-processing residual network that learns joint spatial-frequency knowledge. Experiments were carried out on a large number of TerraSAR-X images. The proposed framework retains the full information of the complex-valued SAR images and achieves a significant improvement over other purely spatial deep learning methods for SAR image interpretation.

In order to learn the latent space that governs the backscatter values in SAR imagery, we explored the dimensionality-reduction properties of variational auto-encoders (VAEs). By taking both channels of the SAR data as input and mapping them to a compact, lower-dimensional representation, we constructed a single feature vector consisting of the parameters of the latent space. This feature vector was then fed to a classifier such as k-nearest neighbors (k-NN) or a Support Vector Machine (SVM). Experiments on Sentinel-1 GRDH data with VV/VH polarizations demonstrated the capability of this method to extract the relevant features of the images, achieving with k-NN an average precision of 0.97 and an average recall of 0.96.

Extracting physical scattering signatures from non-full-polarimetric images is of significant importance, but very challenging. To achieve this goal, and at the same time to explore the potential of polarimetric SAR (PolSAR) images with different polarization modes and their combinations for this task, we proposed a contrastive-regulated convolutional neural network (CNN) in the complex domain. The method learns a physically interpretable deep learning model from the original scattering matrices. The ground truth is computed automatically from Cloude and Pottier's H-α division plane, which makes this an unsupervised learning mechanism. To account for the ambiguous division boundaries, a contrastive regulation term is computed in the complex domain and added to the chosen loss function with a balancing trade-off coefficient. Experiments on an airborne L-band image from DLR's F-SAR sensor demonstrate the feasibility of extracting physical scattering signatures from non-full-polarimetric SAR images. Moreover, the capabilities of the different polarization modes for this task are comprehensively analyzed and discussed.
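The abstract only sketches the TFA step verbally. Purely as an illustration of the kind of preprocessing involved, and not as the authors' actual implementation, the snippet below derives coarse per-pixel sub-spectrograms by a sub-aperture decomposition of a complex single-look image; the number of sub-looks, the spectral axis, and the function name `sub_spectrograms` are assumptions.

```python
# Minimal sketch (assumed implementation, not the authors' code): split the
# spectrum of a complex SLC image into non-overlapping sub-bands and transform
# each sub-band back to the image domain. The per-pixel stack of sub-look
# intensities acts as a coarse spectrogram of the backscattering behaviour.
import numpy as np

def sub_spectrograms(slc, n_sub=4, axis=0):
    """slc: 2-D complex SAR image; returns an (H, W, n_sub) real array."""
    spec = np.fft.fftshift(np.fft.fft(slc, axis=axis), axes=axis)
    n = slc.shape[axis]
    width = n // n_sub
    looks = []
    for k in range(n_sub):
        band = np.zeros(n)
        band[k * width:(k + 1) * width] = 1.0          # select one sub-band
        shape = [1, 1]
        shape[axis] = n
        sub = spec * band.reshape(shape)               # mask the spectrum
        looks.append(np.abs(np.fft.ifft(np.fft.ifftshift(sub, axes=axis), axis=axis)))
    return np.stack(looks, axis=-1)
```

In the framework described above, features of this kind would be learned by the convolutional auto-encoder and later concatenated with the spatial texture features; the snippet only illustrates where the frequency-domain information could come from.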
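For the VAE-based branch, a minimal sketch of the pipeline is given below, assuming PyTorch and scikit-learn. The patch size (2 x 32 x 32), latent dimensionality, network layout, and the training-data variables are all illustrative assumptions; only the general idea follows the abstract: encode the two-channel (VV/VH) input, concatenate the latent mean and log-variance into one feature vector, and classify it with k-NN.

```python
# Minimal sketch (assumed, not the authors' code): a small convolutional VAE
# over 2-channel (VV/VH) patches; the latent parameters (mu, logvar) form a
# single feature vector that is classified with k-NN.
import torch
import torch.nn as nn
from sklearn.neighbors import KNeighborsClassifier

class ConvVAE(nn.Module):
    def __init__(self, latent_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(                    # assumes 2 x 32 x 32 input patches
            nn.Conv2d(2, 16, 3, stride=2, padding=1), nn.ReLU(),   # -> 16 x 16 x 16
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # -> 32 x 8 x 8
            nn.Flatten())
        self.fc_mu = nn.Linear(32 * 8 * 8, latent_dim)
        self.fc_logvar = nn.Linear(32 * 8 * 8, latent_dim)
        self.decoder = nn.Sequential(                    # mirrors the encoder
            nn.Linear(latent_dim, 32 * 8 * 8), nn.ReLU(),
            nn.Unflatten(1, (32, 8, 8)),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 2, 4, stride=2, padding=1))

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(logvar)  # reparameterization trick
        return self.decoder(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    # reconstruction error + KL divergence to the unit Gaussian prior
    rec = nn.functional.mse_loss(recon, x, reduction="sum")
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + kld

def latent_features(model, patches):
    # Feature vector = concatenated latent parameters, as described in the abstract.
    model.eval()
    with torch.no_grad():
        _, mu, logvar = model(patches)
    return torch.cat([mu, logvar], dim=1).numpy()

# Usage sketch (train_patches, train_labels, test_patches are hypothetical
# tensors prepared from tiled Sentinel-1 GRDH scenes):
#   model = ConvVAE()
#   ... train the VAE with vae_loss ...
#   knn = KNeighborsClassifier(n_neighbors=5)
#   knn.fit(latent_features(model, train_patches), train_labels)
#   predictions = knn.predict(latent_features(model, test_patches))
```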
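For the contrastive-regulated CNN, the unsupervised ground truth is derived from the Cloude-Pottier H-α plane. The sketch below shows one common way to compute per-pixel entropy H and mean alpha angle from 3x3 coherency matrices and to quantize them into the usual nine zones; the zone boundaries are the textbook values and the helper names are hypothetical, so the authors' actual labeling procedure may differ in detail.

```python
# Minimal sketch (assumed, not the authors' code): Cloude-Pottier H/alpha
# per pixel from a stack of Hermitian 3x3 coherency matrices T (N x 3 x 3),
# quantized into the classical nine zones and used as pseudo-labels.
import numpy as np

def h_alpha(T, eps=1e-12):
    eigvals, eigvecs = np.linalg.eigh(T)                 # eigvecs[n, :, i] <-> eigvals[n, i]
    eigvals = np.clip(eigvals, 0.0, None)
    p = eigvals / (eigvals.sum(axis=1, keepdims=True) + eps)   # pseudo-probabilities
    H = -(p * np.log(p + eps)).sum(axis=1) / np.log(3.0)       # entropy, base 3
    alpha_i = np.degrees(np.arccos(np.clip(np.abs(eigvecs[:, 0, :]), 0.0, 1.0)))
    alpha = (p * alpha_i).sum(axis=1)                          # mean alpha angle [deg]
    return H, alpha

def h_alpha_zones(H, alpha):
    # Division-plane boundaries as commonly quoted (variants exist in the literature).
    h_band = np.digitize(H, [0.5, 0.9])                  # 0: low, 1: medium, 2: high entropy
    lo = np.array([42.5, 40.0, 40.0])[h_band]            # lower alpha boundary per entropy band
    hi = np.array([47.5, 50.0, 55.0])[h_band]            # upper alpha boundary per entropy band
    a_band = (alpha > lo).astype(int) + (alpha > hi).astype(int)
    return 3 * h_band + a_band                           # pseudo-label in {0, ..., 8}
```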
elib URL of this entry: https://elib.dlr.de/130275/
Document type: Conference contribution (presentation)
Title: Explainable Deep Learning for SAR Data
Authors: Datcu, Mihai; Andrei, Vlad; Dumitru, Corneliu Octavian; Huang, Zhongling; Schwarz, Gottfried; Zhao, Juanping
Date: September 2019
Refereed publication: No
Open Access: No
Gold Open Access: No
In SCOPUS: No
In ISI Web of Science: No
Status: published
Keywords: Deep Neural Network, Synthetic Aperture Radar
Event title: Φ-week
Event location: Frascati, Italy
Event type: international conference
Event start date: 9 September 2019
Event end date: 13 September 2019
HGF research field: Aeronautics, Space and Transport
HGF programme: Space
HGF programme topic: Earth Observation
DLR focus area: Space
DLR research area: R EO - Earth Observation
DLR subtopic (project): R - Vorhaben hochauflösende Fernerkundungsverfahren (alt)
Location: Oberpfaffenhofen
Institutes & facilities: Institut für Methodik der Fernerkundung > EO Data Science
Deposited by: Karmakar, Chandrabali
Deposited on: 02 Dec 2019 13:27
Last modified: 24 Apr 2024 20:33