
Gamma-Net: Superresolving SAR Tomographic Inversion via Deep Learning

Qian, Kun and Wang, Yuanyuan and Shi, Yilei and Zhu, Xiao Xiang (2022) Gamma-Net: Superresolving SAR Tomographic Inversion via Deep Learning. IEEE Transactions on Geoscience and Remote Sensing, 60, pp. 1-16. IEEE - Institute of Electrical and Electronics Engineers. doi: 10.1109/TGRS.2022.3164193. ISSN 0196-2892.

PDF - Publisher's version (published version), 10 MB

Abstract

Synthetic aperture radar tomography (TomoSAR) has been extensively employed for 3-D reconstruction in dense urban areas using high-resolution SAR acquisitions. Compressive sensing (CS)-based algorithms are generally considered the state of the art in super-resolving TomoSAR, in particular in the single-look case. This superior performance comes at the cost of an extra computational burden: the sparse reconstruction cannot be solved analytically and requires computationally expensive iterative solvers. In this article, we propose a novel deep-learning-based super-resolving TomoSAR inversion approach, γ-Net, to tackle this challenge. γ-Net adopts an advanced complex-valued learned iterative shrinkage thresholding algorithm (CV-LISTA) to mimic the iterative optimization step of sparse reconstruction. Simulations show that the height estimates from a well-trained γ-Net approach the Cramér-Rao lower bound (CRLB) while improving computational efficiency by one to two orders of magnitude compared to first-order CS-based methods. γ-Net also shows no degradation in super-resolution power compared to state-of-the-art second-order TomoSAR solvers, which are much more computationally expensive than the first-order methods. Specifically, γ-Net reaches a detection rate of more than 90% in moderate super-resolving cases with 25 measurements at 6 dB SNR. Moreover, simulations with limited baselines demonstrate that the proposed algorithm outperforms the second-order CS-based method by a fair margin. Tests on real TanDEM-X data with just six interferograms also show high-quality 3-D reconstruction with a high density of detected double scatterers.
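The CV-LISTA idea in the abstract unrolls iterative soft-thresholding into a fixed number of layers with learnable weights. The sketch below is a minimal illustration of that principle, not the authors' implementation: it applies complex-valued soft-thresholding (shrink the magnitude, preserve the phase) inside an unrolled ISTA recursion, with the two weight matrices initialized from the TomoSAR steering matrix exactly as in classical ISTA. In γ-Net these matrices and the per-layer thresholds would instead be learned from training data; all names and parameter values here are illustrative.

```python
import numpy as np

def complex_soft_threshold(x, theta):
    """Complex-valued soft-thresholding: shrink |x| by theta, keep the phase."""
    mag = np.abs(x)
    scale = np.maximum(mag - theta, 0.0) / np.maximum(mag, 1e-12)
    return scale * x

def lista_forward(g, W, S, thetas):
    """Unrolled ISTA (LISTA-style) forward pass.

    g      : complex measurement vector along the baselines, shape (N,)
    W      : input matrix, shape (L, N); ISTA init is mu * R^H
    S      : mutual-inhibition matrix, shape (L, L); ISTA init is I - mu * R^H R
    thetas : per-layer thresholds (one entry per unrolled layer)
    """
    gamma = complex_soft_threshold(W @ g, thetas[0])
    for theta in thetas[1:]:
        gamma = complex_soft_threshold(W @ g + S @ gamma, theta)
    return gamma
```

With the classical ISTA initialization this is just iterative shrinkage-thresholding with a fixed step size; the point of LISTA-style training is that learned `W`, `S`, and `thetas` reach comparable accuracy in far fewer layers, which is where the one-to-two-orders-of-magnitude speedup over first-order CS solvers comes from.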

elib URL of this record: https://elib.dlr.de/187292/
Document type: Journal article
Title: Gamma-Net: Superresolving SAR Tomographic Inversion via Deep Learning
Authors:
  Author | Institution or e-mail address | Author ORCID iD | ORCID Put Code
  Qian, Kun | kun.qian (at) dlr.de | NOT SPECIFIED | NOT SPECIFIED
  Wang, Yuanyuan | yuanyuan.wang (at) dlr.de | NOT SPECIFIED | NOT SPECIFIED
  Shi, Yilei | yilei.shi (at) tum.de | NOT SPECIFIED | NOT SPECIFIED
  Zhu, Xiao Xiang | xiaoxiang.zhu (at) dlr.de | https://orcid.org/0000-0001-5530-3613 | NOT SPECIFIED
Date: April 2022
Published in: IEEE Transactions on Geoscience and Remote Sensing
Peer reviewed: Yes
Open Access: Yes
Gold Open Access: No
In SCOPUS: Yes
In ISI Web of Science: Yes
Volume: 60
DOI: 10.1109/TGRS.2022.3164193
Page range: pp. 1-16
Publisher: IEEE - Institute of Electrical and Electronics Engineers
ISSN: 0196-2892
Status: published
Keywords: complex-valued learned iterative shrinkage thresholding algorithm (LISTA), compressive sensing (CS), synthetic aperture radar (SAR) tomography (TomoSAR), super-resolution
HGF Research Area: Aeronautics, Space and Transport
HGF Program: Space
HGF Program Theme: Earth Observation
DLR Focus Area: Space
DLR Research Field: R EO - Earth Observation
DLR Subfield (project): R - SAR Methods, R - Artificial Intelligence
Location: Oberpfaffenhofen
Institutes & Facilities: Institut für Methodik der Fernerkundung > EO Data Science
Deposited by: Qian, Kun
Deposited on: 06 Jul 2022 14:14
Last modified: 19 Oct 2023 13:29

