
Automatic feature-based geometric fusion of multi-view TomoSAR point clouds in urban area

Wang, Yuanyuan and Zhu, Xiao Xiang (2015) Automatic feature-based geometric fusion of multi-view TomoSAR point clouds in urban area. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 8 (3), pp. 953-965. IEEE - Institute of Electrical and Electronics Engineers. doi: 10.1109/JSTARS.2014.2361430. ISSN 1939-1404.

Full text: PDF, 3 MB

Official URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=6942160

Abstract

Interferometric synthetic aperture radar (InSAR) techniques, such as persistent scatterer interferometry (PSI) or SAR tomography (TomoSAR), deliver three-dimensional (3-D) point clouds of the scatterers’ positions together with their motion information relative to a reference point. Due to the SAR side-looking geometry, a minimum of two point clouds from cross-heading orbits, i.e., ascending and descending, is required to achieve complete monitoring of an urban area. However, these two point clouds are usually not coregistered, because their reference points differ and have unknown 3-D positions. In general, no exactly identical points from the same physical object can be found in the two point clouds. This article describes a robust algorithm for fusing two such point clouds of urban areas. The contribution of this paper is finding the theoretically exact point correspondence, namely the end positions of façades, where the two point clouds meet. We refer to this algorithm as “L-shape detection and matching,” because façades commonly appear as L-shapes in InSAR point clouds. The algorithm introduces several features for a reliable result, including point density estimation using an adaptive directional window for better façade point detection and L-shape extraction using a weighted Hough transform. The algorithm is fully automatic. Its accuracy is evaluated using simulated data. Furthermore, the proposed method is applied to two TomoSAR point clouds over Berlin acquired from ascending and descending geometries. The result is compared with the first PSI point cloud fusion method for urban areas (S. Gernhardt and R. Bamler, “Deformation monitoring of single buildings using meter-resolution SAR data in PSI,” ISPRS J. Photogramm. Remote Sens., vol. 73, pp. 68–79, 2012). Submeter consistency is achieved.
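
The fusion step summarized in the abstract can be illustrated with a minimal sketch, which is not the authors' implementation: assuming the façade L-shape end points have already been detected and matched between the ascending and descending clouds, and assuming the mis-registration is dominated by the unknown offset between the two reference points, the geometric fusion reduces to estimating a single 3-D translation from the correspondences. The function name estimate_shift and the toy data below are illustrative assumptions (Python/NumPy).

    import numpy as np

    def estimate_shift(asc_corners, dsc_corners):
        """Estimate the 3-D offset between two TomoSAR point clouds from
        matched facade end points (rows = correspondences, columns = x, y, z).
        Illustrative sketch only: assumes the misalignment is a pure translation
        caused by the unknown reference points of the two clouds."""
        asc = np.asarray(asc_corners, dtype=float)
        dsc = np.asarray(dsc_corners, dtype=float)
        diffs = asc - dsc                    # per-correspondence offset vectors
        shift = np.median(diffs, axis=0)     # median is robust to outlier matches
        residuals = diffs - shift
        rmse = np.sqrt(np.mean(np.sum(residuals**2, axis=1)))
        return shift, rmse

    # Toy example: three matched facade end points with a known offset plus noise.
    rng = np.random.default_rng(0)
    asc = rng.uniform(0.0, 100.0, size=(3, 3))
    true_shift = np.array([1.2, -0.8, 2.5])          # hypothetical offset in meters
    dsc = asc - true_shift + rng.normal(scale=0.05, size=(3, 3))
    shift, rmse = estimate_shift(asc, dsc)
    print("estimated shift [m]:", np.round(shift, 2), " RMSE [m]:", round(rmse, 3))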

elib URL of the entry: https://elib.dlr.de/93032/
Document type: Journal article
Title: Automatic feature-based geometric fusion of multi-view TomoSAR point clouds in urban area
Authors:
Author | Institution or e-mail address | Author ORCID iD | ORCID Put Code
Wang, Yuanyuan | TUM | NOT SPECIFIED | NOT SPECIFIED
Zhu, Xiao Xiang | DLR-IMF/TUM-LMF | NOT SPECIFIED | NOT SPECIFIED
Date: 2015
Published in: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Refereed publication: Yes
Open Access: Yes
Gold Open Access: No
In SCOPUS: Yes
In ISI Web of Science: Yes
Volume: 8
DOI: 10.1109/JSTARS.2014.2361430
Page range: pp. 953-965
Publisher: IEEE - Institute of Electrical and Electronics Engineers
ISSN: 1939-1404
Status: published
Keywords: point cloud fusion, SAR tomography, SAR, TerraSAR-X
HGF - Research field: Aeronautics, Space and Transport
HGF - Program: Space
HGF - Program theme: Earth Observation
DLR - Research area: Space
DLR - Topic: R EO - Earth Observation
DLR - Sub-topic (project): R - Project High-Resolution Remote Sensing Methods (old)
Location: Oberpfaffenhofen
Institutes & facilities: Institut für Methodik der Fernerkundung > SAR-Signalverarbeitung
Deposited by: Wang, Yuanyuan
Deposited on: 03 Dec 2014 18:17
Last modified: 19 Nov 2021 20:28

