
Coregistration refinement of hyperspectral images and DSM: An object-based approach using spectral information

Avbelj, Janja and Iwaszczuk, Dorota and Müller, Rupert and Reinartz, Peter and Stilla, Uwe (2015) Coregistration refinement of hyperspectral images and DSM: An object-based approach using spectral information. ISPRS Journal of Photogrammetry and Remote Sensing, 100, pp. 23-34. Elsevier. DOI: 10.1016/j.isprsjprs.2014.05.010. ISSN 0924-2716.


Official URL: http://www.sciencedirect.com/science/article/pii/S0924271614001282


For image fusion in remote sensing applications, the georeferencing accuracy achievable from position, attitude, and camera calibration measurements can be insufficient. Image processing techniques should therefore be employed for precise coregistration of images. In this article, a method for multimodal object-based coregistration refinement between hyperspectral images (HSI) and digital surface models (DSM) is presented. The method is divided into three parts: object outline detection in HSI and DSM, matching, and determination of transformation parameters. The novelty of the proposed coregistration refinement method is the use of material properties and height information of urban objects from HSI and DSM, respectively. We refer to urban objects as objects which are typical in urban environments and focus on buildings, describing them with 2D outlines. Furthermore, the geometric accuracy of the detected building outlines is taken into account in the matching step and in the determination of transformation parameters. Hence, a stochastic model is introduced to compute optimal transformation parameters. The feasibility of the method is shown by testing it on two aerial HSI of different spatial and spectral resolution and two DSM of different spatial resolution. The evaluation is carried out by comparing the accuracies of the transformation parameters to reference parameters, determined from object outlines at much higher resolution, and by computing the correctness and quality rate of the extracted outlines before and after coregistration refinement. Results indicate that using outlines of objects instead of only line segments is advantageous for coregistration of HSI and DSM. Compared to line cue extraction, the extraction of building outlines provides a larger number of assigned lines between the images and is more robust to outliers, i.e., false matches.
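The last step the abstract describes, computing optimal transformation parameters from matched outlines while weighting points by their geometric accuracy, can be illustrated with a weighted least-squares estimate of a 2D affine transform. This is a minimal sketch of that general idea, not the authors' implementation; the function name, the choice of an affine model, and the toy weights are assumptions for illustration.

```python
import numpy as np

def estimate_affine_wls(src, dst, weights):
    """Estimate a 2D affine transform mapping src -> dst by weighted
    least squares. The weights play the role of a stochastic model:
    matched outline points with higher geometric accuracy get larger
    weight and thus more influence on the solution.

    src, dst: (n, 2) arrays of matched outline points
    weights:  (n,) array of positive weights
    Returns a (2, 3) matrix A with dst ~ src @ A[:, :2].T + A[:, 2].
    """
    n = src.shape[0]
    # Design matrix for the stacked x/y observations (2n x 6 unknowns)
    X = np.zeros((2 * n, 6))
    X[0::2, 0:2] = src
    X[0::2, 2] = 1.0
    X[1::2, 3:5] = src
    X[1::2, 5] = 1.0
    y = dst.reshape(-1)
    w = np.repeat(weights, 2)
    # Weighted normal equations: (X^T W X) p = X^T W y
    XtW = X.T * w
    p = np.linalg.solve(XtW @ X, XtW @ y)
    return p.reshape(2, 3)

# Toy check: recover a known shift of (+2, -1) applied to a square outline
square = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
shifted = square + np.array([2.0, -1.0])
A = estimate_affine_wls(square, shifted, np.ones(4))
print(np.round(A, 3))
```

With exact correspondences the solver recovers the pure translation; with noisy, unequally accurate outline points, down-weighting the less accurate matches is what makes the estimated parameters "optimal" in the weighted least-squares sense.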

Item URL in elib: https://elib.dlr.de/90424/
Document Type: Article
Title: Coregistration refinement of hyperspectral images and DSM: An object-based approach using spectral information
Authors: Name | Institution or Email | ORCID iD
Avbelj, Janja | janja.avbelj (at) dlr.de | UNSPECIFIED
Iwaszczuk, Dorota | iwaszczuk (at) bv.tum.de | UNSPECIFIED
Müller, Rupert | rupert.mueller (at) dlr.de | UNSPECIFIED
Reinartz, Peter | peter.reinartz (at) dlr.de | UNSPECIFIED
Stilla, Uwe | stilla (at) tum.de | UNSPECIFIED
Date: February 2015
Journal or Publication Title: ISPRS Journal of Photogrammetry and Remote Sensing
Refereed publication: Yes
Open Access: Yes
Gold Open Access: No
In ISI Web of Science: Yes
DOI: 10.1016/j.isprsjprs.2014.05.010
Page Range: pp. 23-34
Editors: Lichti, D. (University of Calgary, Calgary, Alberta, Canada)
Keywords: Hyperspectral, DEM/DTM, Registration, Multisensor, Urban, Matching
HGF - Research field: Aeronautics, Space and Transport
HGF - Program: Space
HGF - Program Themes: Earth Observation
DLR - Research area: Raumfahrt (Space)
DLR - Program: R EO - Erdbeobachtung (Earth Observation)
DLR - Research theme (Project): R - Vorhaben hochauflösende Fernerkundungsverfahren (high-resolution remote sensing methods project)
Location: Oberpfaffenhofen
Institutes and Institutions: Remote Sensing Technology Institute > Photogrammetry and Image Analysis
Deposited By: Avbelj, Janja
Deposited On: 29 Aug 2014 12:13
Last Modified: 06 Sep 2019 15:28

