
Artificial Intelligence for EO Sensor Fusion

Dumitru, Corneliu Octavian and Schwarz, Gottfried and Grivei, Alexandru and Datcu, Mihai (2019) Artificial Intelligence for EO Sensor Fusion. ESA Living Planet Symposium 2019, 13.-17. May 2019, Milan, Italy.

Full text not available from this repository.

Official URL: https://lps19.esa.int/NikalWebsitePortal/living-planet-symposium-2019/lps19


Currently, the field of data fusion offers a collection of routines and algorithms that can be linked and embedded in various applications. A well-known open-source toolbox is Orfeo, which provides a large number of state-of-the-art algorithms for processing SAR and multispectral images in different applications. Another is Google Earth Engine, which includes a large image database and a number of built-in (or user-supplied) algorithms for image processing. An innovative system is being developed in CANDELA, an H2020 research and innovation project (grant agreement No. 776193), one of whose objectives is the fusion of radar and multispectral images at the semantic level as well as at the feature/descriptor level. The first results will be presented during the Living Planet Symposium.

For this case, we propose to recognize different target area details in overlapping SAR and multispectral images that complement each other in rapid succession. To this end, we selected Sentinel-1 and Sentinel-2 images that can be rectified and co-aligned by publicly available toolbox routines offered by ESA, allowing a straightforward image comparison. In addition, we propose data fusion and joint interpretation. The most important aspects to be considered are:

• The Sentinel-1 C-band constellation, consisting of two radar satellites following each other along the same orbit, delivers Earth surface images from side-looking radar instruments taken during day or night. The images have a spatial resolution of typically 20x20 m. One can easily discriminate bright reflectors (e.g., edges of buildings) from dark surfaces (e.g., windless water surfaces). Brightness differences among Sentinel-1 pixels can be due to surface roughness or smoothness, edges of buildings, farming practices, etc. The main advantage of Sentinel-1 is that it delivers virtually cloud-free images.

• The Sentinel-2 twin satellite constellation delivers Earth surface images from multicolour nadir-looking optical imagers taken during daylight. These images have a spatial resolution of typically 10x10 m. Brightness differences among the colour bands can be analysed to discover land cover characteristics such as urbanisation, vegetation properties, or current sea surface dynamics; the infrared bands are used primarily for vegetation properties. In contrast to Sentinel-1, Sentinel-2 images are affected by clouds and other weather conditions.

• The fusion of Sentinel-1 and Sentinel-2 data shall allow a joint interpretation of radar backscatter and optical reflectance data. As ESA has undertaken considerable effort in the absolute calibration of all Sentinel instruments, we expect the discovery of many hitherto unseen phenomena with well-defined confidence levels.

While we are accustomed to image fusion as a radiometric combination of multispectral images, a comparably mature level of semantic fusion has not yet been reached for SAR images. To remedy this situation, we propose a semantic fusion concept for SAR images in which we combine the semantic image content of two data sets with different characteristics (e.g., TerraSAR-X and Sentinel-1). In our case, we observed several coastal areas in Europe with a high-resolution and a mid-resolution spaceborne instrument and combined their information content. By exploiting the specific imaging details and the retrievable semantic categories of the two image types, we obtained semantically fused classification maps that allow us to differentiate several coastal surface categories. To verify the classification results, we will compare the SAR images to multispectral satellite images and in-situ data.
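The semantic fusion step outlined above can be sketched as a pairing of two co-registered per-pixel label maps, so that each fused pixel encodes one category from each sensor. This is a minimal illustration only: the class names, label values, and tiny grid below are hypothetical placeholders, not the actual CANDELA taxonomy or fusion algorithm.

```python
import numpy as np

# Illustrative per-pixel semantic categories from two co-registered
# sensors (e.g., a high-resolution TerraSAR-X product and a
# mid-resolution Sentinel-1 product resampled to a common grid).
SAR_HI = {0: "water", 1: "built-up", 2: "vegetation"}
SAR_MID = {0: "smooth", 1: "rough"}

def fuse_semantics(labels_hi, labels_mid):
    """Combine two co-aligned label maps into joint category ids.

    Each output pixel encodes the pair (high-res class, mid-res class),
    so e.g. water/smooth can be separated from water/rough (waves).
    """
    assert labels_hi.shape == labels_mid.shape, "maps must be co-registered"
    n_mid = len(SAR_MID)
    return labels_hi * n_mid + labels_mid  # unique id per class pair

def decode(fused_id):
    """Map a fused id back to its readable category pair."""
    n_mid = len(SAR_MID)
    return f"{SAR_HI[fused_id // n_mid]}/{SAR_MID[fused_id % n_mid]}"

# A 2x2 toy scene: water/water over built-up/vegetation,
# with smooth/rough surface states from the mid-resolution map.
hi = np.array([[0, 0], [1, 2]])
mid = np.array([[0, 1], [1, 0]])
fused = fuse_semantics(hi, mid)
print([decode(i) for i in fused.ravel()])
```

The pairing makes the fused map strictly finer-grained than either input: categories that one sensor cannot separate (e.g., calm versus wind-roughened water) become distinct joint classes when the second sensor contributes its label.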

Item URL in elib: https://elib.dlr.de/130280/
Document Type: Conference or Workshop Item (Speech)
Title: Artificial Intelligence for EO Sensor Fusion
Authors (Institution or Email; ORCID iD):
  Dumitru, Corneliu Octavian: Corneliu.Dumitru (at) dlr.de; UNSPECIFIED
  Schwarz, Gottfried: Gottfried.Schwarz (at) dlr.de; UNSPECIFIED
  Grivei, Alexandru: University Politehnica of Bucharest, Romania; UNSPECIFIED
  Datcu, Mihai: Mihai.Datcu (at) dlr.de; UNSPECIFIED
Date: May 2019
Refereed publication: No
Open Access: No
Gold Open Access: No
In ISI Web of Science: No
Keywords: Artificial Intelligence, Sensor Fusion, Earth Observation
Event Title: ESA Living Planet Symposium 2019
Event Location: Milan, Italy
Event Type: International conference
Event Dates: 13-17 May 2019
HGF - Research field: Aeronautics, Space and Transport
HGF - Program: Space
HGF - Program Themes: Earth Observation
DLR - Research area: Raumfahrt
DLR - Program: R EO - Erdbeobachtung
DLR - Research theme (Project): R - Vorhaben hochauflösende Fernerkundungsverfahren
Location: Oberpfaffenhofen
Institutes and Institutions: Remote Sensing Technology Institute > EO Data Science
Deposited By: Karmakar, Chandrabali
Deposited On: 03 Dec 2019 09:36
Last Modified: 04 Dec 2019 13:22
