Dumitru, Corneliu Octavian und Schwarz, Gottfried und Grivei, Alexandru und Datcu, Mihai (2019) Artificial Intelligence for EO Sensor Fusion. ESA Living Planet Symposium 2019, 2019-05-13 - 2019-05-17, Milan, Italy.
This archive cannot provide the full text of this entry.
Official URL: https://lps19.esa.int/NikalWebsitePortal/living-planet-symposium-2019/lps19
Abstract
Currently, the field of data fusion consists of collections of routines and algorithms that can be linked and embedded for various applications. A well-known open-source toolbox is the Orfeo ToolBox, which provides a large number of state-of-the-art algorithms for processing SAR and multispectral images for different applications. Another is Google Earth Engine, which includes a large image database and a number of built-in algorithms for image processing (users can also add their own). An innovative system is CANDELA, an H2020 research and innovation project under grant agreement No. 776193, whose objectives include the fusion of radar and multispectral images at both the semantic and the feature/descriptor level. The first results will be presented during the Living Planet Symposium. For this case, we propose to recognize different target area details in overlapping SAR and multispectral images that complement each other in rapid succession. To this end, we have already selected Sentinel-1 and Sentinel-2 images that can be rectified and co-aligned by publicly available toolbox routines offered by ESA, allowing a straightforward image comparison. In addition, we propose data fusion and joint interpretation. The most important aspects to be considered are:

• The Sentinel-1 C-band constellation, consisting of two radar satellites following each other along the same orbit, delivers Earth surface images from their side-looking radar instruments taken during day or night. The images have a spatial resolution of typically 20x20 m. One can easily discriminate bright reflectors (e.g., edges of buildings) from dark surfaces (e.g., windless water surfaces). Brightness differences between Sentinel-1 pixels can be due to surface roughness or smoothness, edges of buildings, farming practices, etc. The main advantage of Sentinel-1 is that it offers virtually cloud-free images.

• The Sentinel-2 twin satellite constellation delivers Earth surface images from its multicolour nadir-looking optical imagers taken during daylight. These images have a spatial resolution of typically 10x10 m. Brightness differences among the colour bands can be analysed to discover land cover characteristics such as urbanisation, vegetation properties, or current sea surface dynamics; the infrared bands are used primarily for vegetation properties. In contrast to Sentinel-1, Sentinel-2 images are affected by clouds and other weather conditions.

• The fusion of Sentinel-1 and Sentinel-2 data shall allow a joint interpretation of radar backscatter and optical reflectance data. As ESA has undertaken considerable effort in the absolute calibration of all Sentinel instruments, we expect the discovery of many hitherto unseen phenomena with well-defined confidence levels.

While we are accustomed to image fusion as a radiometric combination of multispectral images, a comparably mature level of semantic fusion of SAR images has not been reached yet. To remedy this situation, we propose a semantic fusion concept for SAR images, in which we combine the semantic image content of two data sets with different characteristics (e.g., TerraSAR-X and Sentinel-1). In our case, we observed several coastal areas in Europe with a high-resolution and a mid-resolution spaceborne instrument and combined their information content. By exploiting the specific imaging details and the retrievable semantic categories of the two image types, we obtained semantically fused image classification maps that allow us to differentiate several coastal surface categories. To verify the classification results, we will compare the SAR images with multispectral satellite images and in-situ data.
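The semantic fusion of two co-registered classification maps described above can be sketched as a per-pixel combination of label pairs. The following minimal Python sketch is illustrative only: the category codes (water/urban/vegetation for a high-resolution SAR map, smooth/rough for a mid-resolution Sentinel-1 map) and the fusion rule table are assumptions for demonstration, not the authors' actual category scheme.

```python
import numpy as np

# Hypothetical per-pixel semantic categories from two co-registered maps:
# map_hr from a high-resolution SAR (e.g., TerraSAR-X), map_mr from Sentinel-1.
HR_WATER, HR_URBAN, HR_VEGETATION = 0, 1, 2
MR_SMOOTH, MR_ROUGH = 0, 1

# Illustrative fusion rule: each (high-res, mid-res) label pair maps to a
# refined coastal-surface category in the fused map.
FUSION_TABLE = {
    (HR_WATER, MR_SMOOTH): "calm water",
    (HR_WATER, MR_ROUGH): "rough sea / surf",
    (HR_URBAN, MR_SMOOTH): "low-density built-up",
    (HR_URBAN, MR_ROUGH): "dense built-up",
    (HR_VEGETATION, MR_SMOOTH): "low vegetation",
    (HR_VEGETATION, MR_ROUGH): "forest / shrub",
}

def fuse_semantic_maps(map_hr: np.ndarray, map_mr: np.ndarray) -> np.ndarray:
    """Combine two co-aligned per-pixel label maps into one fused label map."""
    assert map_hr.shape == map_mr.shape, "maps must be co-registered"
    fused = np.empty(map_hr.shape, dtype=object)
    for (hr, mr), label in FUSION_TABLE.items():
        fused[(map_hr == hr) & (map_mr == mr)] = label
    return fused

# Tiny 2x2 example of two co-aligned label maps
map_hr = np.array([[HR_WATER, HR_URBAN], [HR_VEGETATION, HR_WATER]])
map_mr = np.array([[MR_SMOOTH, MR_ROUGH], [MR_ROUGH, MR_ROUGH]])
print(fuse_semantic_maps(map_hr, map_mr))
```

In practice the two maps would first be rectified and resampled to a common grid (e.g., with ESA's Sentinel toolboxes), since the rule table only makes sense for pixel-aligned inputs.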
| elib URL of this entry: | https://elib.dlr.de/130280/ |
|---|---|
| Document type: | Conference contribution (talk) |
| Title: | Artificial Intelligence for EO Sensor Fusion |
| Authors: | |
| Date: | May 2019 |
| Refereed publication: | No |
| Open Access: | No |
| Gold Open Access: | No |
| In SCOPUS: | No |
| In ISI Web of Science: | No |
| Status: | published |
| Keywords: | Artificial Intelligence, Sensor Fusion, Earth Observation |
| Event title: | ESA Living Planet Symposium 2019 |
| Event location: | Milan, Italy |
| Event type: | international conference |
| Event start: | 13 May 2019 |
| Event end: | 17 May 2019 |
| HGF research field: | Aeronautics, Space and Transport |
| HGF program: | Space |
| HGF program topic: | Earth Observation |
| DLR focus area: | Space |
| DLR research area: | R EO - Earth Observation |
| DLR subtopic (project): | R - high-resolution remote sensing methods project (old) |
| Location: | Oberpfaffenhofen |
| Institutes & facilities: | Institut für Methodik der Fernerkundung > EO Data Science |
| Deposited by: | Karmakar, Chandrabali |
| Deposited on: | 03 Dec 2019 09:36 |
| Last modified: | 24 Apr 2024 20:33 |