
Adversarial Occlusion Augmentation: Guided Occlusions for Improving Object Detector

Liu, Siyuan (2020) Adversarial Occlusion Augmentation: Guided Occlusions for Improving Object Detector. DLR Internal Report DLR-IB-RM-OP-2020-52. Master's thesis, Technische Universität München. 92 pp.

PDF - Accessible within DLR only (86 MB)

Abstract

In recent years, deep-learned object detectors have achieved great success in the computer vision domain and reached promising results on benchmark datasets. In real-world applications (e.g. robotics), however, they still face problems, mainly due to the gap between training data recorded under lab conditions and complex, unknown test environments. Besides the well-known hunger for data, one of these challenges is dealing with occlusions, which lead to missed and/or false detections and thus degrade performance. Although some works on occlusion handling already exist [5, 40], most of them focus on image classification and are evaluated on relatively simple datasets. In this work, a novel Adversarial Occlusion Augmentation approach is proposed that focuses on improving object detector performance in highly cluttered environments while keeping the additional data-processing effort low. Inspired by Generative Adversarial Networks (GANs) [12], in which generator and discriminator are optimized simultaneously, an adversarial neural network with a GAN-like architecture is designed to introduce guided occlusions into a synthetically generated dataset. A so-called Occlusion Generating Network (OccNet), representing the generator, and an object detector acting as discriminator are trained jointly in a competing manner. This enables OccNet to learn to generate samples that are challenging due to occlusions, while the object detector is further improved by the enhanced training dataset. The occlusion types are modified step by step, from artificial to real-world occlusions. Integrated with the Dual Attention Mechanism [7], OccNet is able to search for optimal occlusion areas more efficiently. Furthermore, owing to the joint training strategy, the generated occlusions progress from simple to hard, which lets the detector adapt epoch by epoch. The overall detection performance of an off-the-shelf object detector can therefore be improved by learning from such an easy-to-hard occlusion schema. Experiments on the T-LESS dataset [15] show that the adversarial occlusion augmentation approach effectively improves the performance of a deep object detector without additional data effort.
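
The abstract describes a GAN-style joint optimization: OccNet plays the generator, the detector plays the discriminator, and occlusion hardness grows over the epochs. The following PyTorch sketch illustrates only that training pattern under stated assumptions; the OccNet layers, the soft multiplicative masking, the classification-style ToyDetector stand-in, and the linear hardness schedule are illustrative inventions, not the architecture or losses used in the thesis.

import torch
import torch.nn as nn

class OccNet(nn.Module):
    # Hypothetical occlusion generator: predicts a per-pixel mask in [0, 1].
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1), nn.Sigmoid())

    def forward(self, x):
        return self.body(x)

class ToyDetector(nn.Module):
    # Stand-in for a real detector; anything exposing a scalar training
    # loss (e.g. the summed losses of a two-stage detector) could replace it.
    def __init__(self, num_classes=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(8, num_classes))

    def loss(self, images, targets):
        return nn.functional.cross_entropy(self.net(images), targets)

def joint_step(occnet, detector, images, targets, opt_g, opt_d, hardness):
    # Generator step: occlude where it hurts the detector most, i.e.
    # maximize the detection loss by minimizing its negative.
    occluded = images * (1.0 - hardness * occnet(images))
    g_loss = -detector.loss(occluded, targets)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

    # Detector step: minimize the loss on freshly occluded samples.
    with torch.no_grad():
        occluded = images * (1.0 - hardness * occnet(images))
    d_loss = detector.loss(occluded, targets)
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    return -g_loss.item(), d_loss.item()

occnet, det = OccNet(), ToyDetector()
opt_g = torch.optim.Adam(occnet.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(det.parameters(), lr=1e-4)
images = torch.rand(4, 3, 64, 64)
targets = torch.randint(0, 10, (4,))
for epoch in range(3):
    # Easy-to-hard schedule: occlusion strength grows with the epoch index.
    hardness = min(1.0, 0.3 + 0.35 * epoch)
    joint_step(occnet, det, images, targets, opt_g, opt_d, hardness)

In this sketch the generator ascends the detector's own training loss, so the occlusions concentrate on regions the detector relies on; a faithful implementation would plug in a real detection loss and the attention-guided occlusion search described above.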

elib URL of this record: https://elib.dlr.de/139117/
Document type: Report series (DLR Internal Report, Master's thesis)
Title: Adversarial Occlusion Augmentation: Guided Occlusions for Improving Object Detector
Authors: Liu, Siyuan (institution or e-mail address, ORCID iD, and ORCID Put Code: not specified)
Date: 14 April 2020
Refereed publication: No
Open Access: No
Number of pages: 92
Status: published
Keywords: deep learning; object detection; augmentation
Institution: Technische Universität München
Department: Electrical Engineering and Information Technology
HGF - Research field: Aeronautics, Space and Transport
HGF - Program: Space
HGF - Program theme: Space Systems Technology
DLR - Focus area: Space
DLR - Research area: R SY - Space Systems Technology
DLR - Sub-area (project): R - Project Multisensorielle Weltmodellierung (old)
Location: Oberpfaffenhofen
Institutes & facilities: Institute of Robotics and Mechatronics (since 2013) > Perception and Cognition
Deposited by: Durner, Maximilian
Deposited on: 07 Dec 2020 11:01
Last modified: 16 Dec 2020 14:22

