Wischow, Maik (2024) Camera Self-Health-Maintenance by means of Sensor Artificial Intelligence. Dissertation, Technische Universität Berlin. doi: 10.14279/depositonce-19778.
PDF, 29 MB
Official URL: https://depositonce.tu-berlin.de/items/26e0a2bb-250a-4708-82b8-be4e92e861de
Abstract
Autonomous machines require increasing robustness and reliability to meet the demands of modern tasks. These requirements especially apply to cameras onboard such machines, as they are the predominant sensors acquiring information about the environment to support decision-making and actuation. Hence, the cameras must maintain their own functionality. This poses a significant challenge, primarily driven by the variety of existing cameras, the vast number of potential application scenarios, and the limited machine resources, all while demanding real-time performance. Existing solutions are typically tailored to specific problems or detached from the downstream computer vision tasks of the machines, which, however, determine the requirements on the quality of the produced camera images. This thesis presents a camera self-health-maintenance framework to bridge this gap. The approach combines generalized condition monitoring with a task-oriented decision & control unit. The monitoring is based on novel learning-based blur and noise estimators that incorporate physical knowledge about the camera to increase consistency and robustness. In particular, the incorporation of camera metadata enables the system to disambiguate the contributions of different noise processes within the camera. Only in this way can the decision & control unit initiate appropriate countermeasures when necessary. To this end, camera parameters are readjusted based on an empirical image task analysis to optimize performance in any situation. The framework is evaluated on synthetic and real datasets from transportation and robotic scenarios in terms of accuracy, robustness, and real-time capability. First, the blur and noise estimators are examined, and two extensions are analyzed that recover the estimation of combined blur/noise corruptions and reduce estimation uncertainties, respectively. Second, the effect of an acquired image and the camera's metadata on noise source estimation is investigated. This method is further demonstrated on the detection of mismatches between both inputs (image and camera metadata) to quantify unexpected noise, such as that caused by camera defects. Lastly, the framework is implemented and verified on a real robot system. This demonstration shows promising results for employing the framework on arbitrary mobile machines in unknown environments. In particular, the proposed framework outperforms standard camera parameter controllers. Yet, the results also highlight current limitations that require framework extensions in future studies, such as the application to complex non-linear motion blur and scenes with high dynamic ranges of light intensity.
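The abstract mentions that camera metadata is used to disambiguate noise contributions and to detect mismatches between an image and its metadata. As a purely illustrative sketch (not the implementation described in the thesis), the snippet below applies the standard Poisson-Gaussian sensor-noise model, in which pixel variance is approximately gain · mean signal + read-noise variance, to flag image patches whose measured noise exceeds what the metadata predicts. All function names, parameters, and thresholds are hypothetical.

```python
# Illustrative sketch of metadata-based noise plausibility checking (hypothetical
# names and parameters, not the thesis's estimator).
import numpy as np

def expected_variance(mean_signal, gain, read_noise_std):
    """Expected pixel variance under a Poisson-Gaussian model: gain * mu + sigma_read^2."""
    return gain * mean_signal + read_noise_std ** 2

def flag_noise_mismatch(image_patch, gain, read_noise_std, tolerance=2.0):
    """Flag a flat patch whose measured variance exceeds the metadata-predicted variance."""
    mu = float(np.mean(image_patch))
    measured_var = float(np.var(image_patch))
    predicted_var = expected_variance(mu, gain, read_noise_std)
    return measured_var > tolerance * predicted_var, measured_var, predicted_var

# Example: a roughly flat patch acquired at a known analog gain; a defect or an
# unmodeled noise source would push the measured variance well above the prediction.
patch = np.random.normal(loc=100.0, scale=12.0, size=(32, 32))
anomalous, measured, predicted = flag_noise_mismatch(patch, gain=1.5, read_noise_std=3.0)
print(f"measured={measured:.1f}, predicted={predicted:.1f}, anomalous={anomalous}")
```

Under these assumptions, a patch that is consistent with the camera's metadata passes the check, while unexpectedly strong noise is flagged and could trigger a countermeasure in a decision & control stage.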
| elib URL of the record: | https://elib.dlr.de/203590/ |
|---|---|
| Document type: | University thesis (Dissertation) |
| Title: | Camera Self-Health-Maintenance by means of Sensor Artificial Intelligence |
| Authors: | Wischow, Maik |
| Date: | 1 February 2024 |
| Open Access: | Yes |
| DOI: | 10.14279/depositonce-19778 |
| Number of pages: | 202 |
| Status: | published |
| Keywords: | sensor artificial intelligence; machine learning; condition monitoring; camera physics; blur estimation; noise estimation |
| Institution: | Technische Universität Berlin |
| Department: | Institut für Technische Informatik und Mikroelektronik, Fachgebiet Robotic Interactive Perception |
| HGF - Research field: | no assignment |
| HGF - Programme: | no assignment |
| HGF - Programme topic: | no assignment |
| DLR - Focus area: | Digitalisation |
| DLR - Research area: | D IAS - Innovative autonome Systeme |
| DLR - Subarea (project): | D - SKIAS |
| Location: | Berlin-Adlershof |
| Institutes & facilities: | Institut für Optische Sensorsysteme > Echtzeit-Datenprozessierung; Institut für Optische Sensorsysteme |
| Deposited by: | Irmisch, Patrick |
| Deposited on: | 09 Apr 2024 08:53 |
| Last modified: | 09 Apr 2024 08:53 |