Triggered trail camera images and machine learning based computer vision as alternative to established visitor monitoring approaches?

Mayer, Marius and Staab, Jeroen and Udas, Erica and Taubenböck, Hannes (2021) Triggered trail camera images and machine learning based computer vision as alternative to established visitor monitoring approaches? 10th International Conference on Monitoring and Management of Visitors in Recreational and Protected Areas, 2021-08-16 - 2021-08-19, Lillehammer.

The full text of this entry is not available from this archive.

Abstract

Visitor monitoring is crucial for many management and valuation tasks in protected areas and other recreational landscapes. Its core data are visitor numbers, which are costly to estimate in the absence of entry fees. Camera-based approaches have the potential to be both accurate and capable of delivering comprehensive data about visitor numbers, types and activities. So far, however, camera-based visitor monitoring has been costly due to time-consuming manual image evaluation (Miller et al. 2017). To overcome this limitation, we deployed a convolutional neural network (CNN) and compared its hourly counts against existing visitor counting methods such as manual in-situ counting, a pressure sensor, and manual camera image evaluations.

The study site is the Eldena Forest Nature Reserve (EFNR), located at the southeastern rim of the city of Greifswald, Germany. The forest is owned by Greifswald University and covers 411 ha. Given its close vicinity to the city, EFNR is frequently visited; exact visitor numbers, however, had never been estimated (Udas et al. 2018). There are seven major entry points into EFNR and, according to local foresters, entrances A, B, C and D are used most because of their proximity to residential areas. The methodology is explained in detail in Staab et al. (2021).

For systematic visitor monitoring in EFNR, three different visitor counting methods were deployed at different entrances in 2015. At all seven entrances, manual in-situ visitor counting was carried out as the fundamental benchmark, following a counting method used in many protected areas (Mayer et al. 2010). To determine the annual cumulative number of visits in EFNR, the data from the sampled days were extrapolated; the extrapolation procedure accounted for seasonality, weekends/weekdays and the weather situation and is a standard procedure used in many studies (Mayer et al. 2009). In addition, a pressure sensor was installed at one of the most frequented entrances, entrance A, to capture the seasonal variation of visitation, and triggered trail cameras were installed at entrances B, C, D and E. The camera images were first evaluated manually, using three semantics of image interpretation to estimate visitor numbers; the number of visitors entering the EFNR was documented in a spreadsheet on an hourly basis for each day.

Regarding the automated image analysis, advanced computer vision technologies such as deep CNNs can detect pedestrians with very high accuracy. We used the pre-trained object detection framework You Only Look Once (YOLO), developed by Redmon et al. (2016) and Redmon & Farhadi (2017, 2018), which, as its name indicates, is very fast at grasping an image's content. Thanks to its versatile training data, the pre-trained model detects several object classes in an image, among them persons, bicycles, backpacks and dogs – categories of special interest for characterizing visitors in recreational landscapes.
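As an illustration of the automated evaluation step described above, the following is a minimal sketch, not the authors' implementation: it uses the open-source ultralytics package as a stand-in for the original Darknet YOLO, and the model weights, image folder and filename-based timestamps are illustrative assumptions.

```python
# Minimal sketch: count persons, bicycles, backpacks and dogs per hour in
# triggered trail camera images. Uses the ultralytics package as a stand-in
# for the original Darknet YOLO; paths and filename format are assumptions.
from collections import Counter, defaultdict
from datetime import datetime
from pathlib import Path

from ultralytics import YOLO  # pip install ultralytics

TARGET_CLASSES = {"person", "bicycle", "backpack", "dog"}

model = YOLO("yolov8n.pt")  # any pre-trained COCO detection model would do

def detections(image_path: Path) -> Counter:
    """Run the detector on one image and count the target classes."""
    result = model(str(image_path), verbose=False)[0]
    names = [model.names[int(c)] for c in result.boxes.cls]
    return Counter(n for n in names if n in TARGET_CLASSES)

def hourly_counts(image_dir: Path) -> dict:
    """Aggregate detections per hour, assuming filenames like
    'entranceB_2015-07-04_14-32-10.jpg' (an illustrative convention)."""
    per_hour = defaultdict(Counter)
    for img in sorted(image_dir.glob("*.jpg")):
        stamp = datetime.strptime(img.stem.split("_", 1)[1], "%Y-%m-%d_%H-%M-%S")
        per_hour[stamp.replace(minute=0, second=0)] += detections(img)
    return per_hour

if __name__ == "__main__":
    for hour, counts in sorted(hourly_counts(Path("images/entranceB")).items()):
        print(hour, dict(counts))
```

Translating per-image detections into per-hour visit counts (for example, avoiding double counts when one visitor triggers several consecutive images) is part of the image-interpretation semantics mentioned above and is deliberately left out of this sketch.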
We directly compared the results of all counting approaches. For each entrance, the raw hourly results per counting approach were set against each other; the statistical deviations were measured using Pearson's correlation and a linear model without intercept. At entrance A, where manual in-situ observations were conducted next to the pressure sensor on five days, we found a strong and highly significant correlation (r = 0.783, p < 0.001). The respective linear model, fitted through the 50 hours of simultaneous observations, further revealed that the automated approach accounts for 88.4% of the visits counted by the manual in-situ observer (adj. R² = 0.799, p < 0.001). Of the other two entrances where manual in-situ counting was conducted alongside ongoing camera observations, only the manual and YOLO camera evaluations at entrance B correlate significantly with the corresponding 44 hours of manual in-situ observations; at entrance C the sample size was too low. However, the regression models comparing the two camera evaluation approaches against manual in-situ observations are significant at both entrances B and C. When comparing the automated and the manual image evaluations of entrances B-D against each other, both approaches strongly correlate at very high significance levels (mean r = 0.818, p < 0.001). Furthermore, the results of the automated image evaluation also often correlate significantly, though with low to medium strength, with the in-situ counts and the pressure sensor at the other entrances. This shows that YOLO is able to reflect the visitation trends over the year even though the observations did not always take place at exactly the same locations.

Thus, the results show that the CNN derived visitor numbers comparable to those of the other counting approaches, both in terms of visitation patterns and numbers of visits. We therefore conclude that it is a fast and reliable method that could be used in protected areas as well as in a much wider array of visitor counting settings in other recreational landscapes. The approach also allows dogs and recreational equipment such as backpacks and bicycles to be counted automatically. While the accuracy of these categories has not been assessed yet, we expect this new monitoring approach to help manage particular user groups and to avoid conflicts in such recreational areas. Nevertheless, camera installation takes time and effort and requires regular maintenance (batteries, storage cards), and the automated evaluation requires specific hardware and expertise as well. Alongside ethical and legal concerns, other practical issues are theft, vandalism and the short lifespan of batteries, especially in the winter season.
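For the statistical comparison of two hourly counting series (Pearson's correlation and a linear model without intercept), a minimal sketch using off-the-shelf scipy and statsmodels tooling could look as follows; the numbers are placeholders, not the study's data.

```python
# Minimal sketch: compare two hourly counting series with Pearson's r and a
# zero-intercept linear model, as described in the abstract. The data below
# are placeholders, not the study's measurements.
import numpy as np
from scipy.stats import pearsonr
import statsmodels.api as sm

# Hourly visits from two methods at the same entrance (illustrative values).
manual_insitu = np.array([3, 7, 12, 5, 0, 9, 14, 6, 2, 11], dtype=float)
automated = np.array([2, 6, 11, 4, 1, 8, 12, 5, 2, 10], dtype=float)

# Pearson correlation between the two series.
r, p_r = pearsonr(manual_insitu, automated)

# Linear model without intercept: no constant is added, so the regression
# is forced through the origin.
ols = sm.OLS(automated, manual_insitu).fit()

print(f"Pearson r = {r:.3f} (p = {p_r:.4f})")
print(f"slope = {ols.params[0]:.3f}, adj. R^2 = {ols.rsquared_adj:.3f}, "
      f"p = {ols.f_pvalue:.4f}")
```

Fitting without an intercept forces the regression through the origin, so the slope can be read directly as the share of manually counted visits captured by the automated method, which appears to be how the 88.4% figure above is obtained.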

elib URL of the entry: https://elib.dlr.de/143908/
Document type: Conference contribution (lecture)
Title: Triggered trail camera images and machine learning based computer vision as alternative to established visitor monitoring approaches?
Authors:
Author | Institution or e-mail address | Author ORCID iD | ORCID Put Code
Mayer, Marius | Marius.Mayer (at) uibk.ac.at | NOT SPECIFIED | NOT SPECIFIED
Staab, Jeroen | Jeroen.Staab (at) dlr.de | https://orcid.org/0000-0002-7342-4440 | NOT SPECIFIED
Udas, Erica | Erica.Udas (at) icimod.org | NOT SPECIFIED | NOT SPECIFIED
Taubenböck, Hannes | Hannes.Taubenboeck (at) dlr.de | https://orcid.org/0000-0003-4360-9126 | NOT SPECIFIED
Date: August 2021
Refereed publication: No
Open Access: No
Gold Open Access: No
In SCOPUS: No
In ISI Web of Science: No
Status: published
Keywords: visitor monitoring; computer vision; convolutional neural network; camera; protected areas
Event title: 10th International Conference on Monitoring and Management of Visitors in Recreational and Protected Areas
Event location: Lillehammer
Event type: international conference
Event start: 16 August 2021
Event end: 19 August 2021
HGF research field: Aeronautics, Space and Transport
HGF programme: Space
HGF programme theme: Earth Observation
DLR focus area: Space
DLR research area: R EO - Earth Observation
DLR subject area (project): R - Remote Sensing and Geo Research
Location: Oberpfaffenhofen
Institutes & facilities: German Remote Sensing Data Center > Geo-Risks and Civil Security
Deposited by: Staab, Jeroen
Deposited on: 21 Sep 2021 13:07
Last modified: 24 Apr 2024 20:43
