GeoWerkstatt Project of the Month, June 2024
Project: Detection and reconstruction of vehicles from UAV aerial images
Researchers: Sara El Amrani, Franz Rottensteiner
Project idea: Vehicles can be detected, reconstructed, and ultimately tracked in overhead aerial images to obtain an overview of an intersection and thus contribute to road safety.
This project investigates the potential contribution of aerial images captured by a UAV to the collaborative positioning of vehicles. A UAV takes aerial images of a traffic area, e.g., an intersection. The images show vehicles that are assumed to be able to communicate with each other and with the UAV. The vehicles are equipped with stereo cameras and can therefore position themselves relative to each other. However, due to restricted visibility, these relative poses may be of poor geometric accuracy. The aerial images can support the block geometry and thus the determination of the vehicle positions, as such an image provides a good overview of the whole scene and shows significantly fewer occlusions.
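To make the idea concrete, the following is a minimal 2-D least-squares sketch, not the project's actual adjustment: noisy vehicle-to-vehicle displacement vectors stand in for the stereo-based relative poses, and a few accurate per-vehicle positions stand in for the observations derived from the UAV image. All coordinates, pairings, and standard deviations are invented for illustration.

```python
import numpy as np

# Toy scene: three vehicles with unknown planar positions.
# Unknown vector x = [x0, y0, x1, y1, x2, y2].
n = 3

# Relative observations (vehicle-to-vehicle, from stereo cameras):
# each row states p_j - p_i = (dx, dy), with poor accuracy due to
# restricted visibility between the vehicles.
rel_obs = [  # (i, j, dx, dy)
    (0, 1, 5.1, 0.2),
    (1, 2, 4.8, -0.3),
    (0, 2, 10.2, 0.1),
]
sigma_rel = 0.5

# Absolute observations (vehicle positions from the UAV image,
# assumed transformed to object space): good overview geometry,
# fewer occlusions, hence a smaller standard deviation.
abs_obs = [  # (i, x, y)
    (0, 0.0, 0.0),
    (2, 10.0, 0.0),
]
sigma_abs = 0.1

rows, rhs, weights = [], [], []
for i, j, dx, dy in rel_obs:
    for axis, d in enumerate((dx, dy)):
        a = np.zeros(2 * n)
        a[2 * j + axis] = 1.0
        a[2 * i + axis] = -1.0
        rows.append(a); rhs.append(d); weights.append(1.0 / sigma_rel**2)
for i, x, y in abs_obs:
    for axis, v in enumerate((x, y)):
        a = np.zeros(2 * n)
        a[2 * i + axis] = 1.0
        rows.append(a); rhs.append(v); weights.append(1.0 / sigma_abs**2)

A = np.array(rows)
b = np.array(rhs)
W = np.diag(weights)

# Weighted least-squares estimate: x_hat = (A^T W A)^-1 A^T W b.
# The accurate UAV observations anchor the noisy relative network.
x_hat = np.linalg.solve(A.T @ W @ A, A.T @ W @ b)
print(x_hat.reshape(n, 2))
```

Because the UAV observations carry much larger weights, they effectively fix the network in object space, while the relative observations still constrain the vehicle in between.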
For this purpose, vehicles visible in a UAV image are detected automatically with a neural network. The method is extended to estimate the pose, shape, and type of the detected vehicles. On the one hand, the vertices of the reconstructed vehicle models (i.e., the intersections of the lines visible in the images) can serve as tie points for positioning the vehicles in object space; on the other hand, the parameters of the vehicle models can in this way be estimated consistently from all available information (stereo images from the vehicles, UAV images). Compared to the vehicle reconstruction method already developed as part of i.c.sens, the challenge lies in the lack of stereo information and in the unfavourable viewing direction from above, which hampers the use of pre-trained classifiers. For evaluation, real data is recorded at road intersections in the context of the central experimentation facility, using a UAV and several vehicles equipped with stereo cameras.
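As a rough illustration of the detection step only (not the project's own network), the sketch below applies an off-the-shelf Faster R-CNN from torchvision to a single image. The file name uav_frame.jpg and the score threshold are placeholders, and, as noted above, such a model pretrained on ground-level COCO imagery would need fine-tuning on nadir UAV views to perform well.

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# COCO class indices used by torchvision's pretrained detectors:
# 3 = car, 6 = bus, 8 = truck.
VEHICLE_CLASSES = {3, 6, 8}

def detect_vehicles(image_path, score_thresh=0.5):
    """Return (bounding box, score) pairs for vehicles in one image."""
    # Pretrained on COCO (ground-level views); fine-tuning on overhead
    # imagery would be required for reliable nadir detection.
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    model.eval()
    img = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        pred = model([img])[0]
    return [(box, float(score)) for box, label, score
            in zip(pred["boxes"], pred["labels"], pred["scores"])
            if label.item() in VEHICLE_CLASSES and score >= score_thresh]

# Hypothetical usage:
# detections = detect_vehicles("uav_frame.jpg")
```

The detected boxes would then feed the subsequent model-fitting step, which estimates pose, shape, and type and yields the model vertices used as tie points.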
This work is supported by the DFG as part of the Research Training Group i.c.sens (RTG 2159).