Target Geolocation Method Based on Monocular Vision

2025-99-0021

10/17/2025

Abstract
In intelligent traffic systems (ITS), roadside sensing can capture the movement status of objects in a traffic scene in real time from a global perspective, which is of great value for traffic flow optimization, accident early warning, and post-accident rescue. Accurate target positioning is a key step in realizing these functions: it helps traffic management departments grasp traffic conditions in time and provides the basis for rescue personnel to respond quickly when an accident occurs, minimizing the resulting damage. This paper therefore proposes a method for acquiring the Global Positioning System (GPS) coordinates of objects using monocular surveillance cameras installed at the roadside. By combining a target detection algorithm with coordinate transformations, and taking into account the camera's installation state and intrinsic parameters, the pixel positions of objects of interest are converted to GPS coordinates in the Global Navigation Satellite System (GNSS) frame by two different methods, selected according to the conditions known in each situation. To evaluate the accuracy and stability of the method in practical applications, several sets of experiments in real scenes are conducted. The results show that the latitude and longitude of objects in the monitored scene can be estimated by our method at different distances from the camera. Comparative analysis with other localization methods further demonstrates its accuracy, feasibility, and superiority.
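The paper itself details the two conversion methods; as a rough illustration of the general pixel-to-GPS pipeline the abstract describes, the following is a minimal sketch (not the authors' implementation) assuming a pinhole camera model, a flat road plane, and a local east-north-up (ENU) frame anchored at the camera's known GPS position. All function names and parameter values (intrinsics, pitch angle, mounting height) are hypothetical placeholders.

```python
import numpy as np

EARTH_RADIUS = 6378137.0  # WGS-84 equatorial radius, metres


def pixel_to_ground(u, v, K, R, t):
    """Intersect the viewing ray of pixel (u, v) with the road plane z = 0.

    K : 3x3 intrinsic matrix; R : camera-to-ENU rotation; t : camera
    position in the local ENU frame (z = mounting height above the road).
    """
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # ray in camera frame
    ray_enu = R @ ray_cam                               # rotate into ENU
    s = -t[2] / ray_enu[2]                              # scale to reach z = 0
    return t + s * ray_enu                              # point on the road


def enu_to_gps(east, north, lat0, lon0):
    """Convert a small ENU offset (metres) near (lat0, lon0) to lat/lon."""
    dlat = north / EARTH_RADIUS
    dlon = east / (EARTH_RADIUS * np.cos(np.radians(lat0)))
    return lat0 + np.degrees(dlat), lon0 + np.degrees(dlon)


# Made-up example: camera 6 m above the road, facing north, pitched 20 deg down.
K = np.array([[1200.0, 0.0, 960.0],
              [0.0, 1200.0, 540.0],
              [0.0, 0.0, 1.0]])
pitch = np.radians(20.0)
R = np.array([[1.0, 0.0, 0.0],
              [0.0, -np.sin(pitch), np.cos(pitch)],
              [0.0, -np.cos(pitch), -np.sin(pitch)]])
t = np.array([0.0, 0.0, 6.0])

ground_pt = pixel_to_ground(1000, 700, K, R, t)          # e.g. a detection's foot point
lat, lon = enu_to_gps(ground_pt[0], ground_pt[1], 31.2304, 121.4737)
print(f"Estimated target position: {lat:.6f}, {lon:.6f}")
```

In practice the pixel fed into such a back-projection would come from the target detector (for example, the bottom-center of a bounding box), and the small-offset ENU-to-latitude/longitude approximation holds only for the short ranges typical of a roadside camera's field of view.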
Details
DOI
https://doi.org/10.4271/2025-99-0021
Pages
8
Citation
Zhang, N., Lu, M., Chen, Z., Zhang, F. et al., "Target Geolocation Method Based on Monocular Vision," SAE Technical Paper 2025-99-0021, 2025, https://doi.org/10.4271/2025-99-0021.
Additional Details
Publisher
SAE International
Published
Oct 17, 2025
Product Code
2025-99-0021
Content Type
Technical Paper
Language
English