A Localization System for Autonomous Driving: Global and Local Location Matching Based on Mono-SLAM
ISSN: 0148-7191, e-ISSN: 2688-3627
Published August 07, 2018 by SAE International in United States
The SLAM (Simultaneous Localization and Mapping) technique has been extended from robotics to autonomous vehicles for positioning. However, SLAM yields only a position relative to the starting point, not the vehicle's global position. To address this, a fast and accurate system is proposed that obtains both the local and the global position of the vehicle based on mono-SLAM, which performs SLAM with a monocular camera at low cost and power consumption. First, a rough latitude and longitude of the current position is obtained from ordinary GPS without a differential signal. Then, mono-SLAM runs on consecutive video frames to produce the local pose and trajectory, whose accuracy is further improved with IMU measurements. Next, a map tile centered on the rough GPS position is downloaded from OpenStreetMap. Finally, a chamfer-matching search over the downloaded map finds the road segment that best matches the constructed trajectory. The global position of the vehicle is thereby obtained, and the accumulated error is reduced by repeating the search periodically. In tests, the proposed system outperformed current approaches in global localization, with an error below 5 meters, demonstrating the potential that mono-SLAM brings to the global localization task.
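The trajectory-to-map matching step can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the brute-force search over candidate poses, and the toy road geometry are all assumptions, and a practical chamfer matcher would precompute a distance transform over the rasterized OpenStreetMap roads rather than scan road points per query.

```python
import math

def chamfer_score(traj, road_pts, theta, tx, ty):
    """Mean nearest-neighbour distance from the rigidly transformed
    SLAM trajectory to the map road points (lower = better match)."""
    c, s = math.cos(theta), math.sin(theta)
    total = 0.0
    for x, y in traj:
        # rigid transform of a SLAM-frame point into the map frame
        mx = c * x - s * y + tx
        my = s * x + c * y + ty
        total += min(math.hypot(mx - rx, my - ry) for rx, ry in road_pts)
    return total / len(traj)

def match_trajectory(traj, road_pts, candidates):
    """Exhaustive search over candidate (theta, tx, ty) poses;
    returns the pose with the lowest chamfer score."""
    return min(candidates, key=lambda p: chamfer_score(traj, road_pts, *p))

# Toy example: an east-west road in the map frame, and a trajectory
# recorded in a SLAM frame rotated 90 degrees relative to the map.
road = [(float(i), 0.0) for i in range(20)]
traj = [(0.0, float(i)) for i in range(5)]  # points along SLAM +y axis
candidates = [(t, 5.0, 0.0) for t in (0.0, -math.pi / 2, math.pi / 2)]
best = match_trajectory(traj, road, candidates)
```

In this toy setup the pose with `theta = -pi/2` maps the trajectory exactly onto the road, so its chamfer score is (numerically) zero and it wins the search; cyclical re-matching, as described above, would repeat this search to bound the accumulated SLAM drift.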
Citation: Xu, Z., Chen, S., Bai, J., Huang, L. et al., "A Localization System for Autonomous Driving: Global and Local Location Matching Based on Mono-SLAM," SAE Technical Paper 2018-01-1610, 2018, https://doi.org/10.4271/2018-01-1610.
- Thrun, S., “Toward Robotic Cars,” Communications of the ACM 53(4):99-106, 2010.
- Levinson, J., Askeland, J., Becker, J., Dolson, J., Held, D., Kammel, S., Kolter, J., Langer, D., Pink, O., Pratt, Y., et al., “Towards Fully Autonomous Driving: Systems and Algorithms,” in IEEE Intelligent Vehicles Symposium, 2011.
- Smith, R., Self, M., and Cheeseman, P., “Estimating Uncertain Spatial Relationships in Robotics,” Machine Intelligence & Pattern Recognition 5(5):435-461, 1988.
- Davison, A.J., Reid, I.D., Molton, N.D. et al., “MonoSLAM: Real-Time Single Camera SLAM,” IEEE Transactions on Pattern Analysis & Machine Intelligence 29(6):1052-1067, 2007.
- Klein, G. and Murray, D., “Parallel Tracking and Mapping for Small AR Workspaces,” IEEE and ACM International Symposium on Mixed and Augmented Reality, 2008, 1-10. IEEE.
- Newcombe, R.A., Lovegrove, S.J., and Davison, A.J., “DTAM: Dense Tracking and Mapping in Real-Time,” International Conference on Computer Vision, 2011, 2320-2327. IEEE Computer Society.
- Engel, J., Schöps, T., and Cremers, D., “LSD-SLAM: Large-Scale Direct Monocular SLAM,” European Conference on Computer Vision, Vol. 8690, 2014, 834-849.
- Engel, J., Koltun, V., and Cremers, D., “Direct Sparse Odometry,” IEEE Transactions on Pattern Analysis & Machine Intelligence 99:1-1, 2017.
- Mur-Artal, R., Montiel, J.M.M., and Tardós, J.D., “ORB-SLAM: A Versatile and Accurate Monocular SLAM System,” IEEE Transactions on Robotics 31(5):1147-1163, 2015.
- Mur-Artal, R. and Tardós, J.D., “Visual-Inertial Monocular SLAM with Map Reuse,” IEEE Robotics & Automation Letters 2(2):796-803, 2017.
- Forster, C., Carlone, L., Dellaert, F. et al., “IMU Preintegration on Manifold for Efficient Visual-Inertial Maximum-A-Posteriori Estimation,” Georgia Institute of Technology, 2015.
- Forster, C., Zhang, Z., Gassner, M. et al., “SVO: Semidirect Visual Odometry for Monocular and Multicamera Systems,” IEEE Transactions on Robotics 33(2):249-265, 2017.
- Li, P., Qin, T., Hu, B. et al., “Monocular Visual-Inertial State Estimation for Mobile Augmented Reality,” 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2017, 11-21. IEEE.
- Floros, G., van der Zander, B., and Leibe, B., “OpenStreetSLAM: Global Vehicle Localization Using Open Street Maps,” IEEE International Conference on Robotics and Automation, 2013, 1054-1059. IEEE.
- Conrado, A., “Implementation of a Quaternion-Based Kalman Filter for Human Body Motion Tracking Using MARG Sensors.”
- Yun, X., Bachmann, E.R., and McGhee, R.B., “A Simplified Quaternion-Based Algorithm for Orientation Estimation from Earth Gravity and Magnetic Field Measurements.”
- Gálvez-López, D. and Tardós, J.D., “Bags of Binary Words for Fast Place Recognition in Image Sequences,” IEEE Transactions on Robotics 28(5):1188-1197, 2012.
- Hervier, T., Bonnabel, S., and Goulette, F., “Accurate 3D Maps from Depth Images and Motion Sensors via Nonlinear Kalman Filtering,” IEEE/RSJ International Conference on Intelligent Robots and Systems, 2012, 5291-5297. IEEE.
- Haklay, M. and Weber, P., “OpenStreetMap: User-Generated Street Maps,” IEEE Pervasive Computing 7(4):12-18, 2008.
- Fréchet, M., “Sur Quelques Points du Calcul Fonctionnel,” Rendiconti del Circolo Matematico di Palermo 22(1):1-72, 1906.
- Rekleitis, I., “A Particle Filter Tutorial for Mobile Robot Localization,” Centre for Intelligent Machines 2(2):481-488, 2003.