Study on Target Tracking Based on Vision and Radar Sensor Fusion
ISSN: 0148-7191, e-ISSN: 2688-3627
Published April 03, 2018 by SAE International in United States
Faced with intricate traffic conditions, a single sensor can no longer meet the safety requirements of Advanced Driver Assistance Systems (ADAS) and autonomous driving. In multi-target tracking, the number of targets detected by the vision sensor is sometimes smaller than the number of current tracks, while the number detected by the millimeter-wave radar is larger. Hence, a multi-sensor information fusion algorithm is presented that exploits the complementary advantages of the vision sensor and the millimeter-wave radar. The algorithm follows a centralized fusion strategy in which the fusion center performs unified track management. First, the vision sensor and radar detect targets and measure each target's range and azimuth angle. The detection data from both sensors are then transferred to the fusion center, where they are associated with the predictions of the current tracks: vision detections use Global Nearest Neighbor (GNN) and radar detections use Probabilistic Data Association (PDA) for data association. For target detection, the vision sensor offers high azimuth accuracy but low range accuracy, whereas the radar offers medium azimuth accuracy and very high range accuracy; these detection properties should be considered when designing the association gate. A simulation based on real test data, collected with a monocular camera and a 77 GHz millimeter-wave radar, is performed in MATLAB. The simulation results indicate that the design of the association gate has a great impact on fusion performance.
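The sensor-specific association gate described above can be illustrated with a minimal sketch: an elliptical (Mahalanobis) gate in range-azimuth measurement space, shaped by each sensor's measurement noise. All covariance and threshold values below are illustrative assumptions for demonstration, not the paper's actual sensor parameters.

```python
import numpy as np

# Assumed measurement noise covariances in (range [m], azimuth [rad]),
# reflecting each sensor's strengths: vision -> accurate azimuth, poor
# range; radar -> very accurate range, medium azimuth accuracy.
# These numbers are hypothetical, not taken from the paper.
R_VISION = np.diag([4.0**2, np.deg2rad(0.2)**2])
R_RADAR  = np.diag([0.2**2, np.deg2rad(1.0)**2])

# ~99% gate threshold for a 2-D measurement (chi-square, 2 dof).
CHI2_GATE_2DOF = 9.21

def in_gate(z, z_pred, P_pred_meas, R_sensor):
    """Mahalanobis gate test: accept detection z for a track whose
    predicted measurement is z_pred with covariance P_pred_meas."""
    nu = z - z_pred                    # innovation
    S = P_pred_meas + R_sensor         # innovation covariance
    d2 = nu @ np.linalg.solve(S, nu)   # squared Mahalanobis distance
    return d2 <= CHI2_GATE_2DOF

# Example: the same detection offset judged against each sensor's gate.
z_pred = np.array([30.0, np.deg2rad(5.0)])           # predicted measurement
P_pred = np.diag([1.0**2, np.deg2rad(0.5)**2])       # prediction uncertainty
z = np.array([33.5, np.deg2rad(5.1)])                # 3.5 m range error

print(in_gate(z, z_pred, P_pred, R_VISION))  # vision gate tolerates range error
print(in_gate(z, z_pred, P_pred, R_RADAR))   # radar gate rejects the same error
```

Because the vision gate is wide along the range axis but narrow in azimuth, a detection with a large range error but a small azimuth error can still associate to a track through the vision channel while being rejected by the radar channel, which is the asymmetry the gate design must account for.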
Citation: Wu, X., Ren, J., Wu, Y., and Shao, J., "Study on Target Tracking Based on Vision and Radar Sensor Fusion," SAE Technical Paper 2018-01-0613, 2018, https://doi.org/10.4271/2018-01-0613.