Joint Calibration of Dual LiDARs and Camera Using a Circular Chessboard
ISSN: 0148-7191, e-ISSN: 2688-3627
Published April 14, 2020 by SAE International in the United States
Environmental perception is a crucial subsystem in autonomous vehicles. To build safe and efficient road transportation, many approaches have been proposed for accurate, robust, real-time perception systems. Cameras and LiDARs are widely mounted on autonomous vehicles, and many algorithms for them have been developed in recent years. Because each sensor alone has inherent limitations, fusing camera and LiDAR data yields state-of-the-art environmental perception. Extrinsic calibration aligns the sensors' coordinate systems and has therefore drawn considerable attention. However, unlike the pairwise spatial alignment of two sensors' data, joint calibration of more than two sensors must balance the degree of alignment between every pair of sensors. In this paper, we assemble a test platform consisting of dual LiDARs and one monocular camera, using the same sensing hardware architecture as the intelligent sweeper designed by our laboratory, and propose a corresponding joint calibration method based on a circular chessboard. The center of the circular chessboard is detected in the camera image to obtain pixel coordinates and in each LiDAR point cloud to obtain 3D coordinates. The calibration is thus cast as a 3D-2D perspective-n-point (PnP) matching problem: the chessboard centers serve as corresponding points that define geometric constraints, from which initial calibration values are obtained. A global loss function is then carefully designed for Levenberg-Marquardt nonlinear optimization to refine these values, so that the extrinsic parameters between every pair of sensors are estimated simultaneously. Experimental results show that the proposed method is suitable for joint calibration of a fusion system composed of LiDARs and a camera, and that the calibration results have high accuracy and stability.
Citation: Deng, Z., Xiong, L., Yin, D., and Shan, F., "Joint Calibration of Dual LiDARs and Camera Using a Circular Chessboard," SAE Technical Paper 2020-01-0098, 2020, https://doi.org/10.4271/2020-01-0098.
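To make the refinement stage of the pipeline described in the abstract concrete, the following sketch (not the authors' code) refines camera-LiDAR extrinsics from chessboard-center correspondences using SciPy's Levenberg-Marquardt solver, minimizing the pixel reprojection error of 3D centers. The intrinsic matrix `K`, the synthetic correspondences, and the ground-truth extrinsics are all made-up illustration data; in practice the initial guess would come from a PnP solver such as EPnP rather than from zeros.

```python
import numpy as np
from scipy.optimize import least_squares

def rodrigues(rvec):
    """Rotation vector -> rotation matrix (Rodrigues' formula)."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    K_ = np.array([[0, -k[2], k[1]],
                   [k[2], 0, -k[0]],
                   [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K_ + (1 - np.cos(theta)) * (K_ @ K_)

def project(params, pts_3d, K):
    """Project LiDAR-frame points into the image via extrinsics (rvec, t)."""
    R, t = rodrigues(params[:3]), params[3:]
    cam = pts_3d @ R.T + t            # points expressed in the camera frame
    uv = cam @ K.T                    # pinhole projection
    return uv[:, :2] / uv[:, 2:3]     # perspective divide -> pixel coords

def residuals(params, pts_3d, pix_2d, K):
    """Stacked pixel reprojection errors over all chessboard centers."""
    return (project(params, pts_3d, K) - pix_2d).ravel()

# Hypothetical intrinsics and noise-free chessboard-center correspondences.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
rng = np.random.default_rng(0)
pts_3d = rng.uniform([-1, -1, 3], [1, 1, 6], size=(12, 3))
true = np.array([0.05, -0.02, 0.01, 0.1, -0.05, 0.2])  # ground-truth extrinsics
pix_2d = project(true, pts_3d, K)

# Levenberg-Marquardt refinement from a rough initial guess.
init = np.zeros(6)
sol = least_squares(residuals, init, args=(pts_3d, pix_2d, K), method="lm")
# sol.x now approximates the ground-truth extrinsic parameters.
```

In the paper's joint setting, the single-pair residual above would be replaced by a global loss that stacks the reprojection and alignment errors of every sensor pair, so that all extrinsics are optimized simultaneously rather than pairwise.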