Autonomous Vehicle Multi-Sensors Localization in Unstructured Environment

Event
WCX SAE World Congress Experience
Abstract
Autonomous driving in unstructured environments is a significant challenge because localization cues such as lane markings are inconsistent or absent. To reduce the uncertainty of vehicle localization in such environments, data from LiDAR, Radar, Camera, GPS/IMU, and Odometry sensors are fused. This paper discusses a hybrid localization technique built from LiDAR-based Simultaneous Localization and Mapping (SLAM), GPS/IMU and Odometry data, and object lists from the Radar, LiDAR, and Camera sensors. An Extended Kalman Filter (EKF) fuses the data from all sensors in two stages: in the first stage, the SLAM-based vehicle coordinates are fused with the GPS-based positioning; the output of this stage is then fused with the object-based localization. The approach was tested successfully on FEV's Smart Vehicle Demonstrator at FEV's headquarters, a complex test environment with both dynamic and static objects. The test results show that multi-sensor fusion improves the vehicle's localization compared to GPS/IMU or LiDAR alone.
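As a rough illustration of the two-stage EKF fusion described in the abstract, the following is a minimal Python sketch. It assumes a planar [x, y, yaw] state, a simple odometry motion model, and sequential position updates for the SLAM, GPS, and object-based fixes; the class structure, measurement model, and all noise values are illustrative assumptions, not the paper's implementation.

import numpy as np

class EKF:
    def __init__(self):
        self.x = np.zeros(3)                 # state: [x, y, yaw]
        self.P = np.eye(3)                   # state covariance
        self.Q = np.diag([0.1, 0.1, 0.01])   # process noise (assumed values)

    def predict(self, v, omega, dt):
        """Propagate the state with an odometry motion model (v, yaw rate)."""
        theta = self.x[2]
        self.x += np.array([v * np.cos(theta) * dt,
                            v * np.sin(theta) * dt,
                            omega * dt])
        # Jacobian of the motion model with respect to the state
        F = np.array([[1.0, 0.0, -v * np.sin(theta) * dt],
                      [0.0, 1.0,  v * np.cos(theta) * dt],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update_position(self, z, R):
        """Fuse a 2-D position fix (SLAM, GPS, or object-based estimate)."""
        H = np.array([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])      # measure x, y only
        y = z - H @ self.x                   # innovation
        S = H @ self.P @ H.T + R             # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)  # Kalman gain
        self.x += K @ y
        self.P = (np.eye(3) - K @ H) @ self.P

ekf = EKF()
ekf.predict(v=2.0, omega=0.05, dt=0.1)
# Stage 1: fuse the LiDAR-SLAM pose with the GPS positioning
ekf.update_position(np.array([0.21, 0.02]), R=np.diag([0.05, 0.05]))  # SLAM fix
ekf.update_position(np.array([0.18, 0.05]), R=np.diag([0.50, 0.50]))  # GPS fix
# Stage 2: fuse the object-list-based position estimate
ekf.update_position(np.array([0.20, 0.03]), R=np.diag([0.20, 0.20]))
print(ekf.x)

In this sketch the relative sizes of the R matrices encode how much each source is trusted; the sequential updates mirror the paper's staged fusion, with the stage-1 output serving as the prior for the object-based update.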
Details
DOI
https://doi.org/10.4271/2020-01-1029
Pages
5
Citation
Alrousan, Q., Alzu'bi, H., Pfeil, A., and Tasky, T., "Autonomous Vehicle Multi-Sensors Localization in Unstructured Environment," SAE Technical Paper 2020-01-1029, 2020, https://doi.org/10.4271/2020-01-1029.
Additional Details
Publisher
SAE International
Published
Apr 14, 2020
Product Code
2020-01-1029
Content Type
Technical Paper
Language
English