Autonomous Vehicle Multi-Sensors Localization in Unstructured Environment

Qusay Alrousan, Hamzeh Alzu'bi, Andrew Pfeil, Tom Tasky - FEV North America Inc.
  • Technical Paper
  • 2020-01-1029
To be published on 2020-04-14 by SAE International in the United States
Autonomous driving in unstructured environments is a significant challenge because cues that are important for localization, such as lane markings, are inconsistent or absent. To reduce the uncertainty of vehicle localization in such environments, data from LiDAR, Radar, Camera, GPS/IMU, and odometry sensors are fused. This paper discusses a hybrid localization technique that combines LiDAR-based Simultaneous Localization and Mapping (SLAM), GPS/IMU and odometry data, and object lists from the Radar, LiDAR, and Camera sensors. An Extended Kalman Filter (EKF) fuses the data from all sensors in two stages: first, the SLAM-based vehicle coordinates are fused with the GPS-based positioning; the output of this stage is then fused with the object-based localization. The approach was successfully tested on FEV's Smart Vehicle Demonstrator at FEV's HQ, a complex test environment with both dynamic and static objects. The test results show that multi-sensor fusion improves the vehicle's localization compared to GPS/IMU or LiDAR alone.
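To make the two-stage fusion concrete, the sketch below shows one way such a cascade could look in Python, assuming a simple 2-D position state and direct position measurements. All matrices, noise values, and measurement vectors are illustrative placeholders, not the authors' implementation or tuning; the full paper should be consulted for the actual filter design.

```python
import numpy as np

# Minimal two-stage EKF sketch (hypothetical values throughout).
# State: [x, y] position in a local frame.

def ekf_update(x, P, z, R, H):
    """Standard Kalman measurement update for a direct position measurement."""
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

def predict(x, P, dx_odom, Q):
    """Propagate the pose with an odometry displacement; Q inflates uncertainty."""
    return x + dx_odom, P + Q

H = np.eye(2)                          # both measurements observe position directly

# Start from the SLAM pose and an assumed covariance.
x = np.array([10.0, 5.0])              # SLAM-based position estimate [m]
P = np.diag([0.5, 0.5])                # SLAM covariance

# Prediction step driven by odometry.
x, P = predict(x, P, dx_odom=np.array([1.0, 0.2]), Q=np.diag([0.1, 0.1]))

# Stage 1: fuse with GPS/IMU positioning (GPS fix converted to the local frame).
z_gps = np.array([11.2, 5.1])
x, P = ekf_update(x, P, z_gps, R=np.diag([2.0, 2.0]), H=H)

# Stage 2: fuse with object-based localization, e.g. a position inferred by
# matching Radar/LiDAR/Camera object lists against known landmarks.
z_obj = np.array([11.0, 5.3])
x, P = ekf_update(x, P, z_obj, R=np.diag([0.8, 0.8]), H=H)

print("fused position:", x)
```

In this arrangement, each stage simply applies a further measurement update to the same state, so a noisy GPS fix (large R) pulls the estimate only weakly while a tighter object-based fix refines it further, which mirrors the cascaded fusion the abstract describes.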