Analysis of LiDAR and Camera Data in Real-World Weather Conditions for Autonomous Vehicle Operations
- Nick Goberville, Western Michigan University
- Mohammad El-Yabroudi, Western Michigan University
- Mark Omwanas, Western Michigan University
- Johan Rojas, Western Michigan University
- Rick Meyer, Western Michigan University
- Zachary Asher, Western Michigan University
- Ikhlas Abdel-Qader, Western Michigan University
ISSN: 2641-9637, e-ISSN: 2641-9645
Published April 14, 2020 by SAE International in the United States
Citation: Goberville, N., El-Yabroudi, M., Omwanas, M., Rojas, J. et al., "Analysis of LiDAR and Camera Data in Real-World Weather Conditions for Autonomous Vehicle Operations," SAE Int. J. Adv. & Curr. Prac. in Mobility 2(5):2428-2434, 2020, https://doi.org/10.4271/2020-01-0093.
Autonomous vehicle technology has the potential to improve the safety, efficiency, and cost of our current transportation system by removing human error. The sensors available today make the development of these vehicles possible; however, autonomous vehicle operations in adverse weather conditions (e.g., snow-covered roads, heavy rain, fog) remain problematic due to the degradation of sensor data quality and insufficiently robust software algorithms. Since autonomous vehicles rely entirely on sensor data to perceive their surrounding environment, this degradation significantly limits the performance of the autonomous system. The purpose of this study is to collect sensor data under various weather conditions to understand the effects of weather on sensor data quality. The sensors used in this study were one camera and one LiDAR, connected to an NVIDIA DRIVE PX 2 installed in a 2019 Kia Niro. Two custom scenarios (static and dynamic objects) were chosen for collecting sensor data in four real-world weather conditions: fair, cloudy, rainy, and light snow. An algorithm developed herein quantifies the data so that each weather condition can be compared against the others. The results from these performance algorithms show that sensor data quality degrades by an average of 13.88% for static objects and 16.16% for dynamic objects across these conditions, with operation in rain having the most significant effect on sensor data degradation. From this study, it is hypothesized that advancements in data-processing algorithms can improve the usability of this degraded data. In future work, we seek to explore fault-tolerant sensor fusion algorithms that can overcome the effects of adverse weather.
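The abstract does not detail the quantification algorithm, but the style of comparison it describes (percent degradation of a sensor-quality metric in each adverse condition relative to a fair-weather baseline, averaged across conditions) can be sketched as below. The metric choice (LiDAR point returns on a target) and all numbers are hypothetical illustrations, not the paper's method or data.

```python
# Hedged sketch: average percent degradation of a sensor-quality metric
# relative to a fair-weather baseline. The metric (LiDAR point returns
# on a static target) and the values below are hypothetical examples,
# not the study's actual algorithm or measurements.

def percent_degradation(baseline: float, observed: float) -> float:
    """Relative drop in a quality metric versus the baseline, in percent."""
    return (baseline - observed) / baseline * 100.0

def average_degradation(metrics: dict, baseline_key: str = "fair") -> float:
    """Mean degradation across all non-baseline weather conditions."""
    baseline = metrics[baseline_key]
    adverse = [v for k, v in metrics.items() if k != baseline_key]
    return sum(percent_degradation(baseline, v) for v in adverse) / len(adverse)

# Hypothetical per-condition point counts returned from a static target.
lidar_points = {"fair": 1000.0, "cloudy": 950.0, "rainy": 780.0, "light snow": 880.0}
print(f"{average_degradation(lidar_points):.2f}%")  # prints "13.00%"
```

With these example numbers, rain shows the largest individual drop (22%), mirroring the paper's finding that rain had the most significant effect, while the average across conditions summarizes overall degradation in a single figure.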