Detection of Lane-Changing Behavior Using Collaborative Representation Classifier-Based Sensor Fusion

Abstract
Sideswipe accidents occur primarily when drivers attempt an improper lane change, drift out of their lane, or lose lateral traction. In this article, a fusion approach is introduced that utilizes data from two sensors of differing modality (a front-view camera and an onboard diagnostics (OBD) sensor) to detect a driver's lane-changing behavior. For lane change detection, both feature-level fusion and decision-level fusion are examined using a collaborative representation classifier (CRC). Computationally efficient detection features are extracted from the distances to the detected lane boundaries and from vehicle dynamics signals. In feature-level fusion, the features generated from the two sensors are merged before classification, while in decision-level fusion, Dempster-Shafer (D-S) theory is used to combine the classification outcomes from two classifiers, each corresponding to one sensor. The results indicate that feature-level fusion outperforms decision-level fusion and that the proposed CRC-based fusion approach achieves significantly higher detection accuracy than other state-of-the-art classifiers.
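To make the two fusion schemes concrete, the sketch below illustrates the general techniques named in the abstract: a collaborative representation classifier (ridge-regression coding over all training samples followed by class-wise residual comparison) and Dempster's combination rule for fusing two classifiers' mass functions. The function names, the regularization value `lam`, and the toy data are illustrative assumptions, not the paper's actual implementation or parameter settings.

```python
import numpy as np

def crc_classify(X_train, y_train, x, lam=0.01):
    """Collaborative representation classification (illustrative sketch).

    Codes the query x over ALL training samples via ridge regression,
    then assigns the class whose samples best reconstruct x.
    """
    A = X_train.T                          # dictionary: columns are training samples
    n = A.shape[1]
    # Regularized least-squares coding vector over the whole dictionary
    alpha = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ x)
    residuals = {}
    for c in np.unique(y_train):
        mask = (y_train == c)
        # Reconstruction error using only this class's samples/coefficients
        residuals[c] = np.linalg.norm(x - A[:, mask] @ alpha[mask])
    return min(residuals, key=residuals.get)

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    m1, m2: dicts mapping frozenset hypotheses to masses summing to 1.
    Conflicting (empty-intersection) mass is discarded and the rest renormalized.
    """
    combined, conflict = {}, 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mb * mc
            else:
                conflict += mb * mc
    return {h: v / (1.0 - conflict) for h, v in combined.items()}
```

For decision-level fusion, each sensor's classifier output would be mapped to a mass function over the hypotheses (e.g., lane change vs. no change, plus the full frame for uncertainty) and combined with `dempster_combine`; the fused hypothesis with the largest mass is the detection result.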
Details
DOI
https://doi.org/10.4271/09-06-02-0010
Pages
11
Citation
Gao, J., Murphey, Y., and Zhu, H., "Detection of Lane-Changing Behavior Using Collaborative Representation Classifier-Based Sensor Fusion," SAE Int. J. Trans. Safety 6(2):147-162, 2018, https://doi.org/10.4271/09-06-02-0010.
Additional Details
Publisher
SAE International
Published
Oct 29, 2018
Product Code
09-06-02-0010
Content Type
Journal Article
Language
English