
Detection of Lane-Changing Behavior Using Collaborative Representation Classifier-Based Sensor Fusion

Journal Article
09-06-02-0010
ISSN: 2327-5626, e-ISSN: 2327-5634
Published October 29, 2018 by SAE International in United States
Citation: Gao, J., Murphey, Y., and Zhu, H., "Detection of Lane-Changing Behavior Using Collaborative Representation Classifier-Based Sensor Fusion," SAE Int. J. Trans. Safety 6(2):147-162, 2018, https://doi.org/10.4271/09-06-02-0010.
Language: English

Abstract:

Sideswipe accidents occur primarily when a driver attempts an improper lane change, drifts out of lane, or the vehicle loses lateral traction. In this article, a fusion approach is introduced that utilizes data from two sensors of differing modalities, a front-view camera and an onboard diagnostics (OBD) sensor, to detect a driver's lane-changing behavior. For lane change detection, both feature-level fusion and decision-level fusion are examined using a collaborative representation classifier (CRC). Computationally efficient detection features are extracted from the distances to the detected lane boundaries and from vehicle dynamics signals. In feature-level fusion, the features generated from the two sensors are merged before classification, while in decision-level fusion, Dempster-Shafer (D-S) theory is used to combine the classification outcomes of two classifiers, each corresponding to one sensor. The results indicate that feature-level fusion outperforms decision-level fusion, and that the introduced CRC-based fusion approach achieves significantly better detection accuracy than other state-of-the-art classifiers.
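The core classification step described above can be illustrated with a minimal NumPy sketch of a collaborative representation classifier: a query sample is represented collaboratively over all training samples via regularized least squares, then assigned to the class with the smallest class-specific reconstruction residual. This is a generic CRC sketch, not the article's implementation; the function name, regularization value, and toy data are illustrative assumptions.

```python
import numpy as np

def crc_classify(X_train, y_train, x, lam=0.01):
    """Collaborative representation classification (generic sketch).

    X_train : (n_features, n_samples) matrix whose columns are training samples.
    y_train : (n_samples,) class labels for the columns of X_train.
    x       : (n_features,) query sample.
    lam     : ridge regularization weight (illustrative default).
    """
    # Solve the regularized least-squares coding:
    #   w = argmin ||x - X w||^2 + lam * ||w||^2
    n = X_train.shape[1]
    w = np.linalg.solve(X_train.T @ X_train + lam * np.eye(n), X_train.T @ x)

    # Assign the class with the smallest regularized reconstruction residual,
    # using only that class's coefficients to reconstruct the query.
    labels = np.unique(y_train)
    residuals = []
    for c in labels:
        mask = (y_train == c)
        x_hat = X_train[:, mask] @ w[mask]
        residuals.append(np.linalg.norm(x - x_hat) / np.linalg.norm(w[mask]))
    return labels[int(np.argmin(residuals))]
```

Because the coding step has a closed-form ridge solution, CRC avoids the iterative sparse optimization of sparse-representation classifiers, which is consistent with the article's emphasis on computational efficiency.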