A Novel Vision-Based Framework for Real-Time Lane Detection and Tracking


Event
WCX SAE World Congress Experience
Abstract
Lane detection is one of the most important components of ADAS, because modules such as LKAS and LDWS require robust and precise lane positions to localize the ego vehicle and traffic participants, plan an optimal route, and make proper driving decisions. Most lane detection approaches depend heavily on tedious pre-processing and a large number of assumptions to obtain reasonable results, which deteriorates their robustness and efficiency. To address this problem, a novel framework is proposed in this paper to realize robust, real-time lane detection. The framework consists of two branches: the first uses Canny edge detection and the Progressive Probabilistic Hough Transform (PPHT) for efficient detection. To remove the framework's dependence on assumptions such as a flat road, a deep-learning-based encoder-decoder detection branch, which leverages the powerful nonlinear approximation ability of a CNN, is introduced to improve robustness and produce a precise intermediate result. Since the detection rate of the CNN branch is much lower than that of the feature-based branch, a coordinating unit is designed. The two branches also back each other up, making the system failure-tolerant. Finally, a Kalman filter is applied for lane tracking. Experimental results show that the proposed framework achieves robust detection under various driving scenarios at more than 100 FPS. A closed-loop lane-keeping simulation is also carried out, showing that the precise and robust detection results of the proposed framework greatly improve lane-keeping performance.
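The tracking step described above can be sketched as a small linear Kalman filter over per-frame lane-line parameters. The sketch below is illustrative only: the paper does not specify its state vector or noise settings, so the choice of a (slope, intercept) state with a constant-velocity motion model and the `q`/`r` values are assumptions, not the authors' configuration. Per-frame measurements would come from the Canny + PPHT branch (e.g., a line fit to the detected segments) or from the CNN branch.

```python
import numpy as np

class LaneKalman:
    """Minimal linear Kalman filter for one lane line.

    State: [slope, intercept, d_slope, d_intercept] with a
    constant-velocity motion model. All noise parameters (q, r)
    are illustrative assumptions, not values from the paper.
    """

    def __init__(self, q=1e-3, r=1e-1):
        self.x = np.zeros(4)                     # state estimate
        self.P = np.eye(4)                       # state covariance
        self.F = np.eye(4)                       # transition model
        self.F[0, 2] = self.F[1, 3] = 1.0        # slope/intercept drift
        self.H = np.zeros((2, 4))                # measurement model:
        self.H[0, 0] = self.H[1, 1] = 1.0        # observe slope, intercept
        self.Q = q * np.eye(4)                   # process noise
        self.R = r * np.eye(2)                   # measurement noise

    def predict(self):
        """Propagate the state one frame ahead."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        """Fuse a (slope, intercept) measurement from a detector branch."""
        z = np.asarray(z, dtype=float)
        S = self.H @ self.P @ self.H.T + self.R          # innovation cov.
        K = self.P @ self.H.T @ np.linalg.inv(S)         # Kalman gain
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

In a two-branch setup like the one described, the same `update` call can accept a measurement from whichever branch produced a result for the current frame, while `predict` alone bridges frames where both branches miss, which is one way the filter contributes to failure tolerance.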
Details
DOI
https://doi.org/10.4271/2019-01-0690
Pages
9
Citation
Yang, S., Wu, J., Shan, Y., Yu, Y. et al., "A Novel Vision-Based Framework for Real-Time Lane Detection and Tracking," SAE Technical Paper 2019-01-0690, 2019, https://doi.org/10.4271/2019-01-0690.
Additional Details
Publisher
SAE International
Published
Apr 2, 2019
Product Code
2019-01-0690
Content Type
Technical Paper
Language
English