A Robust Failure Proof Driver Drowsiness Detection System Estimating Blink and Yawn
2020-01-1030
04/14/2020
- Content
- Many fatal automobile accidents can be attributed to fatigued and distracted driving. Driver monitoring systems alert distracted drivers by raising alarms. Most image-based driver drowsiness detection systems struggle to deliver failure-proof performance in real-time applications: failures in detecting the face and its key parts (eyes, nose, and mouth) cause the system to miss blink and yawn detections in some frames. In this paper, a real-time, robust, and failure-proof driver drowsiness detection system is proposed. The system deploys a set of detectors to sequentially detect the face, blinking, and yawning. A robust Multi-Task Convolutional Neural Network (MTCNN) with face-alignment capability is used for face detection and attained 97% recall on the real-time driving dataset collected. The detected face is passed to an ensemble of regression trees that detects 68 facial landmarks. The eye and mouth landmarks are isolated to detect blinks and yawns by identifying the open/closed state from their aspect ratios. The proposed system combines the two detectors with failure-proof strategies: a Minimization of Sum Squared Error based object tracker tracks the face in the video when the face detector fails, and a new binary classification neural network reinforces the detection of eye and mouth state when the facial landmark detection fails. The combination of the face detector, the landmark detector, and the failure-proof systems (the object tracker and the classification neural network) yields a driver drowsiness detection system with nearly 99.9% accuracy in detecting blinking and yawning. (An illustrative sketch of the aspect-ratio computation appears after the citation below.)
- Pages
- 6
- Citation
- Jesudoss, Y., and Park, J., "A Robust Failure Proof Driver Drowsiness Detection System Estimating Blink and Yawn," SAE Technical Paper 2020-01-1030, 2020, https://doi.org/10.4271/2020-01-1030.
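The abstract describes computing aspect ratios of the eye and mouth landmarks to decide open/closed state for blink and yawn detection. Below is a minimal, self-contained sketch of that step; it is not the authors' code. It assumes the 68-point landmark layout commonly produced by ensemble-of-regression-trees predictors (e.g. dlib's), and the landmark index groups, the 0.2 eye-aspect-ratio and 0.6 mouth-aspect-ratio thresholds, and all function names are illustrative assumptions not taken from the paper.

```python
# Illustrative sketch: eye aspect ratio (EAR) for blink detection and
# mouth aspect ratio (MAR) for yawn detection from 68 facial landmarks.
# Index groups and thresholds are assumptions, not values from the paper.
import numpy as np

LEFT_EYE = list(range(36, 42))     # 0-indexed positions of the left-eye landmarks
RIGHT_EYE = list(range(42, 48))    # right-eye landmarks
INNER_MOUTH = list(range(60, 68))  # inner-lip contour landmarks

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """EAR = (|p2-p6| + |p3-p5|) / (2 |p1-p4|); small values indicate a closed eye."""
    v1 = np.linalg.norm(eye[1] - eye[5])
    v2 = np.linalg.norm(eye[2] - eye[4])
    h = np.linalg.norm(eye[0] - eye[3])
    return (v1 + v2) / (2.0 * h)

def mouth_aspect_ratio(mouth: np.ndarray) -> float:
    """Vertical inner-lip opening over mouth width; large values suggest a yawn."""
    v = np.linalg.norm(mouth[2] - mouth[6])  # top vs. bottom of inner lip
    h = np.linalg.norm(mouth[0] - mouth[4])  # left vs. right mouth corner
    return v / h

def blink_and_yawn_state(landmarks: np.ndarray,
                         ear_thresh: float = 0.2,
                         mar_thresh: float = 0.6) -> tuple[bool, bool]:
    """landmarks: (68, 2) array of (x, y) points for one detected face.
    Returns (eyes_closed, yawning) for the current frame."""
    ear = (eye_aspect_ratio(landmarks[LEFT_EYE]) +
           eye_aspect_ratio(landmarks[RIGHT_EYE])) / 2.0
    mar = mouth_aspect_ratio(landmarks[INNER_MOUTH])
    return ear < ear_thresh, mar > mar_thresh
```

In a per-frame pipeline along the lines of the abstract, the landmark array would come from MTCNN face detection followed by the regression-tree landmark predictor; when either stage fails on a frame, the described fallbacks (an object tracker for the face region and a binary classifier for eye/mouth state) would supply the missing result instead.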