Estimating Driver Field of View in Active Safety Systems
2026-01-0775
To be published on 06/01/2026
- Content
- Driver Monitoring Systems (DMS) are an important component of active safety systems, continuously evaluating the driver’s state and issuing real-time warnings. As defined by the SAE Levels of Automation, driving tasks are increasingly transferred from the driver to the vehicle at Level 2; however, the driver remains fully responsible for monitoring the driving environment. Current implementations, such as Driver Drowsiness and Attention Warning (DDAW), assess driver alertness, while Advanced Driver Distraction Warning (ADDW) ensures that the driver maintains visual focus. Nevertheless, these systems do not identify the specific objects or regions the driver is observing. This limitation motivates the research question addressed here: can an in-car monitoring system be integrated with external environment perception sensors to infer the driver’s Field of View (FoV)? This paper presents a system consisting of a driver-facing camera and a front-view camera. Facial features, including gaze direction, head pose, and iris location, are extracted using computer vision techniques. These features, together with cropped eye images, serve as inputs to a Convolutional Neural Network (CNN). Training labels were generated in a driving simulator study with 16 participants who sequentially fixated on visual targets displayed on a front screen. Experimental results show that the proposed system can predict driver visual attention and approximate the FoV with a Mean Pixel Error (MPE) of 35.4 pixels, enabling real-time identification of the road-scene regions observed by the driver. This work provides a foundation for explicitly modeling driver perception and its correspondence with vehicle perception systems.
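The reported Mean Pixel Error is presumably the average Euclidean distance, in image coordinates, between predicted and ground-truth gaze points on the front-view image. A minimal sketch of that metric, using made-up coordinate values purely for illustration:

```python
import math

# Hypothetical predicted vs. ground-truth gaze points (pixel coordinates
# on the front-view camera image); the values below are invented examples,
# not data from the study.
pred = [(320.0, 240.0), (100.0, 200.0)]
true = [(300.0, 250.0), (130.0, 160.0)]

# Mean Pixel Error: average Euclidean distance between the two point sets.
mpe = sum(math.dist(p, t) for p, t in zip(pred, true)) / len(pred)
print(round(mpe, 2))  # 36.18
```

A value such as the paper's 35.4 pixels would thus mean that, on average, the predicted gaze point lands about 35 pixels away from the true fixation point on the front-view image.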
- Citation
- Ji, D., Lausch, H., Flormann, M., and Henze, R., "Estimating Driver Field of View in Active Safety Systems," 2026 Stuttgart International Symposium, Stuttgart, Germany, July 8, 2026.