Human Emotion Based Interior Lighting Control
ISSN: 0148-7191, e-ISSN: 2688-3627
Published April 03, 2018 by SAE International in United States
In recent years, research on Human Computer Interaction (HCI) based on emotion recognition from behavioral and physiological signals has attracted immense interest. Lighting inside a vehicle influences how we feel and behave while driving. The literature shows that ambient lighting affects the driving experience and creates an emotional atmosphere inside the vehicle; properly controlled lighting can also reduce driving fatigue. Today, ambient interior lighting is considered a point of fashion for high-end vehicles and also influences the driver's mood and comfort. Several types of automotive lighting automation systems are available, but emotion-based control is still at a nascent stage of research. Speech-controlled systems adjust the lighting based on the user's recognized speech, and lighting can likewise be controlled through facial expressions. Facial and speech signals carry both outward physical expression and innate emotion; the emotions exhibited vary from situation to situation and depend heavily on the prevailing conditions. In this work, we propose an emotion-based interior lighting control: based on the emotions observed through an Emotion Recognition System (ERS), the lighting is modified in a predefined fashion. The proposed ERS considers five emotion classes: happy, sad, angry, neutral, and disgust. Live facial images and voice data of the driver/passengers serve as inputs to the system, and the interior lighting is controlled based on the ERS output. Standard databases are used to train the ERS, and machine learning methods, specifically Convolutional Neural Networks (CNNs), are used for classification of features. The proposed algorithm was tested, and the results demonstrate the reliability of the approach as a solution for lighting control.
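The "predefined fashion" in which lighting follows the recognized emotion can be sketched as a simple lookup from the five ERS classes to a lighting preset. This is a hypothetical illustration only: the preset values (RGB color and brightness) and the function name `select_lighting` are assumptions, not taken from the paper.

```python
# Hypothetical sketch: map an ERS emotion label to a predefined
# interior-lighting preset. All preset values are illustrative
# assumptions, not values from the paper.

# The five emotion classes considered in the proposed ERS.
EMOTIONS = ("happy", "sad", "angry", "neutral", "disgust")

# Assumed presets: (R, G, B, brightness 0-100).
LIGHTING_PRESETS = {
    "happy":   (255, 200, 120, 80),   # warm, bright
    "sad":     (120, 160, 255, 60),   # soft cool blue
    "angry":   (150, 220, 180, 40),   # calming, dimmed green
    "neutral": (255, 255, 255, 50),   # plain white
    "disgust": (180, 255, 200, 55),   # fresh mint tone
}

def select_lighting(emotion: str):
    """Return the lighting preset for a recognized emotion.

    Unknown or low-confidence labels fall back to the neutral
    preset so the cabin lighting never enters an undefined state.
    """
    if emotion not in EMOTIONS:
        emotion = "neutral"
    return LIGHTING_PRESETS[emotion]

if __name__ == "__main__":
    print(select_lighting("happy"))    # warm, bright preset
    print(select_lighting("unknown"))  # falls back to neutral
```

In a full system, the CNN classifier's output label would be passed to such a mapping, with the fallback guarding against unrecognized inputs.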
Citation: Nandyala, S., K, G., D H, S., and Manalikandy, M., "Human Emotion Based Interior Lighting Control," SAE Technical Paper 2018-01-1042, 2018, https://doi.org/10.4271/2018-01-1042.