Facial Expression Detection using Deep Neural Networks
M. Sandeep Reddy1, Ch. Chinmai2, B. Sai Teja3, P. M. Ashok Kumar4

1M. Sandeep Reddy, Department of Computer Science, K L Educational Foundation, Guntur, India.
2Ch. Chinmai, Department of Computer Science, K L Educational Foundation, Guntur, India.
3B. Sai Teja, Department of Computer Science, K L Educational Foundation, Guntur, India.
4Dr. P. M. Ashok Kumar, Assoc. Prof., Department of Computer Science, K L Educational Foundation, Guntur, India.
Manuscript received on February 06, 2020. | Revised Manuscript received on February 10, 2020. | Manuscript published on February 30, 2020. | PP: 1318-1320 | Volume-9 Issue-3, February 2020. | Retrieval Number: C5340029320/2020©BEIESP | DOI: 10.35940/ijeat.C5340.029320
Open Access
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: Facial expressions convey nonverbal communication, which plays an important role in interaction among people. A facial expression detection system identifies the emotional state of a person. In this system, a captured frame is compared against a trained dataset stored in the database, and the emotional state of the captured frame is then classified. The system is based on image processing and machine learning. To design a robust facial feature descriptor, we apply the Xception model. The detection performance of the proposed method is evaluated by loading the dataset and pre-processing the images before feeding them to the CNN model. Experimental results on prototypic expressions show the superiority of the Xception-based descriptor over several well-known appearance-based feature representation methods and demonstrate competitive classification accuracy for the proposed method.
Keywords: CNN, Facial Expression, ImageNet, Xception Modelling.
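The abstract outlines a standard transfer-learning pipeline: load a labeled face dataset, pre-process the images, and feed them to an Xception-based CNN. As a minimal sketch only, assuming a Keras/TensorFlow environment, an ImageNet-pretrained Xception backbone, seven prototypic expression classes, and a hypothetical data/train directory of labeled face images (the paper does not specify these details), the steps could look like this:

# Minimal sketch of the pipeline described in the abstract (assumptions:
# Keras/TensorFlow, seven expression classes, hypothetical "data/train" path).
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import Xception
from tensorflow.keras.applications.xception import preprocess_input

NUM_CLASSES = 7  # assumption: seven prototypic expressions

# Load a directory of labeled face images; Xception expects 299x299 inputs.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=(299, 299), batch_size=32)
# Pre-process the images the way Xception was trained (scale to [-1, 1]).
train_ds = train_ds.map(lambda x, y: (preprocess_input(x), y))

# ImageNet-pretrained Xception as a frozen feature descriptor.
base = Xception(weights="imagenet", include_top=False, pooling="avg")
base.trainable = False

# Attach a small classification head for the expression classes.
model = models.Sequential([
    base,
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=5)

Freezing the pretrained backbone and training only the classification head is one common way to fine-tune Xception on a small expression dataset; the authors' actual training configuration is not given on this page.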