Affect State Classification from Face Segments Using ResNet-50 and SE-ResNet-50
Dhananjay Theckedath1, R.R. Sedamkar2

1Dhananjay Theckedath, Assistant Professor, Biomedical Engineering Department, Thadomal Shahani Engineering College, India.
2R.R. Sedamkar, Professor, Computer Engineering Department, Thakur College of Engineering & Technology, India.
Manuscript received on December 13, 2019. | Revised Manuscript received on December 21, 2019. | Manuscript published on January 10, 2020. | PP: 117-122 | Volume-9 Issue-3, January 2020. | Retrieval Number: B6196129219/2020©BEIESP | DOI: 10.35940/ijitee.B6196.019320
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: One of the important components of an intelligent Human-Computer Interface system is the accurate classification of affect states. Such interface systems are, however, plagued by the recurring problem of image occlusion. The challenge, therefore, is to classify the affect states accurately from whatever portions of the face are available to the system. This paper investigates whether there are segments within the facial region that carry sufficient information about the affect states. We use two pre-defined Convolutional Neural Networks (CNNs): a ResNet-50 network and a modified version of ResNet-50 in which a Squeeze-and-Excitation network is connected to ResNet-50, called SE-ResNet-50. These two networks are used to classify the seven basic affect states of Angry, Contempt, Disgust, Fear, Happy, Sad and Surprise from various segments of the face. We partition the face into four regions, each comprising only 50% of the original data, and compare the results with those obtained using the full face. Validation accuracy is reported for the full face as well as for the four segments, and precision and recall are calculated for each partitioned area and each affect state using the two networks. Our evaluation shows that both ResNet-50 and SE-ResNet-50 accurately classify all seven affect states from the right, left, lower and upper segments of the face. While ResNet-50 performs marginally better than SE-ResNet-50 in identifying the affect states from the right, left and lower segments, SE-ResNet-50 performs better on the upper segment. We can thus conclude that the right, left, lower and upper segments of the face each contain sufficient information to correctly classify the seven affect states. The experimental results presented in this paper show that pre-defined Convolutional Neural Networks give very high accuracy, precision and recall values and can therefore be used to classify affect states accurately even when occlusions are present in the image and only certain portions of the face are available for analysis.
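To illustrate the approach described in the abstract, the sketch below shows one possible PyTorch construction (not the authors' exact implementation): a standard Squeeze-and-Excitation block, a ResNet-50 whose final layer is replaced for the seven affect classes, a simplified SE-ResNet-50-like variant obtained by appending a single SE block after the last convolutional stage (the canonical SE-ResNet-50 places SE blocks inside every bottleneck), and the four 50% face crops (left, right, upper, lower). All function names and the choice to leave weights unloaded are assumptions for illustration.

```python
# Hypothetical sketch: SE block, ResNet-50 head for 7 affect states,
# and the four half-face segments used as 50% inputs.
import torch
import torch.nn as nn
from torchvision import models

class SEBlock(nn.Module):
    """Squeeze-and-Excitation: global average pool, two FC layers, sigmoid gate."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = x.mean(dim=(2, 3))            # squeeze: B x C channel descriptor
        w = self.fc(w).view(b, c, 1, 1)   # excitation: per-channel weights in (0, 1)
        return x * w                      # recalibrate the feature maps

def build_classifier(num_classes=7, use_se=False):
    """ResNet-50 with its classifier replaced for 7 affect states; optionally
    append an SE block after layer4 as a simplified SE-ResNet-50 variant."""
    net = models.resnet50(weights=None)   # pre-trained weights could be loaded here
    if use_se:
        net.layer4 = nn.Sequential(net.layer4, SEBlock(2048))
    net.fc = nn.Linear(net.fc.in_features, num_classes)
    return net

def face_segments(img):
    """Split a B x C x H x W face tensor into the four 50% regions."""
    _, _, h, w = img.shape
    return {
        "left":  img[:, :, :, : w // 2],
        "right": img[:, :, :, w // 2 :],
        "upper": img[:, :, : h // 2, :],
        "lower": img[:, :, h // 2 :, :],
    }

# Example: classify the upper half of a batch of face images.
model = build_classifier(use_se=True)
faces = torch.randn(4, 3, 224, 224)
logits = model(face_segments(faces)["upper"])
```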
Keywords: Affect States, Convolutional Neural Networks, ResNet-50, SE-ResNet-50
Scope of the Article: Classification