|Title||Human Emotion Recognition Approach Based on Facial Expression, Ethnicity and Gender Using Backpropagation Artificial Neural Network|
God Almighty created human emotions at the creation of humankind. Human emotions largely reflect a person's psychological state and are expressed through facial expressions, speech, or body movement. Interaction between humans relies on several factors, including knowledge of these emotions. Although people differ in ethnicity, race, and language, the language of emotions is almost universal, which makes it easy to understand. Researchers usually consider six basic emotions: happiness, sadness, fear, disgust, anger, and shame. In this thesis, we study the impact of both race and gender on the accuracy of emotion recognition through facial expressions. We claim that knowing a subject's gender and race increases the accuracy of emotion recognition, owing to differences in facial appearance across races and genders. To test this claim, we developed an approach based on an Artificial Neural Network (ANN) trained with the backpropagation algorithm to recognize human emotion. The proposed model consists of five stages: (1) inputting the image, (2) image preprocessing, (3) identifying facial landmark points, which help define the face features, (4) feature extraction, and (5) emotion recognition. These stages are divided into two sections: the first section comprises the first four stages, and the second section comprises only the fifth stage. We built a program to implement the first section and used Matlab to implement the second. Our model was tested on the MSDEF dataset, and we found a positive effect on emotion recognition accuracy when both ethnic group and gender are used as inputs to the system. Although this effect is not large, it is considerable (the improvement rate reached 8%).
In addition, we found that emotions expressed by females were recognized more accurately than those expressed by males. Moreover, regardless of the dataset used, our approach obtained better results than some previous research on emotion recognition. This could be due to various factors, such as the type of features selected and the consideration of race and gender.
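The fifth stage described above, a backpropagation ANN mapping an extracted feature vector to one of the six basic emotions, can be sketched as follows. This is a minimal illustration only: the feature dimension, hidden-layer width, learning rate, and the synthetic training data are assumptions for demonstration, not the values or data used in the thesis.

```python
# Minimal sketch of a backpropagation ANN for six-class emotion
# recognition. All sizes and the synthetic data below are illustrative
# assumptions, not the thesis's actual configuration.
import numpy as np

rng = np.random.default_rng(0)

N_FEATURES = 20   # assumed length of the extracted facial feature vector
N_HIDDEN = 16     # assumed hidden-layer width
N_CLASSES = 6     # happiness, sadness, fear, disgust, anger, shame

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train(X, y, epochs=300, lr=0.5):
    """Train a one-hidden-layer network with plain gradient descent."""
    n = X.shape[0]
    W1 = rng.normal(0, 0.1, (N_FEATURES, N_HIDDEN))
    b1 = np.zeros(N_HIDDEN)
    W2 = rng.normal(0, 0.1, (N_HIDDEN, N_CLASSES))
    b2 = np.zeros(N_CLASSES)
    Y = np.eye(N_CLASSES)[y]             # one-hot targets
    for _ in range(epochs):
        # Forward pass.
        h = np.tanh(X @ W1 + b1)
        p = softmax(h @ W2 + b2)
        # Backpropagation of the cross-entropy gradient.
        d2 = (p - Y) / n
        dW2, db2 = h.T @ d2, d2.sum(axis=0)
        d1 = (d2 @ W2.T) * (1 - h ** 2)  # tanh derivative
        dW1, db1 = X.T @ d1, d1.sum(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return W1, b1, W2, b2

def predict(params, X):
    W1, b1, W2, b2 = params
    return softmax(np.tanh(X @ W1 + b1) @ W2 + b2).argmax(axis=1)

# Synthetic stand-in for extracted features; per the thesis's claim,
# gender/ethnicity indicators could be appended to this vector.
means = rng.normal(0, 2, (N_CLASSES, N_FEATURES))
y = rng.integers(0, N_CLASSES, size=120)
X = means[y] + rng.normal(0, 0.3, (120, N_FEATURES))

params = train(X, y)
acc = (predict(params, X) == y).mean()
```

In this sketch the ethnicity/gender inputs would simply be extra entries of the feature vector `X`; the network itself needs no structural change to accommodate them.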
|Publisher||The Islamic University|