By transferring copyright, the authors allow International Journal of Psychological Research to distribute the work more broadly, monitor its reuse by others, and handle the procedures required for the registration and administration of copyright; at the same time, the editorial board represents the authors' interests and allows them to reuse their work in various forms. Accordingly, authors transfer copyright to International Journal of Psychological Research. This transfer does not extend to rights other than those of authorship (for example, those concerning patents). Likewise, authors retain the right to use the work, in whole or in part, in lectures, books, and courses, and to make copies for educational purposes. Finally, authors may freely reuse their tables and figures in future work, provided they make explicit reference to the previous publication in International Journal of Psychological Research. The assignment of copyright covers both the virtual rights and the formats of the article, allowing the publisher to disseminate the work in the manner it deems appropriate.
The editorial board reserves the right to make any amendments deemed necessary in applying the rules of publication.
Background. Identification of emotional facial expressions (EFEs) is important in interpersonal communication. Six ‘universal’ EFEs are recognized, although the accuracy with which they are identified varies. EFEs involve anatomical changes in certain regions of the face, especially the eyes and the mouth, but whether other facial areas are equally important for their identification is still debated. This study was conducted to compare the accuracy of identification of the universal EFEs under full-face and partial-face conditions (showing only the eye and mouth regions).

Methods. An analytical cross-sectional study was conducted among 140 young Indian adults. They were divided into two equal groups and shown the six universal EFEs on a computer screen in two sets: one with full-face images and the other with images showing only the eye and mouth regions. Participants were asked to identify each EFE, and their responses were analyzed.

Results. Mean age was 21.3 ± 1.7 years in the full-face group and 21.2 ± 1.6 years in the partial-face group. Most participants were men, from rural areas, and from upper socioeconomic status families, and many were students. EFE identification was significantly more accurate in the partial-face group than in the full-face group (p = .0007). Participants in both groups identified happiness best (100%). For the other EFEs, partial-face images were identified more accurately than full-face images, except for disgust; these differences were statistically significant except for anger and fear.

Conclusions. Among young Indian adults, accuracy of identification of the universal EFEs was high and was significantly enhanced for all expressions except disgust when only the combination of eyes and mouth was shown, suggesting that other facial regions act as distractors in EFE identification.

Key Messages. 1. Identification of the universal EFEs was more accurate from partial faces (combination of eyes and mouth) than from full faces for all emotions except disgust. 2. This suggests that other regions of the face act as potential distractors in the identification of emotions, except for disgust, where these regions provide additional information.
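The abstract reports group-wise comparisons of identification accuracy but does not name the statistical test or give raw counts. A minimal sketch of one plausible analysis, a two-sided two-proportion z-test comparing correct identifications between the partial-face and full-face groups, is shown below; the counts (60/70 vs. 45/70) are hypothetical illustrations, not the study's data.

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two independent proportions.

    x1, x2: number of correct identifications in each group
    n1, n2: group sizes
    Returns (z statistic, two-sided p-value).
    """
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                      # pooled proportion under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))          # two-sided normal tail area
    return z, p_value

# Hypothetical example: 60/70 correct (partial-face) vs. 45/70 (full-face)
z, p = two_proportion_z_test(60, 70, 45, 70)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these illustrative counts the test yields p < .01, i.e., a significant advantage for the partial-face group; a chi-square test on the same 2 × 2 table would give an equivalent result.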