1 Department of Computer Science and Engineering, Suresh Gyan Vihar University, Jaipur, Rajasthan, India.
2 Department of Electrical and Electronics Engineering, Suresh Gyan Vihar University, Jaipur, Rajasthan, India.
3 Department of Computer Science and Engineering, Swami Keshvanand Institute of Technology, Management and Gramothan, Jaipur, Rajasthan, India.
International Journal of Science and Research Archive, 2025, 17(03), 588-596
Article DOI: 10.30574/ijsra.2025.17.3.3258
Received on 08 November 2025; revised on 15 December 2025; accepted on 17 December 2025
Facial expression recognition (FER) constitutes a key domain in affective computing and applied psychology, as it enables the systematic assessment of emotional states through observable facial cues. The present study examined the psychometric and methodological properties of two deep convolutional neural network architectures—AlexNet and DenseNet-201—in the automatic classification of emotional expressions using the FER-2013 dataset. Both models employed transfer learning and data augmentation procedures to enhance generalization and robustness. Comparative analyses were conducted across seven emotion categories (anger, disgust, fear, happiness, sadness, surprise, and neutrality) using standard performance indices—accuracy, precision, recall, and F1 score. AlexNet achieved a validation accuracy of 82.94%, whereas DenseNet-201 yielded 84.91%. DenseNet-201 demonstrated superior discriminative capacity, particularly in the recognition of subtle emotional states such as fear and disgust, which are often more challenging to detect both computationally and psychologically. To support interpretability and construct validity, Class Activation Mapping (CAM) was applied to identify the facial regions most influential in the classification process, offering insight into the visual cues underlying automated emotion assessment. Overall, findings highlight the methodological trade-off between model simplicity and psychometric precision: while AlexNet is suitable for efficient, lightweight applications, DenseNet-201 provides a more accurate and psychologically representative model of facial affect recognition. These results contribute to the integration of advanced computational techniques into psychometric models of emotion measurement and assessment.
Facial Expression Recognition; FER-2013; AlexNet; DenseNet-201; Deep Learning; Transfer Learning; Emotion Classification; Data Augmentation; Class Activation Mapping
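As a brief illustration of the workflow summarized in the abstract, the sketch below shows transfer learning with DenseNet-201 and simple data augmentation on FER-2013-style face images. This is a minimal example under stated assumptions, not the authors' implementation: the paper does not specify a framework, so PyTorch/torchvision is assumed, and the directory layout, augmentation settings, and hyperparameters are illustrative only.

```python
# Minimal sketch (not the authors' code): transfer learning with DenseNet-201
# on FER-2013-style 48x48 grayscale face crops, using PyTorch/torchvision.
# Framework, data layout, augmentation, and hyperparameters are assumptions.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_CLASSES = 7  # anger, disgust, fear, happiness, sadness, surprise, neutrality

# Data augmentation to improve generalization (flip/rotation values are illustrative)
train_tf = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),   # replicate channel to reuse ImageNet weights
    transforms.Resize(224),                        # DenseNet-201 pretrained weights expect larger inputs
    transforms.RandomHorizontalFlip(),
    transforms.RandomRotation(10),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

# Assumes FER-2013 arranged as ImageFolder directories: fer2013/train/<emotion>/*.png
train_ds = datasets.ImageFolder("fer2013/train", transform=train_tf)
train_loader = torch.utils.data.DataLoader(train_ds, batch_size=64, shuffle=True)

# Transfer learning: start from ImageNet weights and replace the classifier head
model = models.densenet201(weights=models.DenseNet201_Weights.IMAGENET1K_V1)
model.classifier = nn.Linear(model.classifier.in_features, NUM_CLASSES)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for images, labels in train_loader:   # one pass shown; actual training runs many epochs
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```

An AlexNet variant would follow the same pattern, swapping in models.alexnet and its final fully connected layer; validation accuracy, precision, recall, and F1 score would then be computed on a held-out split.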
Neha Mathur, Paresh Jain and Pankaj Dadheech. Performance evaluation of optimized deep convolution architectures for facial emotion recognition. International Journal of Science and Research Archive, 2025, 17(03), 588-596. Article DOI: https://doi.org/10.30574/ijsra.2025.17.3.3258.
Copyright © 2025. The author(s) retain the copyright of this article. This article is published under the terms of the Creative Commons Attribution License 4.0.