Staged transfer learning for multi-label half-face emotion recognition
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | SpringerOpen, 2025-05-01 |
| Series: | Journal of Engineering and Applied Science |
| Subjects: | |
| Online Access: | https://doi.org/10.1186/s44147-025-00615-x |
| Summary: | Abstract As fundamental drivers of human behavior, emotions can be expressed through various modalities, including facial expressions. Facial emotion recognition (FER) has emerged as a pivotal area of affective computing, enabling accurate detection of human emotions from visual cues. To enhance efficiency while maintaining accuracy, we propose a novel approach that leverages deep learning and transfer learning techniques to classify emotions based on only half of the human face. We introduce EMOFACE, a comprehensive half-facial imagery dataset annotated with 25 distinct emotion labels, providing a diverse and inclusive resource for multi-label half-facial emotion classification. By combining this dataset with the established FER2013 dataset, we employ a staged transfer learning framework that effectively addresses the challenges of multi-label half-facial emotion classification. Our proposed approach, which utilizes a custom convolutional neural network (ConvNet) and five pre-trained deep learning models (VGG16, VGG19, DenseNet, MobileNet, and ResNet), achieves impressive results. We report an average binary accuracy of 0.9244 for training, 0.9152 for validation, and 0.9138 for testing, demonstrating the efficacy of our method. The potential applications of this research extend to various domains, including affective computing, healthcare, robotics, human–computer interaction, and self-driving cars. By advancing the field of half-facial multi-label emotion recognition, our work contributes to the development of more intuitive and empathetic human–machine interactions. |
|---|---|
| ISSN: | 1110-1903, 2536-9512 |
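The abstract reports "average binary accuracy" for a multi-label task with 25 emotion labels. In multi-label classification this metric is conventionally computed per label decision: each of the 25 sigmoid outputs is thresholded independently, and accuracy is the fraction of all (sample, label) decisions that match the ground truth. The sketch below illustrates that convention; the function name, threshold, and toy data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def binary_accuracy(y_true, y_pred, threshold=0.5):
    """Multi-label binary accuracy: the fraction of per-label
    decisions, over all samples and all labels, that match the
    ground-truth label matrix. (Illustrative sketch, not the
    paper's exact implementation.)"""
    preds = (np.asarray(y_pred) >= threshold).astype(int)
    return float((preds == np.asarray(y_true)).mean())

# Toy example: 2 samples, 4 of the 25 emotion labels shown.
y_true = np.array([[1, 0, 0, 1],
                   [0, 1, 0, 0]])
y_pred = np.array([[0.9, 0.2, 0.1, 0.7],
                   [0.4, 0.8, 0.3, 0.6]])

# 7 of the 8 label decisions agree with y_true -> 0.875
print(binary_accuracy(y_true, y_pred))
```

Note that with 25 mostly-absent labels per face, an all-zeros predictor can already score high on this metric, so a binary accuracy around 0.91 is best read alongside per-label precision/recall.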