Real-time fear emotion recognition in mice based on multimodal data fusion

Bibliographic Details
Main Authors: Hao Wang, Zhanpeng Shi, Ruijie Hu, Xinyi Wang, Jian Chen, Haoyuan Che
Format: Article
Language: English
Published: Nature Portfolio 2025-04-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-025-95483-z
Description
Summary: A multimodal emotion recognition method that utilizes facial expressions, body postures, and movement trajectories to detect fear in mice is proposed in this study. By integrating and analyzing these distinct data sources through feature encoders and attention classifiers, we developed a robust emotion classification model. The performance of the model was evaluated by comparing it with single-modal methods, and the results showed significant accuracy improvements. Our findings indicate that the multimodal fusion emotion recognition model enhanced the precision of emotion detection, achieving a fear recognition accuracy of 86.7%. Additionally, the impacts of different monitoring durations and frame sampling rates on the achieved recognition accuracy were investigated in this study. The proposed method provides an efficient and simple solution for conducting real-time, comprehensive emotion monitoring in animal research, with potential applications in neuroscience and psychiatric studies.
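The abstract describes fusing three modality streams (face, posture, trajectory) through per-modality feature encoders and an attention-based classifier. The paper's actual architecture is not given in this record, so the following is only a minimal NumPy sketch of that general pattern; all function names, dimensions, and weights here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, w):
    # Hypothetical per-modality feature encoder: a single linear
    # projection with tanh, standing in for a learned network.
    return np.tanh(x @ w)

def attention_fuse(feats):
    # feats: (n_modalities, d). Score each modality embedding against a
    # shared query vector, softmax the scores into modality weights,
    # and return the weighted sum of embeddings.
    q = np.ones(feats.shape[1]) / np.sqrt(feats.shape[1])
    scores = feats @ q
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ feats, weights

# Toy inputs standing in for face, posture, and trajectory features.
d_in, d = 16, 8
face, posture, traj = (rng.standard_normal(d_in) for _ in range(3))
w_face, w_pose, w_traj = (rng.standard_normal((d_in, d)) for _ in range(3))

feats = np.stack([encode(face, w_face),
                  encode(posture, w_pose),
                  encode(traj, w_traj)])
fused, weights = attention_fuse(feats)

# Fear probability from a random (untrained) linear head with sigmoid.
w_cls = rng.standard_normal(d)
p_fear = 1.0 / (1.0 + np.exp(-(fused @ w_cls)))
```

In this late-fusion design, the attention weights indicate how much each modality contributes to the final decision, which is one plausible way the multimodal model could outperform any single-modal encoder alone.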
ISSN: 2045-2322