Feasibility study of emotion mimicry analysis in human–machine interaction

Bibliographic Details
Main Authors: Herag Arabian, Tamer Abdulbaki Alshirbaji, Ashish Bhave, Verena Wagner-Hartl, Marcel Igel, J. Geoffrey Chase, Knut Moeller
Format: Article
Language: English
Published: Nature Portfolio, 2025-01-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-025-87688-z
Description
Summary: Health apps have increased in popularity as people increasingly follow the advice these apps provide to enhance physical and mental well-being. One key aspect of improving neurosensory health is identifying and expressing emotions. Emotional intelligence is crucial for maintaining and enhancing social interactions. In this context, a preliminary closed-loop feedback system has been developed to help people project specific emotions by altering their facial expressions. This system is part of a research intervention aimed at therapeutic applications for individuals with autism spectrum disorder. The proposed system functions as a digital mirror, initially displaying an animated avatar's face expressing a predefined emotion. Users are then asked to mimic the avatar's expression. During this process, a custom emotion recognition model analyzes the user's facial expressions and provides feedback on the accuracy of their projection. A small experimental study involving 8 participants tested the system for feasibility, with avatars projecting the six basic emotions and a neutral expression. The results indicated a positive correlation between the projected facial expressions and the emotions identified by participants. Participants recognized the emotions effectively, with 85.40% accuracy, demonstrating the system's potential to enhance individual well-being. Participants were also able to mimic the given expressions, with an accuracy of 46.67%. However, a performance deficiency was noted for one expression, surprise. In post-processing, this issue was addressed and model enhancements were tailored to boost performance by ~30%. This approach shows promise for therapeutic use and emotional skill development. A further, wider experimental study is still required to validate these findings and analyze the impact of the modifications made.
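
As a hypothetical illustration of the closed-loop "digital mirror" cycle described in the summary (the record does not include code; the classifier, capture routine, labels, and function names below are assumptions, not the authors' implementation), the feedback loop could be sketched roughly as follows:

# Hypothetical sketch of the closed-loop mimicry feedback cycle:
# avatar shows a target emotion, the user mimics it, a recognition
# model classifies the user's expression, and accuracy is reported.
# All names and the random stand-in classifier are illustrative only.
import random

EMOTIONS = ["anger", "disgust", "fear", "happiness",
            "sadness", "surprise", "neutral"]  # six basic emotions + neutral

def capture_frame():
    """Placeholder for grabbing a webcam frame of the user's face."""
    return None

def classify_expression(frame):
    """Placeholder for the custom emotion recognition model."""
    return random.choice(EMOTIONS)  # stand-in prediction

def feedback_session(trials_per_emotion=3):
    correct = total = 0
    for target in EMOTIONS:
        # 1. Avatar displays the target expression (animation not modelled here).
        print(f"Avatar shows: {target} -- please mimic it")
        for _ in range(trials_per_emotion):
            frame = capture_frame()                  # 2. Record the user's attempt.
            predicted = classify_expression(frame)   # 3. Recognise the expression.
            match = predicted == target
            correct += int(match)
            total += 1
            # 4. Feed the result back to the user.
            print(f"  detected: {predicted} ({'match' if match else 'no match'})")
    print(f"Mimicry accuracy: {100 * correct / total:.2f}%")

if __name__ == "__main__":
    feedback_session()

In such a sketch, per-emotion accuracy could also be tracked separately, which is where an underperforming class such as "surprise" would become visible.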
ISSN: 2045-2322