The Development of Human-Robot Interaction Design for Optimal Emotional Expression in Social Robots Used by Older People: Design of Robot Facial Expressions and Gestures
Main Authors:
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/10855395/
Summary: Showing facial expressions and using emotion-appropriate gestures are essential for social robots. As a robot’s behavior becomes more anthropomorphic, the intimacy and naturalness of human-robot interactions improve. This study aims to derive optimized facial expression and gesture designs for social robots that interact with elderly individuals, thereby enhancing emotional interaction. First, we used integrated user-robot scenarios to identify the emotional states required for robot interactions. We then conducted surveys and user preference evaluations on commercially available robot faces. The results indicated that the eyes, eyebrows, mouth, and cheeks are suitable components for robot faces, and that geometric shapes are the most appropriate form for them. Accordingly, we collected and analyzed human facial expression images with the Facial Action Coding System to identify action unit combinations and facial landmarks; this analysis informed the design of robot faces capable of expressing humanlike emotions. We also collected and evaluated human gesture videos representing various emotions to select the most suitable gestures, analyzed the selected gestures with motion capture, and used these data to design robot gestures. The resulting facial expressions and gestures were validated and refined through emotion-based user preference evaluations. As a result, we developed facial expression and gesture designs for six emotions (Loving, Joyful, Upbeat, Hopeful, Concerned, Grateful) for social robots interacting with elderly individuals. The results provide guidelines for designing human-friendly robot facial expressions and gestures, enabling social robots to form deep emotional bonds with users. By analyzing human facial expressions and gestures in relation to emotions and applying these findings to robots, we developed natural and emotionally expressive robot behaviors. These findings contribute to the advancement of robots as reliable and comforting companions for humans.
ISSN: 2169-3536
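
The abstract describes extracting Facial Action Coding System action-unit combinations and facial landmarks from human expression images and transferring them to robot face designs. As a loose illustration of that kind of landmark-to-parameter mapping, the sketch below converts 2D facial landmarks into a few AU-inspired, normalized values that a robot face renderer could consume. It is a minimal sketch under stated assumptions: the landmark names, calibration constants, and `RobotFaceParams` mapping are hypothetical and are not the authors’ actual pipeline.

```python
# Hypothetical sketch: mapping 2D facial landmarks (from any face-landmark
# detector, image coordinates with y increasing downward) to simple,
# AU-inspired parameters for a robot face. Names and constants are illustrative.

from dataclasses import dataclass
from math import hypot
from typing import Dict, Tuple

Landmarks = Dict[str, Tuple[float, float]]  # landmark name -> (x, y) in pixels


@dataclass
class RobotFaceParams:
    brow_raise: float    # 0..1, inspired by AU1/AU2 (inner/outer brow raiser)
    eye_openness: float  # 0..1, inspired by AU5/AU43 (lid raiser / eyes closed)
    smile: float         # 0..1, inspired by AU12 (lip corner puller)
    mouth_open: float    # 0..1, inspired by AU25/AU26 (lips part / jaw drop)


def _dist(a: Tuple[float, float], b: Tuple[float, float]) -> float:
    return hypot(a[0] - b[0], a[1] - b[1])


def _clamp01(v: float) -> float:
    return max(0.0, min(1.0, v))


def face_params_from_landmarks(lm: Landmarks) -> RobotFaceParams:
    """Convert raw landmark geometry into normalized robot-face parameters."""
    # Normalize every measurement by the inter-ocular distance so the features
    # are invariant to image scale and subject distance.
    iod = _dist(lm["left_eye_center"], lm["right_eye_center"])

    brow_raise = (_dist(lm["left_brow"], lm["left_eye_center"]) +
                  _dist(lm["right_brow"], lm["right_eye_center"])) / (2 * iod)
    eye_openness = (_dist(lm["left_eye_top"], lm["left_eye_bottom"]) +
                    _dist(lm["right_eye_top"], lm["right_eye_bottom"])) / (2 * iod)
    # Positive when the mouth corners sit above the mouth center (a smile),
    # since y grows downward in image coordinates.
    corner_lift = ((lm["mouth_center"][1] - lm["mouth_left"][1]) +
                   (lm["mouth_center"][1] - lm["mouth_right"][1])) / (2 * iod)
    mouth_open = _dist(lm["mouth_top"], lm["mouth_bottom"]) / iod

    return RobotFaceParams(
        brow_raise=_clamp01((brow_raise - 0.55) / 0.25),  # illustrative calibration
        eye_openness=_clamp01(eye_openness / 0.35),
        smile=_clamp01(corner_lift / 0.15),
        mouth_open=_clamp01(mouth_open / 0.6),
    )
```

In a real pipeline the landmark coordinates would come from an off-the-shelf face-landmark detector, and the calibration constants would be tuned against the emotion-labeled images and the preference-evaluation results for the six target emotions.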