An android can show the facial expressions of complex emotions

Abstract Trust and rapport are essential for human–robot interaction, and producing emotional expressions on a robot’s face is an effective way to foster them. Androids can show human-like facial expressions of basic emotions; however, whether they can show facial expressions of complex emotions remains unknown. In this experiment, we investigated the android Nikola’s ability to produce 22 dynamic facial expressions of complex emotions. For each video, 240 international participants (120 Japanese, 120 German) rated the emotions expressed by Nikola. For 13 complex emotions (i.e., amusement, appal, awe, boredom, contentment, coyness, hatred, hesitation, moral disgust, not face, pain, sleepiness, suspicion), participants in both samples rated the target emotion above the mean of the non-target emotions. Four emotions (bitterness, confusion, pride, relief) were rated above the non-target mean by only one sample. For twelve of these emotions, the target emotion was among the highest ranked. The results suggest that androids can produce facial expressions of a wide range of complex emotions, which can facilitate human–robot interaction.

Bibliographic Details
Main Authors: Alexander Diel, Wataru Sato, Chun-Ting Hsu, Alexander Bäuerle, Martin Teufel, Takashi Minato
Format: Article
Language: English
Published: Nature Portfolio 2025-01-01
Series: Scientific Reports
Subjects: Affective computing; Android; Dynamic face emotion expressions; Secondary emotions
Online Access: https://doi.org/10.1038/s41598-024-84224-3
author Alexander Diel
Wataru Sato
Chun-Ting Hsu
Alexander Bäuerle
Martin Teufel
Takashi Minato
author_sort Alexander Diel
collection DOAJ
description Abstract Trust and rapport are essential for human–robot interaction, and producing emotional expressions on a robot’s face is an effective way to foster them. Androids can show human-like facial expressions of basic emotions; however, whether they can show facial expressions of complex emotions remains unknown. In this experiment, we investigated the android Nikola’s ability to produce 22 dynamic facial expressions of complex emotions. For each video, 240 international participants (120 Japanese, 120 German) rated the emotions expressed by Nikola. For 13 complex emotions (i.e., amusement, appal, awe, boredom, contentment, coyness, hatred, hesitation, moral disgust, not face, pain, sleepiness, suspicion), participants in both samples rated the target emotion above the mean of the non-target emotions. Four emotions (bitterness, confusion, pride, relief) were rated above the non-target mean by only one sample. For twelve of these emotions, the target emotion was among the highest ranked. The results suggest that androids can produce facial expressions of a wide range of complex emotions, which can facilitate human–robot interaction.
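As a rough illustration of the recognition criterion described in the abstract (a target emotion counts as conveyed when its mean rating exceeds the mean rating of the non-target emotions), the short Python sketch below computes that comparison for a single hypothetical video. The emotion labels, rating values, and variable names are invented for illustration and are not taken from the study's data.

import numpy as np

# Hypothetical ratings for one Nikola video: rows are participants,
# columns are the candidate emotion labels offered on the rating scale.
emotions = ["amusement", "boredom", "pain", "suspicion"]
ratings = np.array([
    [6, 2, 1, 3],   # participant 1
    [5, 1, 2, 2],   # participant 2
    [7, 3, 1, 4],   # participant 3
])

target = "amusement"          # the emotion the video was designed to express
t = emotions.index(target)

target_mean = ratings[:, t].mean()                     # mean rating of the target emotion
nontarget_mean = np.delete(ratings, t, axis=1).mean()  # mean rating across all non-target emotions

# Criterion from the abstract: the expression is considered recognized
# when the target emotion is rated above the mean of the non-target emotions.
recognized = target_mean > nontarget_mean
print(f"target = {target_mean:.2f}, non-target = {nontarget_mean:.2f}, recognized = {recognized}")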
format Article
id doaj-art-be6f3f8ccfbe4f2a91de415da834ce09
institution Kabale University
issn 2045-2322
language English
publishDate 2025-01-01
publisher Nature Portfolio
record_format Article
series Scientific Reports
spelling doaj-art-be6f3f8ccfbe4f2a91de415da834ce09 (indexed 2025-01-19T12:18:37Z, eng). Nature Portfolio, Scientific Reports, ISSN 2045-2322, 2025-01-01, 15(1): 1–11, https://doi.org/10.1038/s41598-024-84224-3. An android can show the facial expressions of complex emotions. Alexander Diel (Clinic for Psychosomatic Medicine and Psychotherapy, LVR-University Hospital Essen, University of Duisburg-Essen); Wataru Sato (Guardian Robot Project, RIKEN); Chun-Ting Hsu (Guardian Robot Project, RIKEN); Alexander Bäuerle (Clinic for Psychosomatic Medicine and Psychotherapy, LVR-University Hospital Essen, University of Duisburg-Essen); Martin Teufel (Clinic for Psychosomatic Medicine and Psychotherapy, LVR-University Hospital Essen, University of Duisburg-Essen); Takashi Minato (Guardian Robot Project, RIKEN). Keywords: Affective computing; Android; Dynamic face emotion expressions; Secondary emotions.
title An android can show the facial expressions of complex emotions
title_sort android can show the facial expressions of complex emotions
topic Affective computing
Android
Dynamic face emotion expressions
Secondary emotions
url https://doi.org/10.1038/s41598-024-84224-3