Face and Voice Recognition-Based Emotion Analysis System (EAS) to Minimize Heterogeneity in the Metaverse
The metaverse, where users interact through avatars, is evolving to closely mirror the real world, requiring realistic object responses based on users’ emotions. While technologies like eye-tracking and hand-tracking transfer physical movements into virtual spaces, accurate emotion detection remains...
| Main Authors: | Surak Son, Yina Jeong |
| --- | --- |
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-01-01 |
| Series: | Applied Sciences |
| Online Access: | https://www.mdpi.com/2076-3417/15/2/845 |
Similar Items
- Gender differences in the recognition of emotional faces: are men less efficient?
  by: Ana Ruiz-Ibáñez, et al.
  Published: (2017-06-01)
- Eliciting Emotions: Investigating the Use of Generative AI and Facial Muscle Activation in Children’s Emotional Recognition
  by: Manuel A. Solis-Arrazola, et al.
  Published: (2025-01-01)
- Benchmarking human face similarity using identical twins
  by: Shoaib Meraj Sami, et al.
  Published: (2022-09-01)
- The effect of mild-stage Alzheimer’s disease on the acoustic parameters of voice
  by: Emel Arslan-Sarımehmetoğlu, et al.
  Published: (2025-02-01)
- Robust CNN for facial emotion recognition and real-time GUI
  by: Imad Ali, et al.
  Published: (2024-05-01)