Affect Detection from Text-Based Virtual Improvisation and Emotional Gesture Recognition
We previously developed an intelligent agent that engages users in virtual drama improvisation. The intelligent agent performed sentence-level affect detection from user inputs containing strong emotional indicators. However, we noticed that many inputs with weak or no affect indicators a...
Saved in:
Main Authors: | Li Zhang, Bryan Yap |
---|---|
Format: | Article |
Language: | English |
Published: | Wiley, 2012-01-01 |
Series: | Advances in Human-Computer Interaction |
Online Access: | http://dx.doi.org/10.1155/2012/461247 |
Similar Items
- An Improved Gesture Segmentation Method for Gesture Recognition Based on CNN and YCbCr
  by: Yan Luo, et al.
  Published: (2021-01-01)
- Human Motion Gesture Recognition Based on Computer Vision
  by: Rui Ma, et al.
  Published: (2021-01-01)
- AirStrum: A virtual guitar using real-time hand gesture recognition and strumming technique
  by: Beulah ARUL, et al.
  Published: (2024-12-01)
- EmoHeart: Conveying Emotions in Second Life Based on Affect Sensing from Text
  by: Alena Neviarouskaya, et al.
  Published: (2010-01-01)
- Static Hand Gesture Recognition Based on Convolutional Neural Networks
  by: Raimundo F. Pinto, et al.
  Published: (2019-01-01)