Affect Detection from Text-Based Virtual Improvisation and Emotional Gesture Recognition

We have previously developed an intelligent agent that engages with users in virtual drama improvisation. The agent was able to perform sentence-level affect detection from user inputs with strong emotional indicators. However, we noticed that many inputs with weak or no affect indicators also carry emotional implications, yet were regarded as neutral expressions by the previous interpretation. In this paper, we employ latent semantic analysis to perform topic theme detection and to identify target audiences for such inputs. We also discuss how this semantic interpretation of the dialog context is used to interpret affect more appropriately during virtual improvisation. Moreover, to build a reliable affect analyser, it is important to detect and combine weak affect indicators from other channels such as body language. Emotional body language detection also provides a nonintrusive channel for gauging users' experience without interfering with the primary task. We therefore also make an initial exploration of affect detection from several universally accepted emotional gestures.
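
The abstract mentions latent semantic analysis (LSA) for topic theme detection on inputs that lack explicit affect indicators. As a rough sketch only (not the authors' implementation), the Python snippet below shows one way such LSA-based theme detection could look, assuming scikit-learn; the theme seed texts and the sample user input are hypothetical, invented purely for illustration.

# Illustrative sketch only (assumed tooling: scikit-learn); not the authors' implementation.
# Theme descriptions and the sample input are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical topic themes, each described by a short bag of seed words.
themes = {
    "bullying": "teasing insulting threatening picking on someone at school",
    "illness": "feeling unwell disease hospital treatment worried about health",
    "family": "parents siblings arguments and support at home",
}

# A user turn with no explicit affect keyword whose theme we want to infer.
user_input = "he keeps picking on me at school every day"

corpus = list(themes.values()) + [user_input]

# Build a term-document matrix and project it into a low-rank latent space.
tfidf = TfidfVectorizer(stop_words="english")
term_doc = tfidf.fit_transform(corpus)
lsa = TruncatedSVD(n_components=2, random_state=0)  # tiny rank for a toy corpus
latent = lsa.fit_transform(term_doc)

# Rank themes by cosine similarity to the user input in the latent space.
similarities = cosine_similarity(latent[-1:], latent[:-1])[0]
best_theme, best_score = max(zip(themes, similarities), key=lambda pair: pair[1])
print(f"Most likely topic theme: {best_theme} (similarity {best_score:.2f})")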

Bibliographic Details
Main Authors: Li Zhang (School of Computing, Engineering & Information Sciences, Northumbria University, Newcastle NE1 8ST, UK), Bryan Yap (Department of Mechanical Engineering, University of Bristol, Bristol BS8 1TR, UK)
Format: Article
Language: English
Published: Wiley 2012-01-01
Series: Advances in Human-Computer Interaction
ISSN: 1687-5893, 1687-5907
Online Access: http://dx.doi.org/10.1155/2012/461247