EmoHeart: Conveying Emotions in Second Life Based on Affect Sensing from Text
The 3D virtual world of “Second Life” imitates a form of real life by providing a space for rich interactions and social events. Second Life encourages people to establish or strengthen interpersonal relations, to share ideas, to gain new experiences, and to feel genuine emotions accompanying all adventures of virtual reality. Undoubtedly, emotions play a powerful role in communication. However, to trigger the visual display of a user's affective state in a virtual world, the user has to manually assign an appropriate facial expression or gesture to his or her avatar. Affect sensing from text, which enables automatic expression of emotions in the virtual environment, is a method to avoid manual control by the user and to enrich remote communication effortlessly. In this paper, we describe a lexical rule-based approach to the recognition of emotions from text and an application of the developed Affect Analysis Model in Second Life. Based on the result of the Affect Analysis Model, the developed EmoHeart (an “object” in Second Life) triggers animations of avatar facial expressions and visualizes emotion through heart-shaped textures.
Saved in:
Main Authors: | Alena Neviarouskaya, Helmut Prendinger, Mitsuru Ishizuka |
Format: | Article |
Language: | English |
Published: | Wiley, 2010-01-01 |
Series: | Advances in Human-Computer Interaction |
Online Access: | http://dx.doi.org/10.1155/2010/209801 |
author | Alena Neviarouskaya Helmut Prendinger Mitsuru Ishizuka |
collection | DOAJ |
description | The 3D virtual world of “Second Life” imitates a form of real life by providing a space for rich interactions and social events. Second Life encourages people to establish or strengthen interpersonal relations, to share ideas, to gain new experiences, and to feel genuine emotions accompanying all adventures of virtual reality. Undoubtedly, emotions play a powerful role in communication. However, to trigger the visual display of a user's affective state in a virtual world, the user has to manually assign an appropriate facial expression or gesture to his or her avatar. Affect sensing from text, which enables automatic expression of emotions in the virtual environment, is a method to avoid manual control by the user and to enrich remote communication effortlessly. In this paper, we describe a lexical rule-based approach to the recognition of emotions from text and an application of the developed Affect Analysis Model in Second Life. Based on the result of the Affect Analysis Model, the developed EmoHeart (an “object” in Second Life) triggers animations of avatar facial expressions and visualizes emotion through heart-shaped textures. |
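The abstract describes a lexical rule-based approach to recognizing emotions in chat text. As a rough illustration of that general idea only (this is not the authors' Affect Analysis Model; the lexicon, negation rule, and labels below are illustrative assumptions), a minimal keyword-and-rule classifier might look like this:

```python
# Minimal sketch of a lexical rule-based affect classifier.
# NOTE: this is NOT the Affect Analysis Model from the article;
# the lexicon and the single negation rule are illustrative only.

EMOTION_LEXICON = {
    "joy": {"happy", "glad", "love", "wonderful", "great"},
    "sadness": {"sad", "unhappy", "lonely", "miss", "cry"},
    "anger": {"angry", "hate", "furious", "annoying"},
}

NEGATIONS = {"not", "no", "never", "n't"}

def classify(text: str) -> str:
    """Return the dominant emotion label for a message, or 'neutral'."""
    # Split contracted negations ("don't" -> "do n't") before tokenizing.
    tokens = text.lower().replace("n't", " n't").split()
    scores = {label: 0 for label in EMOTION_LEXICON}
    for i, tok in enumerate(tokens):
        word = tok.strip(".,!?\"'")
        for label, words in EMOTION_LEXICON.items():
            if word in words:
                # Rule: an immediately preceding negation cancels the hit.
                if i > 0 and tokens[i - 1] in NEGATIONS:
                    continue
                scores[label] += 1
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "neutral"

print(classify("I am so happy to see you!"))   # joy
print(classify("I am not happy about this"))   # neutral
```

In a setup like the one the article describes, such a label could then be mapped to an avatar facial-expression animation and an on-screen texture; the real system additionally handles syntactic analysis and intensity estimation, which this sketch omits.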
format | Article |
id | doaj-art-2699820267b04bb39eeb2a78af255c6e |
institution | Kabale University |
issn | 1687-5893 1687-5907 |
language | English |
publishDate | 2010-01-01 |
publisher | Wiley |
record_format | Article |
series | Advances in Human-Computer Interaction |
spelling | Wiley, Advances in Human-Computer Interaction, ISSN 1687-5893 / 1687-5907, 2010-01-01, doi:10.1155/2010/209801. EmoHeart: Conveying Emotions in Second Life Based on Affect Sensing from Text. Alena Neviarouskaya (Department of Information and Communication Engineering, University of Tokyo, R. 111C1/111D2, Engineering Building 2, 11th floor, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan); Helmut Prendinger (Digital Content and Media Sciences Research Division, National Institute of Informatics, 1613-1B, 16th floor, 2-1-2 Hitotsubashi, Chiyoda-ku, Tokyo 101-8430, Japan); Mitsuru Ishizuka (Department of Information and Communication Engineering, University of Tokyo, R. 111C1/111D2, Engineering Building 2, 11th floor, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan). http://dx.doi.org/10.1155/2010/209801 |
title | EmoHeart: Conveying Emotions in Second Life Based on Affect Sensing from Text |
url | http://dx.doi.org/10.1155/2010/209801 |