Creating realistic laughter through facial expressions for affective embodied conversational agents

Date of Publication

2013

Document Type

Master's Thesis

Degree Name

Master of Science in Computer Science

College

College of Computer Studies

Department/Unit

Computer Science

Abstract/Summary

Most affective Embodied Conversational Agents (ECAs) today employ several modalities of interaction. They can use verbal modalities such as text and voice to communicate with the user, and non-verbal modalities such as facial expressions and gestures to express themselves. However, these systems lack an important element of communication: laughter. Laughter is an important communicative signal because it serves as feedback in social discourse and as a means of expressing emotion. Research shows that the non-verbal aspect of laughter is composed mainly of facial expressions. However, laughter facial expressions in ECAs still lack important details such as eye movement, cheek movement, and wrinkles. Furthermore, laughter is linked to several emotional states such as joy, amusement, and relief, yet no past research has attempted to model synthetic emotional laughter. This research presents an affective laughter facial expression synthesis system for ECAs. The purpose of this system is to allow an ECA to perform realistic and emotionally appropriate laughter during a conversation.

Abstract Format

html

Language

English

Format

Print

Accession Number

TG05324

Shelf Location

Archives, The Learning Commons, 12F Henry Sy Sr. Hall

Physical Description

xii, 115 leaves ; 28 cm. + 1 computer optical disc.
