Social Agent: Facial Expression Driver for an e-Nose

by Widmark, Jörgen

Abstract (Summary)
This thesis demonstrates that the synthetic emotions of an interface agent can be driven by an electronic nose system developed at AASS. The e-Nose can be used for quality control: the detected distortion from a known smell-sensation prototype is interpreted as a point in a 3D representation of emotional states, which in turn maps to a set of pre-defined muscle contractions. This extension of a rule-based motivation system, which we call the Facial Expression Driver, is incorporated into a model for sensor fusion with active perception, providing a general design for a more complex system with additional senses. To remain consistent with the biologically inspired sensor fusion model, a muscle-based animated facial model was chosen as a test bed for expressing the current emotion. The social agent's facial expressions convey its tolerance of the detected distortion, prompting the user to restore the system to functional balance. Only a few known projects use chemically based sensing to drive a face in real time, whether as virtual characters or animatronics. This work may inspire a future android implementation of a head with electroactive polymers as synthetic facial muscles.
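The pipeline the abstract describes (odor distortion → 3D emotional state → pre-defined muscle contractions) can be sketched as follows. This is a minimal illustrative sketch, not the thesis's actual implementation: the state axes (valence, arousal, stance), the muscle names, and the mapping functions are all assumptions introduced here for clarity.

```python
# Hypothetical sketch of the Facial Expression Driver pipeline described
# in the abstract. All names and mappings below are illustrative
# assumptions, not the thesis's actual code.

# Assumed 3D emotional state: (valence, arousal, stance), each in [-1, 1].
def distortion_to_emotion(distortion: float):
    """Map normalized distortion from the known smell prototype to a
    3D emotional state: larger distortion -> more negative valence,
    higher arousal, more closed stance."""
    d = max(0.0, min(1.0, distortion))   # clamp to [0, 1]
    valence = 1.0 - 2.0 * d              # 1 (pleased) .. -1 (displeased)
    arousal = d                          # 0 (calm)    ..  1 (agitated)
    stance = 1.0 - d                     # open .. closed
    return (valence, arousal, stance)

# Assumed pre-defined muscle-contraction sets, keyed by emotion region.
MUSCLE_SETS = {
    "content": {"zygomatic_major": 0.6, "orbicularis_oculi": 0.3},
    "disgust": {"levator_labii": 0.8, "corrugator": 0.5},
}

def emotion_to_muscles(valence, arousal, stance):
    """Pick a muscle set from the emotional state and scale contraction
    strength by arousal (illustrative rule only)."""
    key = "content" if valence >= 0 else "disgust"
    return {m: round(c * (0.5 + 0.5 * arousal), 2)
            for m, c in MUSCLE_SETS[key].items()}

# Example: a strongly distorted smell drives a "disgust" expression.
contractions = emotion_to_muscles(*distortion_to_emotion(0.9))
```

A real system would replace the linear mappings with the rule-based motivation system the thesis describes, but the overall data flow (sensor distortion in, muscle contractions out) is the same.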
Bibliographical Information:


School: Linköpings universitet

School Location: Sweden

Source Type: Master's Thesis

Keywords: social intelligent agents; user interface; electronic nose; olfactory sensation; active perception; synthetic emotions; affective computing; facial expression driver

Date of Publication: 01/01/2003
