Speaker: Zhengkun Shang, Master's Candidate
Emotions are an essential part of human social interaction. By integrating an automatic affect recognizer into an artificial system, the system can detect a user's emotions and respond in a personalized way. We aim to build a prompting system that uses a virtual human with emotional interaction capabilities to help persons with cognitive disabilities complete daily activities independently. In this thesis, we work on automatic affect recognition, comparing three different types of feature descriptors with support vector regression (SVR) and bidirectional long short-term memory (BLSTM) networks to predict users' emotions in a three-dimensional space. We demonstrate the feasibility of building artificial systems that track users' emotions in real time through simulations with BayesACT, a probabilistic and decision-theoretic generalization of Affect Control Theory that learns users' fundamental sentiments during interactions. We seek to understand how, and to what extent, virtual humans with distinct emotional characteristics affect a user's emotions. Finally, we integrate the affect recognition module into an iterated prisoner's dilemma game in which a user plays against a virtual human. We have a number of participants play the game and test whether different facial expressions change the virtual human's strategies during the game.
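As a rough illustration of the kind of dimensional affect prediction described above (not the thesis code), the sketch below fits one SVR per affective dimension using scikit-learn. The feature dimensionality, label ranges, and synthetic data are assumptions for demonstration only.

```python
# Minimal sketch of dimensional affect prediction with SVR.
# Assumptions (not from the thesis): features are pre-extracted
# per-frame descriptors; labels are continuous valence/arousal/
# dominance annotations in [-1, 1].
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 64))         # 500 frames, 64-dim descriptor (hypothetical)
y = rng.uniform(-1, 1, size=(500, 3))  # valence, arousal, dominance labels

# One SVR per affective dimension, each preceded by feature standardization.
models = {
    dim: make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1.0)).fit(X, y[:, i])
    for i, dim in enumerate(["valence", "arousal", "dominance"])
}

# Predict the three affective dimensions for a single new frame.
frame = X[:1]
prediction = {dim: float(m.predict(frame)[0]) for dim, m in models.items()}
print(prediction)
```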