Reading facial expressions from real and virtual (AI) humans
This project aims to advance understanding of human emotional communication and improve human rapport with virtual humans and avatars.
About
Virtual humans—like the one shown alongside this text—are rapidly infiltrating our online social worlds.
This research program asks questions such as: What social roles can virtual beings successfully fulfil? How do human responses to virtual faces differ from responses to human faces? What physical information in faces causes expressions to be perceived as genuinely emotional? Is this information the same for virtual faces as for human faces?
The program also includes a specific focus on AI faces, asking questions such as: How do AI faces differ from human faces? Can people detect AI faces, and do they have insight into this ability? How do training biases in generative algorithms affect AI faces? How does psychological theory help us understand AI faces and how they are perceived?
Members
Principal investigator
Other members
- Prof Romina Palermo
- Dr Clare Sutherland
- A/Prof Jason Bell
- A/Prof Eva Krumhuber