Reading facial expressions from real and virtual (AI) humans

This project aims to advance understanding of human emotional communication and improve human rapport with virtual humans and avatars.

Student intake
This project is open for Honours, Masters, MPhil and PhD students.

Project status
Current
About

Image: a face generated by AI

Virtual humans—like the one shown alongside this text—are rapidly infiltrating our online social worlds. 

This research program is interested in questions like: What social roles can virtual beings successfully fulfil? How do human responses differ for virtual compared to human faces? What physical information in faces causes expressions to be perceived as genuinely emotional? Is this information the same for virtual as for human faces? 

The program also includes a specific focus on AI faces, asking questions like: How do AI faces differ from human faces? Can people detect AI faces, and do they have insight into this ability? How do training biases in algorithms affect AI faces? How does psychological theory help us understand AI faces and AI face perception?

Members

Principal investigator

Amy Dawel

Associate Professor in Psychology