Effects of damping head movement and facial expression in dyadic conversation using real-time facial expression tracking and synthesized avatars

Steven M. Boker, Jeffrey F. Cohn, Barry John Theobald, Iain Matthews, Timothy R. Brick, Jeffrey R. Spies

Research output: Contribution to journal › Article

47 Scopus citations


When people speak with one another, they tend to adapt their head movements and facial expressions in response to each other's head movements and facial expressions. We present an experiment in which confederates' head movements and facial expressions were motion tracked during videoconference conversations, an avatar face was reconstructed in real time, and naive participants spoke with the avatar face. No naive participant guessed that the computer-generated face was not video. Confederates' facial expressions, vocal inflections, and head movements were attenuated at 1 min intervals in a fully crossed experimental design. Attenuated head movements led to increased head nods and lateral head turns, and attenuated facial expressions led to increased head nodding in both naive participants and confederates. Together, these results are consistent with the hypothesis that the dynamics of head movements in dyadic conversation include a shared equilibrium. Although both conversational partners were blind to the manipulation, when the apparent head movement of one conversant was attenuated, both partners responded by increasing the velocity of their head movements.

Original language: English (US)
Pages (from-to): 3485-3495
Number of pages: 11
Journal: Philosophical Transactions of the Royal Society B: Biological Sciences
Issue number: 1535
State: Published - Dec 12 2009


All Science Journal Classification (ASJC) codes

  • Biochemistry, Genetics and Molecular Biology (all)
  • Agricultural and Biological Sciences (all)
