Katharina Dobs
Alumni of the Group Cognitive Engineering
Alumni of the Group Recognition and Categorization
Main Focus
Supervisors: Isabelle Bülthoff and Johannes Schultz
I am interested in the visual perception of dynamic faces and its role in identity processing. The faces we encounter every day are typically in motion, so facial motion is assumed to contribute at least to some extent to the processing of identity. However, many open questions remain before we can bridge the gap between the well-studied processing of static faces and the less well-understood processing of dynamic faces.
During my PhD, I investigated the role of facial motion and its interaction with facial form in human face processing using psychophysics and fMRI.
Investigating identity information in facial motion
Introduction
The faces we encounter every day typically move. Previous studies have shown that facial motion - in addition to facial form - can carry information about the identity of a person [1,2], yet the exact role of facial motion as a cue for identity is still unclear [3].
Goals
The overall goal is to understand when and how facial motion contributes to person recognition. We hypothesize that human sensitivity to identity information in facial motion varies with the type of facial movement (e.g., basic emotional or conversational expressions). The results should further advance our understanding of how we perceive faces in real life.
Methods
We assessed human observers' sensitivity to identity information in different types of facial movements. To separate form from motion cues, we used a recent facial motion capture and animation system [4,5] and animated a single avatar head with facial movements recorded from four different actors. The facial movements occurred in three social contexts: (1) emotional (e.g., anger), (2) emotional in a social interaction (e.g., being angry at someone) and (3) social interaction (e.g., saying goodbye to someone). Using a delayed matching-to-sample task (see Fig. 1), we tested in which contexts human observers can discriminate unfamiliar persons based on their facial motion alone.
Fig. 1: The trial procedure of the experiment, shown here for the emotional context. Observers first watched an animation of a facial expression (Sample; e.g., angry), followed by two animations displaying a different facial movement (Matching stimuli; e.g., happy). Observers were asked to choose which of the matching stimuli was performed by the same actor as the sample.
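To make the trial logic concrete, the following is a minimal sketch in Python of how such delayed matching-to-sample trials could be assembled. All names, movement labels and trial counts are hypothetical illustrations, not the actual experiment code:

import random

ACTORS = ["actor_1", "actor_2", "actor_3", "actor_4"]   # four recorded actors
MOVEMENTS = {                                            # example movements per context
    "emotional": ["angry", "happy"],
    "emotional_social": ["angry_at_someone", "happy_for_someone"],
    "social": ["saying_goodbye", "greeting"],
}

def make_trial(context):
    """Build one delayed matching-to-sample trial for a given context.
    All stimuli drive the same avatar head; only the captured facial motion differs."""
    sample_movement, match_movement = random.sample(MOVEMENTS[context], 2)
    sample_actor = random.choice(ACTORS)
    foil_actor = random.choice([a for a in ACTORS if a != sample_actor])
    # Sample: one movement performed by the sample actor.
    sample = (sample_actor, sample_movement)
    # Matching stimuli: a *different* movement, once by the same actor, once by a foil.
    matching = [(sample_actor, match_movement), (foil_actor, match_movement)]
    random.shuffle(matching)                              # randomize presentation order
    correct_index = [actor for actor, _ in matching].index(sample_actor)
    return {"sample": sample, "matching": matching, "correct": correct_index}

print(make_trial("emotional"))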
Results
Observers were able to discriminate identities based on emotional facial movements occurring in a social interaction (Fig. 2, middle), but not on basic emotional facial expressions (Fig. 2, left). Sensitivity was highest for non-emotional, speech-related movements occurring in a social interaction (Fig. 2, right).
Fig. 2: Behavioral results. Mean sensitivity (d′) across observers (n = 14) as a function of context. A sensitivity of 0 indicates chance level. Error bars indicate 95% confidence intervals (CI).
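The sensitivity measure d′ can be estimated from the proportion of correct responses in a two-alternative matching task; with this convention, d′ = 0 corresponds exactly to chance performance. Below is a minimal sketch under that assumption (the data values and trial count are invented for illustration; the actual analysis pipeline is not described here):

import numpy as np
from scipy.stats import norm

def dprime_2afc(p_correct, n_trials):
    """Estimate d' from proportion correct in a 2AFC task.
    Proportions are clipped to avoid infinite z-scores at 0 or 1."""
    p = np.clip(p_correct, 0.5 / n_trials, 1 - 0.5 / n_trials)
    return np.sqrt(2) * norm.ppf(p)   # d' = sqrt(2) * z(proportion correct)

# Hypothetical per-observer proportions correct in one context (n = 14 observers).
p_correct = np.array([0.55, 0.62, 0.58, 0.66, 0.60, 0.71, 0.54,
                      0.63, 0.59, 0.68, 0.57, 0.65, 0.61, 0.64])
d = dprime_2afc(p_correct, n_trials=48)   # 48 trials per context is an assumption

mean_d = d.mean()
ci95 = 1.96 * d.std(ddof=1) / np.sqrt(len(d))   # normal-approximation 95% CI
print(f"mean d' = {mean_d:.2f} +/- {ci95:.2f}")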
Conclusion
Our findings reveal that human observers can recognize unfamiliar persons from conversational and speech-related movements but not from the way they perform basic emotional facial expressions. We hypothesize that these differences are due to how these movements are executed: basic emotions are performed quite stereotypically, whereas conversational and speech-related movements are performed more idiosyncratically.
References
[1] Hill H and Johnston A (2001). Categorizing sex and identity from the biological motion of faces. Current Biology 11: 880-885.
[2] Knappmeyer B, Thornton IM and Bülthoff HH (2003). The use of facial motion and facial form during the processing of identity. Vision Research 43: 1921-1936.
[3] O'Toole AJ, Roark DA and Abdi H (2002). Recognizing moving faces: A psychological and neural synthesis. Trends in Cognitive Sciences 6: 261-266.
[4] Curio C, Breidt M, Kleiner M, Vuong QC, Giese MA and Bülthoff HH (2006). Semantic 3D motion retargeting for facial animation. In: 3rd Symposium on Applied Perception in Graphics and Visualization (APGV '06), ACM Press, New York, NY, USA, 77-84.
[5] Dobs K, Bülthoff I, Breidt M, Vuong QC, Curio C and Schultz J (2014). Quantifying human sensitivity to spatio-temporal information in dynamic faces. Vision Research 100: 78-87.
Curriculum Vitae
Current Position
since 2015: Postdoctoral Researcher at the Brain and Cognition Research Center (CerCo), CNRS, Toulouse, France. Advisor: Leila Reddy
Education
2010 - 2014: Ph.D. Candidate at Max Planck Institute for Biological Cybernetics, Tübingen, Germany (Dept. Human Perception, Cognition and Action). Advisors: Isabelle Bülthoff, Johannes Schultz
2002 - 2008: Diploma in Psychology, Philipps-University Marburg, Germany. Advisors: Frank Rösler, Kerstin Jost
2002 - 2007: Diploma in Computer Science, Philipps-University Marburg, Germany. Advisors: Manfred Sommer, David Kämpf
Research and Teaching Experience
2014 - 2015: Postdoctoral researcher / guest scientist at the Max Planck Institute for Biological Cybernetics. Advisor: Isabelle Bülthoff
2013: JSPS fellow at Gardner Research Team, RIKEN BSI, Japan, conducting an fMRI study on attentional modulation of facial motion and form processing. Advisor: Justin Gardner
2011: Supervised Kathryn Bonnen (Michigan State University) during her internship project "Physical and perceptual analysis of the 3D face database"
2004 - 2008: Student Research Assistant at the Cognitive Psychophysiology Lab, Philipps-University Marburg, Germany. Advisor: Frank Rösler
2006: Visiting Research Assistant at the Laboratory of Systems Neurodynamics, University of Virginia, USA. Advisor: William B Levy
Fellowships, Grants and Awards
2015 - 2017: Postdoctoral Fellowship of the German Research Foundation (DFG)
2015: Best Dissertation Award 2015 from the Max Planck Institute for Biological Cybernetics and Förderverein für neurowissenschaftliche Forschung e.V.
2015: Travel award from the German Academic Exchange Service
2014: Invited speaker at the symposium "The perception of faces" in the English Lake District, UK, funded by the Rank Prize Funds
2013: JSPS (Japan Society for the Promotion of Science) Research Fellowship
2012: Student Travel Award from the Vision Sciences Society (VSS)
Work Experience
2009 - 2010: IT Consultant / Software Engineer at PRODYNA AG, Frankfurt, Germany
2008 - 2009: Freelance Software Engineer in London, UK