The integration of facial features over space and time



Document type: Dissertation
Date: 2017
Language: English
Faculty: 7 Mathematisch-Naturwissenschaftliche Fakultät
Department: Biology
Advisor: Bartels, Andreas (Dr.)
Day of Oral Examination: 2016-06-17
DDC Classification: 500 - Natural sciences and mathematics
Keywords: Face, Emotion
License: Publishing license excluding print on demand


Faces are unique social stimuli that can be recognized in an instant. We pick up information about gender, ethnicity, feelings, attentional focus, or even attributes like attractiveness or trustworthiness remarkably quickly. How we achieve this has been studied in psychology, cognitive science, and neuroscience for decades, but we still do not have the full picture. The key theme of this thesis is the integration, or binding, of facial features over space and over time. We investigated both behavioral measures, in healthy participants and in a group of people with Autism Spectrum Disorder (ASD), and the neuronal mechanisms in the core face-processing regions of the human brain.

The first part of this thesis investigates the contribution of face-responsive brain areas to whole-face and part-based neural representations of facial expressions. This aspect has hardly been considered in the past, as most studies focused on the representation of identity instead. In an fMRI experiment, we presented whole faces and facial parts with happy and fearful expressions. We extracted the similarity of activity patterns in the core network of face processing, comprising the occipital face area (OFA), the fusiform face area (FFA), and the superior temporal sulcus (STS), across and within emotions, between whole faces and facial parts. Previous studies based on identity recognition found both holistic and part-based representations in the FFA, while the OFA seems to mainly represent part-based information. The STS has hardly been considered in those studies, as it is thought to be preferentially involved in expression coding. We find both a part-based representation of facial expressions and an emotion-independent preference for whole faces in the FFA, in line with the previous findings for identity recognition. For the STS, we detect emotion-dependent representations of faces and facial parts, supporting its major role in expression processing. The OFA, in contrast, shows similar representations of the eye and mouth regions of both expressions without any further specific effects, adding evidence to its role as the entry point of facial information into the core network of face processing.

The second part of the thesis explores the temporal information embodied in dynamic facial expressions. Using expressions of increasing and decreasing intensity, presented in natural or reversed frame order, we manipulated the temporal unfolding of the expressions in a well-controlled 2×2 design (factors "emotion direction" and "timeline"). This approach allowed us to control for low-level stimulus properties. In three consecutive studies, we first explored the brain activation elicited by our stimulus manipulation in healthy subjects; second, we examined the perceptual effects caused by emotion direction and timeline reversal in healthy subjects; and third, we did so in autistic participants and matched controls. Our results indicate that all areas of the neural core network of face processing are sensitive to both emotion direction and timeline. Behaviorally, we found that both factors affected judgements of several stimulus properties, such as emotion intensity or how well an emotion was performed, even when subjects were not informed of the timeline manipulation. Interestingly, autistic subjects did not differ from the control group in the perceptual effects caused by timeline reversal. In sum, our studies shed light on two key aspects of face processing and perception, holistic versus part-based processing and facial dynamics, that had not previously been addressed in this way.
