1/24/2014

Context modulates perception and cortical processing of neutral faces

The vast majority of cognitive neuroscience research on the perception of facial expressions has utilized static stimuli with emotional expressions varying in category (e.g., fear, anger, happiness) and in the magnitude of the expressed emotion. These studies have provided a wealth of important information about how facial stimuli are processed by the brain; at the same time, however, it has generally been assumed that the processing of faces, including facial expressions, is highly automatic and hard-wired in the brain. Recently, it has been increasingly recognized that contextual information affects the perception and interpretation of facial expressions, though it has remained relatively poorly understood at which latencies, and how robustly, contextual information shapes the processing of neutral facial stimuli in the human brain.

In their recent study, Wieser et al. (2014) used electroencephalography to record event-related brain responses to neutral facial stimuli that were preceded by contextual valence information that was either self-relevant or other-relevant (i.e., brief verbal descriptions of neutral, positive, or negative valence, presented in separate sentences in either the first person or the third person, as in “my pain” vs. “his pain”, respectively). The authors observed that event-related responses associated with emotional processing were modulated by both types of contextual information from 220 ms post-stimulus onwards. The subjective perception of the affective state of the neutral faces was also shaped by the brief affective descriptions that preceded presentation of the neutral facial stimuli.

Taken together, these findings nicely demonstrate how contextual, second-hand information (i.e., what people are told about others) enhances cortical processing of facial stimuli, starting as early as ~220 ms after onset of the facial stimuli, even though the faces themselves were completely neutral, carrying no emotional or self-referential information. The authors conclude that the very perception of facial features is modulated by prior second-hand information that one has about another person, a finding which might in part help explain how initial impressions of others are formed.


Reference: Wieser MJ, Gerdes ABM, Büngel I, Schwarz KA, Mühlberger A, Pauli P. Not so harmless anymore: how context impacts the perception and electro-cortical processing of neutral faces. Neuroimage (2014) e-publication ahead of print. http://dx.doi.org/10.1016/j.neuroimage.2014.01.022
