1/24/2014

Context modulates perception and cortical processing of neutral faces

The vast majority of cognitive neuroscience research on the perception of facial expressions has used static stimuli whose emotional expressions vary in category (e.g., fear, anger, happiness) and in the magnitude of the expressed emotion. These studies have provided a wealth of information about how the brain processes facial stimuli; at the same time, however, they have tended to assume that the processing of faces, including facial expressions, is highly automatic and hard-wired in the brain. It has recently become increasingly clear that contextual information affects the perception and interpretation of facial expressions, yet it has remained relatively poorly understood at which latencies, and how robustly, contextual information shapes the processing of neutral facial stimuli in the human brain.

In their recent study, Wieser et al. (2014) used electroencephalography to record event-related brain responses to neutral facial stimuli that were preceded by contextual valence information that was either self-relevant or other-relevant (i.e., brief verbal descriptions of neutral, positive, or negative valence, presented as separate sentences in either the first person or the third person, as in “my pain” vs. “his pain”, respectively). The authors observed that event-related responses associated with emotional processing were modulated by both types of contextual information from about 220 ms post-stimulus onward. The subjective perception of the affective state of the neutral faces was likewise shaped by the brief affective descriptions that preceded the neutral facial stimuli.
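
To make the general logic of such ERP analyses concrete, the sketch below (Python, using only NumPy) averages simulated single-trial EEG epochs per context condition and compares mean amplitudes in a window starting around 220 ms after face onset. All array shapes, condition labels, and channel selections are illustrative assumptions, not the authors' actual pipeline or data.

import numpy as np

sfreq = 500.0                      # sampling rate (Hz), assumed
tmin = -0.2                        # epoch start relative to face onset (s)
n_times = 500                      # 1.0 s epochs at 500 Hz
times = tmin + np.arange(n_times) / sfreq

rng = np.random.default_rng(0)
# Simulated single-trial epochs: trials x channels x time points,
# one array per (hypothetical) context condition.
epochs = {
    "self_negative": rng.normal(0, 1, (40, 64, n_times)),
    "neutral":       rng.normal(0, 1, (40, 64, n_times)),
}

# ERPs: average across trials for each condition.
erps = {cond: data.mean(axis=0) for cond, data in epochs.items()}

# Mean amplitude in a late window (e.g., 220-350 ms post-stimulus),
# averaged over an assumed posterior channel selection.
window = (times >= 0.220) & (times <= 0.350)
posterior_channels = slice(40, 64)            # hypothetical channel indices

for cond, erp in erps.items():
    amp = erp[posterior_channels, :][:, window].mean()
    print(f"{cond}: mean amplitude {amp:.3f} (a.u.)")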

Taken together, these findings nicely demonstrate how second-hand contextual information (i.e., what people are told about others) enhances cortical processing of facial stimuli starting a little over 200 ms after the onset of the facial stimuli, even though the facial stimuli themselves were completely neutral and carried no emotional or self-referential information. The authors conclude that the very perception of facial features is modulated by prior second-hand information that one has about another person, a finding that might in part help explain how initial impressions of others are formed.


Reference: Wieser MJ, Gerdes ABM, Büngel I, Schwarz KA, Mühlberger A, Pauli P. Not so harmless anymore: how context impacts the perception and electro-cortical processing of neutral faces. Neuroimage (2014) e-publication ahead of print. http://dx.doi.org/10.1016/j.neuroimage.2014.01.022

1/15/2014

Natural sounds are represented as spectrotemporal modulations in human auditory cortex

The question of how the human auditory cortex represents complex natural sounds is one of the most fundamental in cognitive neuroscience. Previous studies have documented a number of tonotopically organized areas occupying the primary and non-primary auditory cortices, and other studies have shown that specific auditory cortical areas prefer other sound features, such as sound location or speech-sound category. Furthermore, findings in animal models suggest that auditory cortical neurons are selective for various types of spectrotemporal sound features; it has not been known, however, whether there are topographic representations of such spectrotemporal features, a model that could potentially explain how complex natural sounds are represented in the human auditory cortex.

In their recent study, Santoro et al. (2014) analyzed data from two previous functional magnetic resonance imaging experiments in which a rich array of natural sounds had been presented to healthy volunteers. They tested three computational models against these data: the first model assumed that auditory cortex represents sounds as spectral (frequency) information, the second that it represents sounds as temporal information, and the third that sounds are represented as sets of spectrotemporal modulations. The results indicate that natural sounds are represented through frequency-specific analysis of spectrotemporal modulations. Furthermore, anterior auditory cortical regions were found to analyze spectral information in a more fine-grained manner than posterior auditory cortical areas, whereas the posterior areas represented temporal information more accurately, together with a rather coarse representation of spectral information.
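
As a rough illustration of how such encoding models can be compared, the sketch below fits each candidate feature space (spectral, temporal, and joint spectrotemporal modulation) to simulated voxel responses with cross-validated ridge regression and compares prediction accuracy. The feature dimensions, data, and regularization settings are placeholders and do not correspond to the authors' actual stimuli or model parameterizations.

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

rng = np.random.default_rng(1)
n_sounds, n_voxels = 120, 200

# Hypothetical feature spaces derived from the sound set.
feature_spaces = {
    "spectral":        rng.normal(size=(n_sounds, 48)),  # e.g., frequency bins
    "temporal":        rng.normal(size=(n_sounds, 20)),  # e.g., temporal modulation rates
    "spectrotemporal": rng.normal(size=(n_sounds, 96)),  # e.g., joint modulation features
}

# Simulated voxel responses (sounds x voxels).
Y = rng.normal(size=(n_sounds, n_voxels))

def cv_accuracy(X, Y, n_splits=5, alpha=10.0):
    """Mean cross-validated correlation between predicted and observed responses."""
    scores = []
    for train, test in KFold(n_splits=n_splits, shuffle=True, random_state=0).split(X):
        model = Ridge(alpha=alpha).fit(X[train], Y[train])
        pred = model.predict(X[test])
        # Correlate predicted and observed responses per voxel, then average.
        r = [np.corrcoef(pred[:, v], Y[test][:, v])[0, 1] for v in range(Y.shape[1])]
        scores.append(np.nanmean(r))
    return np.mean(scores)

for name, X in feature_spaces.items():
    print(f"{name:>15}: mean CV correlation = {cv_accuracy(X, Y):.3f}")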

In sum, the authors provide a very exciting approach to testing how well alternative computational models, inspired by neurophysiological findings from animal research, can predict hemodynamic data collected during the presentation of a variety of natural sounds. Taken together, the results offer a very interesting vantage point on how natural sounds may be represented in the human auditory cortex. It is easy to predict that the approach and the findings will generate wide interest and help push research in this area significantly forward, especially given the increasing popularity of naturalistic stimuli in neuroimaging research.


Reference: Santoro R, Moerel M, De Martino F, Goebel R, Ugurbil K, Yacoub E, Formisano E. Encoding of natural sounds at multiple spectral and temporal resolutions in the human auditory cortex. PLoS Computational Biology (2014) 10: e1003412. http://dx.doi.org/10.1371/journal.pcbi.1003412

1/07/2014

Across-cultures replicable bodily maps of experienced emotions

In linguistic expressions, emotional experiences are often described as bodily sensations, such as someone “having cold feet” or a “heartache”, and these descriptions can be surprisingly similar across different cultures and languages. Furthermore, cognitive neuroscience theories of emotion have proposed that somatosensory feedback supports conscious emotional experiences. On the other hand, there are classical findings indicating that it is difficult to distinguish emotional states (other than changes in the level of arousal) on the basis of autonomic nervous system activity. Somewhat surprisingly, the question of whether experiences of different emotional states (e.g., anger, sadness, happiness) are associated with distinct patterns of bodily sensations has not been addressed empirically.

In their recent study, Nummenmaa et al. (2013) conducted a series of five closely related experiments in which a total of 701 participants were presented with outlines of human bodies together with emotional stimuli of various types and were asked to color the bodily regions in which they felt increasing or decreasing activity while experiencing different kinds of emotions. The authors observed that, as indicated by the coloring patterns, different emotions were associated with patterns of bodily sensations that replicated across stimulus types. These patterns of bodily sensations further replicated across Finnish- and Swedish-speaking participants, as well as Taiwanese participants tested in a separate control study. Based on these findings, the authors propose that emotions are represented in the somatosensory system as culturally universal, categorical somatotopic maps that contribute to conscious emotional experiences.
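
As a minimal illustration of one way such across-group replicability could be quantified, the sketch below correlates group-averaged activation/deactivation maps for the same emotion between two participant groups. The body-outline grids, group labels, and data are simulated placeholders, not the study's actual maps or analysis.

import numpy as np

rng = np.random.default_rng(2)
n_pixels = 50 * 20          # flattened body-outline grid (assumed resolution)
emotions = ["anger", "fear", "sadness", "happiness"]

def group_maps(seed):
    """Simulated group-average maps: one signed activation map per emotion."""
    r = np.random.default_rng(seed)
    return {emo: r.normal(size=n_pixels) for emo in emotions}

group_a = group_maps(10)   # e.g., Finnish-speaking sample (placeholder)
group_b = group_maps(11)   # e.g., Taiwanese sample (placeholder)

for emo in emotions:
    r = np.corrcoef(group_a[emo], group_b[emo])[0, 1]
    print(f"{emo:>10}: between-group spatial correlation r = {r:.3f}")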


Reference: Nummenmaa L, Glerean E, Hari R, Hietanen JK. Bodily maps of emotions. Proceedings of the National Academy of Sciences USA (2013) e-publication ahead of print. http://dx.doi.org/10.1073/pnas.1321664111