5/26/2014

Green tea consumption is associated with lower incidence of mild cognitive impairment and dementia in elderly people

Whether dietary habits, including tea or coffee drinking, can mitigate cognitive decline (and even dementia) in aging is a research topic with potentially very high societal impact. Although coffee and tea contain large amounts of polyphenols and caffeine with potential neuroprotective effects, previous studies on the relationship between coffee or tea consumption and dementia have produced mixed results. In their recent population-based longitudinal study, Noguchi-Shinohara and colleagues (2014) examined the relationships between coffee, black tea, and green tea consumption and the incidence of dementia and mild cognitive impairment.

Out of a total of 2845 residents of Nakajima, Japan, who were aged over 60 years in 2007, 723 individuals meeting the inclusion criteria voluntarily participated in the study. Cognitive level was assessed with the Mini-Mental State Examination and the Clinical Dementia Rating scale. Health surveys and blood tests were also carried out to control for potentially confounding variables such as ApoE phenotype and diabetes. Consumption of coffee, black tea, and green tea was recorded and divided into three classes for the data analysis: no consumption, consumption on 1-6 days/week, and consumption every day. At follow-up testing, conducted on average 4.9 years later, frequent consumption of green tea (but not of black tea or coffee) was associated with a significantly lower incidence of dementia and mild cognitive impairment.
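
For illustration only (this is not the authors' actual analysis), a three-level consumption variable of this kind could be related to incident dementia/MCI with a covariate-adjusted logistic regression along the following lines; the file and column names below are hypothetical.

```python
# Minimal sketch, assuming a hypothetical per-participant data file; not the
# authors' actual statistical pipeline.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("nakajima_cohort.csv")  # hypothetical: one row per participant

# Three consumption classes, as in the study: none, 1-6 days/week, every day
df["green_tea"] = pd.Categorical(
    df["green_tea"], categories=["none", "1_6_days_per_week", "every_day"]
)

# Logistic regression of incident dementia/MCI at follow-up (0/1) on the
# consumption class plus a few of the covariates the study controlled for.
fit = smf.logit(
    "incident_decline ~ C(green_tea, Treatment('none')) + age + C(sex) + C(apoe_e4) + C(diabetes)",
    data=df,
).fit()
print(fit.summary())
print(np.exp(fit.params))  # odds ratios relative to the no-consumption group
```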

The authors propose that these interesting findings could be due to a number of factors. One possible mechanism they bring up is that, unlike black tea, green tea contains catechins, especially epigallocatechin-3-gallate, as well as myricetin, both of which have been reported to have neuroprotective effects. The authors further note that higher physical activity and a greater number of hobbies also correlated with green tea consumption, although the beneficial effects of green tea persisted even when these factors were taken into account in the analysis. Taken together, these findings add to the pool of evidence suggesting that green tea might have neuroprotective effects that help guard against aging-related cognitive decline.


Reference: Noguchi-Shinohara M, Yuki S, Dohmoto C, Ikeda Y, Samuraki M, Iwasa K, Yokogawa M, Asai K, Komai K, Nakamura H, Yamada M. Consumption of green tea, but not black tea or coffee, is associated with reduced risk of cognitive decline. PLoS ONE (2014) 9: e96013. http://dx.doi.org/10.1371/journal.pone.0096013

2/13/2014

Brain activity patterns predict risky and safe choices in healthy human volunteers

Which neural events predict risky vs. safe behaviors (for example, overtaking a slower vehicle despite oncoming traffic vs. staying behind it and arriving at work a few minutes later) is a highly interesting and important question. The vast majority of neuroimaging studies on the neural basis of risk taking have used models adapted from economics, in which risk is defined as the degree of variance in outcomes; it has been argued, however, that for lay persons risk rather means exposure to a potential loss.

In their recent study, Helfinstein and colleagues (2014) had 108 healthy volunteers perform the Balloon Analog Risk Task (BART) during functional magnetic resonance imaging. In this task, the subjects earn points by pumping up balloons, but lose the accumulated points if a balloon explodes before they “cash out” by stopping pumping. The task was selected because performance in it correlates highly with risk-taking behaviors relevant to public health, including unsafe driving, sexual risk taking, and drug use. Using multi-voxel pattern analysis, the authors observed that brain activity preceding the point of decision predicted whether the subjects' subsequent choice would be risky or safe, with the predictive patterns involving brain regions that previous studies have implicated in cognitive control functions. Interestingly, a separate univariate analysis showed that these areas were more active before safe than before risky choices.
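
To make the general logic of such a multi-voxel pattern analysis concrete, here is a schematic sketch (not the authors' actual pipeline): a classifier is trained on pre-choice activity patterns and its ability to predict the upcoming choice is estimated with cross-validation. The input files and variable names are hypothetical.

```python
# Illustrative MVPA sketch under assumed inputs; not the published analysis.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GroupKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical inputs: X holds one pre-choice activity pattern per trial
# (trials x voxels), y codes the subsequent choice (1 = risky pump, 0 = cash out),
# and groups identifies the subject each trial came from.
X = np.load("prechoice_patterns.npy")
y = np.load("choices.npy")
groups = np.load("subject_ids.npy")

clf = make_pipeline(StandardScaler(), LogisticRegression(penalty="l2", max_iter=1000))

# Group-wise cross-validation so one subject's trials never appear in both
# the training and the test set.
scores = cross_val_score(clf, X, y, groups=groups,
                         cv=GroupKFold(n_splits=5), scoring="roc_auc")
print("Mean cross-validated AUC:", scores.mean())
```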

These highly interesting findings show that risky vs. safe choices can be predicted from preceding patterns of brain activity in a set of regions previously shown to be activated by tasks requiring cognitive control. The stronger activation of these regions before safe than before risky decisions suggests that increased risk taking might result from a failure to engage appropriate cognitive control processes. The relevance of these findings is further strengthened by the fact that performance in the BART has been found in previous studies to correlate highly with real-life risk-taking behaviors of public health relevance, such as unsafe driving and drug use.


Reference: Helfinstein SM, Schonberg T, Congdon E, Karlsgodt KH, Mumford JA, Sabb FW, Cannon TD, London ED, Bilder RM, Poldrack RA. Predicting risky choices from brain activity patterns. Proc Natl Acad Sci USA (2014) e-publication ahead of print. http://dx.doi.org/10.1073/pnas.1321728111

1/24/2014

Context modulates perception and cortical processing of neutral faces

The vast majority of cognitive neuroscience research on the perception of facial expressions has utilized static stimuli whose emotional expressions vary in category (e.g., fear, anger, happiness) and intensity. These studies have provided a lot of important information about how facial stimuli are processed by the brain; at the same time, however, they have tended to assume that the processing of faces, including facial expressions, is highly automatic and hard-wired in the brain. It has recently become increasingly recognized that contextual information affects the perception and interpretation of facial expressions, yet it has remained relatively poorly known at which latencies, and how robustly, contextual information shapes the processing of neutral facial stimuli in the human brain.

In their recent study, Wieser et al. (2014) used electroencephalography to record event-related brain responses to neutral facial stimuli that were preceded by contextual valence information that was either self-relevant or other-relevant (i.e., brief verbal descriptions of neutral, positive, or negative valence, phrased either in the first person or in the third person, as in “my pain” vs. “his pain”). The authors observed that event-related responses associated with emotional processing were modulated by both types of contextual information from about 220 ms post-stimulus onward. The subjective perception of the affective state of the neutral faces was likewise shaped by the brief affective descriptions that preceded them.
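
For readers less familiar with this kind of analysis, a minimal MNE-Python sketch of comparing event-related responses to identical neutral faces as a function of the preceding context is given below. It is illustrative only, not the authors' pipeline; the file name and event codes are hypothetical.

```python
# Minimal EEG/ERP sketch, assuming a hypothetical recording with stimulus triggers.
import mne

raw = mne.io.read_raw_fif("faces_context_raw.fif", preload=True)
raw.filter(0.1, 30.0)  # basic band-pass filtering

events = mne.find_events(raw)
event_id = {"neg_self": 1, "neg_other": 2, "neutral": 3}  # hypothetical condition codes

epochs = mne.Epochs(raw, events, event_id, tmin=-0.2, tmax=0.8,
                    baseline=(None, 0), preload=True)

# Average per condition and compare the waveforms; the study reports context
# effects on face processing emerging from roughly 220 ms post-stimulus onward.
evokeds = {cond: epochs[cond].average() for cond in event_id}
mne.viz.plot_compare_evokeds(evokeds, picks="eeg", combine="mean")
```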

Taken together, these findings demonstrate very nicely how contextual, second-hand information (i.e., what people are told about others) enhances cortical processing of facial stimuli, starting as early as a little over 200 ms after the onset of the faces, even though the facial stimuli themselves were completely neutral and carried no emotional or self-referential information. The authors conclude that the very perception of facial features is modulated by prior second-hand information that one has about another person, a finding that might in part help explain how initial impressions of others are formed.


Reference: Wieser MJ, Gerdes ABM, Büngel I, Schwarz KA, Mühlberger A, Pauli P. Not so harmless anymore: how context impacts the perception and electro-cortical processing of neutral faces. Neuroimage (2014) e-publication ahead of print. http://dx.doi.org/10.1016/j.neuroimage.2014.01.022

1/15/2014

Natural sounds are represented as spectrotemporal modulations in human auditory cortex

How the human auditory cortex represents complex natural sounds is one of the most fundamental questions in cognitive neuroscience. Previous studies have documented a number of tonotopically organized areas occupying the primary and non-primary auditory cortices, and other studies have shown preferences for additional sound features, such as sound location and speech-sound category, in specific auditory cortical areas. Furthermore, findings in animal models suggest that auditory cortical neurons are selective for various types of spectrotemporal sound features. It has not been known, however, whether there are topographic representations of spectrotemporal features, a model that could potentially explain how complex natural sounds are represented in the human auditory cortex.

In their recent study, Santoro et al. (2014) analyzed data from two previous functional magnetic resonance imaging experiments in which a rich array of natural sounds had been presented to healthy volunteers. They compared three computational models: the first assumed that the auditory cortex represents sounds as spectral/frequency information, the second that it represents sounds as temporal information, and the third that sounds are represented as sets of spectrotemporal modulations. The results indicate that natural sounds are represented through frequency-specific analysis of spectrotemporal modulations. Furthermore, spectral analysis was found to be more fine-grained in anterior than in posterior auditory cortical areas, whereas the posterior areas represented temporal information more accurately, combined with a rather coarse representation of spectral information.
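
The overall logic of such an encoding-model comparison can be sketched as follows (this is a simplified illustration in the spirit of the study, not their implementation): each candidate model turns the sounds into a feature space, a regularized linear mapping predicts voxel responses from those features, and the models are ranked by cross-validated prediction accuracy. The precomputed feature files and response matrix below are hypothetical.

```python
# Schematic encoding-model comparison under assumed, precomputed inputs.
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import KFold

voxels = np.load("voxel_responses.npy")  # sounds x voxels
feature_models = {
    "spectral": np.load("features_spectral.npy"),           # sounds x frequency bins
    "temporal": np.load("features_temporal.npy"),           # sounds x temporal modulation rates
    "spectrotemporal": np.load("features_modulation.npy"),  # sounds x joint modulation features
}

def cv_correlation(X, Y, n_splits=5):
    """Mean correlation between predicted and measured voxel responses."""
    scores = []
    for train, test in KFold(n_splits=n_splits).split(X):
        fit = RidgeCV(alphas=np.logspace(-2, 4, 13)).fit(X[train], Y[train])
        pred = fit.predict(X[test])
        r = [np.corrcoef(pred[:, v], Y[test][:, v])[0, 1] for v in range(Y.shape[1])]
        scores.append(np.nanmean(r))
    return float(np.mean(scores))

# The model whose features best predict held-out responses wins the comparison.
for name, X in feature_models.items():
    print(name, cv_correlation(X, voxels))
```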

In sum, the authors provide a very exciting approach for testing how well alternative computational models, inspired by neurophysiological findings obtained in animal research, can predict hemodynamic data collected during the presentation of various natural sounds. The results offer a very interesting vantage point on how natural sounds could be represented in the human auditory cortex. It is easy to predict that the approach and findings will generate wide interest and help move research efforts significantly forward, especially given the increasing popularity of naturalistic stimuli in neuroimaging research.


Reference: Santoro R, Moerel M, De Martino F, Goebel R, Ugurbil K, Yacoub E, Formisano E. Encoding of natural sounds at multiple spectral and temporal resolutions in the human auditory cortex. PLoS Computational Biology (2014) 10: e1003412. http://dx.doi.org/10.1371/journal.pcbi.1003412

1/07/2014

Bodily maps of experienced emotions replicate across cultures

In everyday language, emotional experiences are often described as bodily sensations, such as someone "having cold feet" or a "heartache", and these expressions can be surprisingly similar across different cultures and languages. Furthermore, in cognitive neuroscience theories of emotion, somatosensory feedback has been proposed to support conscious emotional experiences. On the other hand, there are classical findings indicating that it is difficult to classify emotional states (other than changes in the level of arousal) based on measures of autonomic nervous system activity. Somewhat surprisingly, the question of whether different emotional states (e.g., anger, sadness, happiness) are associated with distinct patterns of bodily sensations has not been addressed empirically.

In their recent study, Nummenmaa et al. (2013) conducted a series of five closely related experiments in which a total of 701 participants were shown outlines of human bodies together with emotional stimuli of different types and were asked to color the bodily regions in which they felt increasing or decreasing activity while experiencing different kinds of emotions. The authors observed that different emotions were associated with distinct patterns of bodily sensations, as indicated by the coloring patterns, and that these patterns replicated across stimulus types. The patterns further replicated across Finnish- and Swedish-speaking subjects, as well as Taiwanese subjects tested in a separate control study. Based on these findings, the authors propose that emotions are represented in the somatosensory system as culturally universal, categorical somatotopic maps that contribute to conscious emotional experiences.
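
One way to test whether such coloring data carry emotion-specific information is to ask whether the emotion categories can be decoded from the body maps themselves. The sketch below illustrates that idea in a simplified form (in the spirit of, but not identical to, the authors' analysis); the input arrays are hypothetical, with each row of `maps` being one participant's colored body map for one emotion, flattened to a pixel vector.

```python
# Simplified decoding sketch under assumed inputs; not the published analysis.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import StratifiedKFold, cross_val_score

maps = np.load("body_maps.npy")            # samples x pixels (signed activity changes)
emotions = np.load("emotion_labels.npy")   # e.g., "anger", "sadness", "happiness", ...

clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
scores = cross_val_score(clf, maps, emotions, cv=StratifiedKFold(n_splits=5))
print("Cross-validated classification accuracy:", scores.mean())
# Accuracy clearly above chance would indicate emotion-specific sensation
# patterns rather than undifferentiated overall arousal.
```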


Reference: Nummenmaa L, Glerean E, Hari R, Hietanen JK. Bodily maps of emotions. Proc Natl Acad Sci USA (2013) e-publication ahead of print. http://dx.doi.org/10.1073/pnas.1321664111