Multisensory integration effects caused by cross-modal mental imagery

The existence of perceptually robust multisensory interactions, such as the ventriloquism and McGurk effects, has been well established in behavioral studies, and neuroimaging studies have further shown that multisensory processing of stimuli takes place even in primary sensory cortical areas. There is also evidence suggesting that mental imagery, such as the imagined sound of a hammer seen to hit an anvil in a silent movie, modulates processing in sensory cortical areas. What has remained less explored is the extent to which imagined visual stimuli influence the processing of real auditory stimuli and vice versa.

In their recently published study, Berger and Ehrsson (2013) conducted a series of behavioral experiments testing whether imagined stimuli give rise to well-known multisensory illusions in the same way as real stimuli. They first tested the effect of an imagined collision sound on the cross-bounce illusion, then the effect of an imagined visual stimulus on the ventriloquism effect, and, third, the effect of imagined speech on the so-called McGurk effect. In all three experiments, the authors demonstrated that imagined stimuli cause multisensory illusions similar to those caused by real cross-modal stimuli: imagining the sound of a collision gave rise to the cross-bounce illusion, imagining a visual stimulus shifted the perceived location of an auditory stimulus, and auditory imagery of speech promoted an illusory speech percept in a modified McGurk illusion.

These highly exciting results nicely expand previous findings on multisensory interactions and provide further evidence for the view that sensory cortices play a pivotal role in the generation of mental imagery, even to the extent that visual imagery modulates the processing of auditory stimuli and vice versa. These behavioral results also provide an excellent starting point for further neuroimaging studies investigating the multisensory effects of mental imagery in the sensory cortical areas of the brain.

Reference: Berger CC, Ehrsson HH. Mental imagery changes multisensory perception. Current Biology (2013), e-publication ahead of print.


Task-specific networks of the brain revealed by a meta-analysis of more than 1600 neuroimaging studies

Neuroinformatics refers to the free sharing of analysis tools and experimental data. When neuroinformatics was taking its very first steps, there were, alongside extensive support, also critical voices, mostly doubting the usefulness of making published neuroimaging datasets available: could anyone, in practice, utilize data that is most often collected to answer some highly specific research question? With the enormous advances in computational power, it has become possible to pool thousands of such datasets and search the resulting big data for consistent patterns using sophisticated analysis algorithms adapted from, for example, statistical physics. Recently, studies have combined data over a large number of resting-state imaging experiments to inspect the brain as a complex network. Consistent patterns of functional connectivity (or rather “co-activation”, where a number of brain areas tend to change their level of activity hand-in-hand) have indeed been observed, but it has been less clear how active engagement in various types of tasks changes these “resting-state” networks of co-activity.
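To make the notion of a co-activation network concrete, the following is a minimal, purely illustrative sketch of how such a network could be built from binary activation tables. The region names, the toy data, and the 0.5 threshold are all hypothetical and are not taken from any published study; actual meta-analyses use far more sophisticated statistics.

```python
from itertools import combinations

# Toy data: each row is one study; 1 = region reported active in that study.
# Region labels are illustrative placeholders.
regions = ["V1", "A1", "M1", "PFC"]
studies = [
    [1, 0, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 0],
    [0, 1, 1, 1],
]

n_studies = len(studies)
edges = {}
for a, b in combinations(range(len(regions)), 2):
    # Fraction of studies in which both regions were active together.
    co_activation = sum(row[a] and row[b] for row in studies) / n_studies
    if co_activation >= 0.5:  # keep only frequently co-activated pairs
        edges[(regions[a], regions[b])] = co_activation

print(edges)  # {('V1', 'PFC'): 0.5, ('A1', 'M1'): 0.5}
```

Pairs of regions that tend to be active in the same studies end up connected; the resulting graph is what network analyses then decompose into modules.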

In their recent study, Crossley et al. (2013) carried out a meta-analysis of data from more than 1600 functional magnetic resonance imaging and positron emission tomography studies published between 1985 and 2010 to inspect the network activity patterns of the human brain when experimental subjects engage in different types of tasks, including perception, action, executive tasks, and emotion. Based on this meta-analysis, the authors observed large similarities between the functional networks of the brain in the resting state (where the task of the subjects has typically been to lie in the scanner and either do nothing or focus on a fixation cross) and during active tasks; however, differences also emerged. The so-called occipital module was mostly activated during perception, the central module during action, the default-mode module by emotions, and the fronto-parietal module by executive tasks. Further, the authors observed important nodes in the parietal and prefrontal cortices that often connected over long distances and were involved in a diverse range of tasks. Deactivation of nodes was also noted to play an important role in flexible network reconfiguration under changing cognitive demands. Overall, this study is a prime example of the usefulness of big data in cognitive neuroscience, allowing sophisticated analysis of the brain’s central processing principles, and it will likely pave the way for further research efforts in a highly significant manner.
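The idea of a node that bridges many modules can be illustrated with a short sketch. The edges, node names, and module labels below are toy assumptions chosen only to show the principle of flagging a node as hub-like when its neighbours span several modules; they do not reproduce the measures or data used in the study.

```python
from collections import defaultdict

# Hypothetical toy network: edges between labelled nodes,
# and a module assignment for each node.
edges = [
    ("parietal", "occipital"), ("parietal", "central"),
    ("parietal", "fronto-parietal"), ("prefrontal", "default-mode"),
    ("prefrontal", "fronto-parietal"), ("occipital", "central"),
]
module = {
    "parietal": "fronto-parietal", "prefrontal": "fronto-parietal",
    "occipital": "occipital", "central": "central",
    "fronto-parietal": "fronto-parietal", "default-mode": "default-mode",
}

neighbours = defaultdict(set)
for a, b in edges:
    neighbours[a].add(b)
    neighbours[b].add(a)

# A node counts as "hub-like" here if its neighbours span 3+ modules.
diversity = {n: len({module[m] for m in neighbours[n]}) for n in neighbours}
hubs = [n for n, d in diversity.items() if d >= 3]

print(hubs)  # ['parietal']
```

Counting the distinct modules among a node's neighbours is a crude stand-in for the participation-based hub measures used in network neuroscience, but it captures the intuition of nodes that connect diverse parts of the network.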

Reference: Crossley NA, Mechelli A, Vértes PE, Winton-Brown TT, Patel AX, Ginestet CE, McGuire P, Bullmore ET. Cognitive relevance of the community structure of the human brain functional coactivation network. Proc. Natl. Acad. Sci. USA (2013), e-publication ahead of print. http://dx.doi.org/10.1073/pnas.1220826110