Neural Correlates of Optimal Multisensory Decision Making under Time-Varying Reliabilities with an Invariant Linear Probabilistic Population Code.

Neuron

Institute of Neuroscience, Key Laboratory of Primate Neurobiology, CAS Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai Center for Brain Science and Brain-Inspired Intelligence Technology, 200031 Shanghai, China; University of Chinese Academy of Sciences.

Published: December 2019


Category Ranking: 98% | Total Visits: 921 | Avg Visit Duration: 2 minutes | Citations: 20

Article Abstract

Perceptual decisions are often based on multiple sensory inputs whose reliabilities rapidly vary over time, yet little is known about how the brain integrates these inputs to optimize behavior. The optimal solution requires that neurons simply add their sensory inputs across time and modalities, as long as these inputs are encoded with an invariant linear probabilistic population code (ilPPC). While this theoretical possibility has been raised before, it has never been tested experimentally. Here, we report that neural activities in the lateral intraparietal area (LIP) of macaques performing a vestibular-visual multisensory decision-making task are indeed consistent with the ilPPC theory. More specifically, we found that LIP accumulates momentary evidence proportional to the visual speed and the absolute value of vestibular acceleration, two variables that are encoded with close approximations to ilPPCs in sensory areas. Together, these results provide a remarkably simple and biologically plausible solution to near-optimal multisensory decision making.
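
To make the additive scheme concrete, the sketch below simulates two independent Poisson populations that share a fixed tuning-curve shape over heading, with time-varying gains standing in for visual speed and the absolute value of vestibular acceleration. Because the Poisson log-likelihood is linear in spike counts with a gain-independent stimulus kernel, simply summing counts across time and modalities yields a near-optimal posterior. This is an illustrative sketch only; all parameters, tuning choices, and variable names are assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative ilPPC-style simulation (assumed parameters, not the paper's fitted model).
rng = np.random.default_rng(0)

n_neurons = 64
headings = np.linspace(-40.0, 40.0, 201)      # candidate headings (deg)
pref = np.linspace(-60.0, 60.0, n_neurons)    # preferred headings (deg)
sigma_tc = 15.0                               # tuning-curve width (deg)

def tuning(s):
    """Unit-gain Gaussian tuning curves; rows = stimuli, columns = neurons."""
    return np.exp(-0.5 * ((np.atleast_1d(s)[:, None] - pref) / sigma_tc) ** 2)

true_heading = 8.0
T = 50                                        # time steps within a trial
# Time-varying gains standing in for visual speed and |vestibular acceleration|.
gain_vis = 2.0 * np.abs(np.sin(np.linspace(0.0, np.pi, T)))
gain_vest = 1.5 * np.abs(np.cos(np.linspace(0.0, np.pi, T)))

f_true = tuning(true_heading)[0]              # mean tuning profile at the true heading

# "LIP-style" accumulation: simply add spike counts across time and modalities.
R = np.zeros(n_neurons)
for t in range(T):
    R += rng.poisson(gain_vis[t] * f_true)
    R += rng.poisson(gain_vest[t] * f_true)

# For independent Poisson neurons the summed log-likelihood is
#   log p(spikes | s) = R . log f(s) - (sum of gains) * sum_i f_i(s) + const,
# so the data enter only through the summed counts R. With dense,
# translation-invariant tuning the second term is nearly flat in s,
# which is the "invariance" that makes the simple sum near-optimal.
F = tuning(headings)                          # shape (n_headings, n_neurons)
log_post = np.log(F + 1e-12) @ R - (gain_vis.sum() + gain_vest.sum()) * F.sum(axis=1)
log_post -= log_post.max()
posterior = np.exp(log_post) / np.exp(log_post).sum()

print(f"true heading: {true_heading:.1f} deg, "
      f"MAP estimate: {headings[np.argmax(posterior)]:.1f} deg")
```

In the paper's setting the gains would be carried by the sensory populations themselves; here they are injected by hand purely to show that time-varying reliability requires no machinery beyond summation of spike counts.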


Source
http://dx.doi.org/10.1016/j.neuron.2019.08.038


Similar Publications

Prior research on global-local processing has focused on hierarchical objects in the visual modality, while the real world involves multisensory interactions. The present study investigated whether the simultaneous presentation of auditory stimuli influences the recognition of visually hierarchical objects. We added four types of auditory stimuli to the traditional visual hierarchical-letters paradigm: no sound (visual-only), a pure tone, a spoken letter that was congruent with the required response (response-congruent), or a spoken letter that was incongruent with it (response-incongruent).


Multisensory processing has long been recognized to enhance perception, cognition, and actions in adults. However, there is currently limited understanding of how multisensory stimuli, in comparison to unisensory stimuli, contribute to the development of both motor and verbally assessed working memory (WM) in children. Thus, the current study aimed to systematically review and meta-analyze the associations between the multisensory processing of auditory and visual stimuli, and performance on simple and more complex WM tasks, in children from birth to 15 years old.


EXPRESS: When and how visual aesthetic features influence approach-avoidance motivated behavior.

Q J Exp Psychol (Hove)

August 2025

Centre for Multisensory Marketing, Department of Marketing, BI Norwegian Business School, Nydalsveien 37, 0484 Oslo, Norway.

For decades, researchers have explored the relationship between aesthetic features, such as symmetry and complexity, and preference. Likewise, philosophers and psychologists alike have pondered the differences between preference and behavior. Nevertheless, little is known about the relationship between aesthetic preference and motivation.


This paper explores the impact of technology-mediated (TM) communication on interpersonal synchrony through the integrated lens of social neuroscience, embodied cognition, and Conceptual Metaphor Theory (CMT). It focuses particularly on the case of remote working, which exemplifies the challenges and adaptations required when social interactions shift from face-to-face (FTF) to digital environments. While FTF communication enables interpersonal synchrony through rich sensorimotor cues, such as gaze, posture, and gesture, TM communication often reduces or distorts these embodied signals.


Distinct audio and visual accumulators co-activate motor preparation for multisensory detection.

Nat Hum Behav

August 2025

School of Electrical and Electronic Engineering and UCD Centre for Biomedical Engineering, University College Dublin, Dublin, Ireland.

Detecting targets in multisensory environments is an elemental brain function, but it is not yet known whether information from different sensory modalities is accumulated by distinct processes, and, if so, whether the processes are subject to separate decision criteria. Here we address this in two experiments (n = 22, n = 21) using a paradigm design that enables neural evidence accumulation to be traced through a centro-parietal positivity and modelled alongside response time distributions. Through analysis of both redundant (respond-to-either-modality) and conjunctive (respond-only-to-both) audio-visual detection data, joint neural-behavioural modelling, and a follow-up onset-asynchrony experiment, we found that auditory and visual evidence is accumulated in distinct processes during multisensory detection, and cumulative evidence in the two modalities sub-additively co-activates a single, thresholded motor process during redundant detection.
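
As a rough illustration of the architecture summarised above (not the authors' fitted model), the sketch below simulates two separate accumulators whose cumulative totals jointly drive a single thresholded motor stage, with a sub-additivity factor below 1 standing in for sub-additive co-activation. All parameter values and function names are assumptions chosen only to produce plausible detection times.

```python
import numpy as np

# Hypothetical two-accumulator co-activation sketch (assumed parameters).
rng = np.random.default_rng(1)

def redundant_detection_rt(drift_a=8.0, drift_v=8.0, noise=2.0,
                           subadditivity=0.8, theta_motor=10.0,
                           dt=0.001, max_t=3.0):
    """Simulate one redundant (respond-to-either) trial; return detection time in seconds."""
    acc_a = acc_v = 0.0                      # distinct auditory / visual accumulators
    t = 0.0
    while t < max_t:
        acc_a = max(0.0, acc_a + drift_a * dt + noise * np.sqrt(dt) * rng.normal())
        acc_v = max(0.0, acc_v + drift_v * dt + noise * np.sqrt(dt) * rng.normal())
        # Cumulative evidence from both modalities co-activates one motor process,
        # scaled sub-additively rather than summing in full.
        if subadditivity * (acc_a + acc_v) >= theta_motor:
            return t
        t += dt
    return np.nan                            # no detection within the trial

rts = np.array([redundant_detection_rt() for _ in range(500)])
print("mean simulated detection time (s):", round(float(np.nanmean(rts)), 3))
```

For a conjunctive (respond-only-to-both) variant, one would instead require each accumulator to satisfy its own criterion before the motor stage is triggered.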
