Information obtained from multiple sensory modalities, such as vision and touch, is integrated to yield a holistic percept. Because haptic interaction usually involves cross-modal sensory experiences, an apparatus is needed that can characterize both how a biological system integrates visual-tactile sensory information and how a robotic device infers object information from vision and touch. In the present study, we develop a novel visual-tactile cross-modal integration stimulator that consists of an LED panel for presenting visual stimuli and a tactile stimulator with three degrees of freedom that can present tactile motion stimuli with arbitrary motion direction, speed, and indentation depth into the skin. The apparatus can present cross-modal stimuli in which the spatial locations of the visual and tactile stimulations are perfectly aligned. We presented visual-tactile stimuli in which the visual and tactile motion directions were either congruent or incongruent, and human observers reported the perceived direction of visual motion. Results showed that the perceived direction of visual motion can be biased by the direction of tactile motion when the visual signal is weakened. The results also showed that visual-tactile motion integration follows the rule of temporal congruency of multimodal inputs, a fundamental property of cross-modal integration.
| Download full-text PDF | Source |
|---|---|
| http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3715219 | PMC |
| http://dx.doi.org/10.3390/s130607212 | DOI Listing |
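The stimulus design described in the abstract above (congruent versus incongruent visual and tactile motion directions, presented while the visual signal is weakened) can be illustrated with a short sketch. The following Python snippet is a hypothetical illustration only: the function name `make_trial`, the candidate direction set, and the speed and indentation values are assumptions made for demonstration and are not taken from the paper's stimulus-control software.

```python
import random

# Hypothetical sketch of generating congruent vs. incongruent
# visual-tactile motion conditions for the paradigm described above.
# All names and parameter values are illustrative assumptions.

DIRECTIONS_DEG = [0, 90, 180, 270]  # candidate motion directions (assumed)

def make_trial(visual_coherence, congruent, rng=random):
    """Return one trial's visual and tactile motion parameters."""
    visual_dir = rng.choice(DIRECTIONS_DEG)
    # Congruent: tactile motion shares the visual direction;
    # incongruent: tactile motion is rotated by 180 degrees.
    tactile_dir = visual_dir if congruent else (visual_dir + 180) % 360
    return {
        "visual_direction_deg": visual_dir,
        "visual_coherence": visual_coherence,  # weakened visual signal -> low value
        "tactile_direction_deg": tactile_dir,
        "tactile_speed_mm_s": 20.0,            # assumed example value
        "indentation_depth_mm": 1.0,           # assumed example value
    }

if __name__ == "__main__":
    # Alternate congruent and incongruent trials with a weak visual signal.
    trials = [make_trial(visual_coherence=0.1, congruent=(i % 2 == 0))
              for i in range(10)]
    for trial in trials:
        print(trial)
```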
Eur J Neurosci
September 2025
The Tampa Human Neurophysiology Lab, Department of Neurosurgery, Brain and Spine, Morsani College of Medicine, University of South Florida, Tampa, Florida, USA.
Sensory areas exhibit modular selectivity to stimuli, but they can also respond to features outside of their basic modality. Several studies have shown cross-modal plastic modifications between visual and auditory cortices; however, the exact mechanisms of these modifications are not yet completely known. To this end, we investigated the effect of 12 min of visual versus sound adaptation (the forceful application of an optimal/nonoptimal stimulus to the neuron[s] under observation) on infragranular and supragranular primary visual cortex (V1) neurons of the cat (Felis catus).
ACS Nano
September 2025
International School of Microelectronics, Dongguan University of Technology, Dongguan 523808, China.
Mimicking human brain functionalities with neuromorphic devices represents a pivotal breakthrough in developing bioinspired electronic systems. The human somatosensory system provides critical environmental information and facilitates responses to harmful stimuli, endowing us with strong adaptive capabilities. However, current sensing technologies often suffer from insufficient sensitivity, limited dynamic response, and integration challenges.
Psychol Aging
August 2025
Speech-Language-Hearing Center, School of Foreign Languages, Shanghai Jiao Tong University.
Previous studies have produced inconsistent findings regarding whether age-related declines in emotion perception affect various verbal and nonverbal channels to the same extent and whether they are linked to cognitive ability. This study systematically explored age-related differences in multisensory emotional speech perception and their associations with overall cognitive functioning. Thirty-three older adults (22 females) and 32 young adults (22 females) completed two Stroop-like tests examining the perceptual salience of verbal semantics, vocal prosody, and facial expressions.
Cognition
August 2025
Gonda Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat Gan 5290002, Israel.
Perception of a current stimulus is influenced by one's immediately preceding sensory experience. This phenomenon, termed "serial dependence", affects perception of isochronous rhythm. However, it is unknown whether serial dependence affects perception of more complex temporal dynamics, such as changing tempo.
Sci Rep
August 2025
RIKEN Center for Brain Science, 2-1 Hirosawa, Wako, 351-0198, Saitama, Japan.
To navigate the environment and search for hosts, mosquitoes utilize multiple sensory cues including carbon dioxide (CO₂), visual, olfactory, and humidity cues. However, how mosquitoes shape their behavior by integrating these cues is poorly understood. Here we monitored the flight maneuvers of Aedes albopictus in a virtual reality environment where sensory cues were presented in open- or closed-loop.