
Visual nudging of navigation strategies improves frequency discrimination during auditory-guided locomotion.

Category Ranking: 98%
Total Visits: 921
Avg Visit Duration: 2 minutes
Citations: 20

Article Abstract

Perception in natural environments requires integrating multisensory inputs while navigating our surroundings. During locomotion, sensory cues such as vision and audition change coherently, providing crucial environmental information. This integration may affect perceptual thresholds due to sensory interference. Vision often dominates in multimodal contexts, overshadowing auditory information and potentially degrading audition. While traditional laboratory experiments offer controlled insights into sensory integration, they often fail to replicate the dynamic, multisensory interactions of real-world behavior. We used a naturalistic paradigm in which participants navigated an arena, searching for a target guided by position-dependent auditory cues. Previous findings showed that frequency discrimination thresholds during self-motion matched those in stationary paradigms, even though participants often relied on visually dominated navigation instead of auditory feedback. This suggested that vision might affect auditory perceptual thresholds in naturalistic settings. Here, we manipulated visual input to examine its effect on frequency discrimination and search strategy selection. By degrading visual input, we nudged participants' attention toward audition, leveraging subtle sensory adjustments to promote adaptive use of auditory cues without restricting their freedom of choice. This approach thus explores how attentional shifts influence multisensory integration during self-motion. Our results show that frequency discrimination thresholds improved when visual input was restricted, suggesting that reducing visual interference can increase auditory sensitivity. This is consistent with adaptive behavioral theories, which hold that individuals can dynamically adjust their perceptual strategies to leverage the most reliable sensory inputs. These findings contribute to a better understanding of multisensory integration, highlighting the flexibility of sensory systems in complex environments.
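
The paradigm described above ties the auditory cue to the participant's position relative to a hidden target. As a purely illustrative sketch (not the authors' implementation), the Python snippet below assumes a simple linear mapping from distance-to-target to pure-tone frequency; the function name, frequency range, and arena scale are all hypothetical.

# Hypothetical sketch of a position-dependent auditory cue:
# the participant's distance to a hidden target is mapped onto a
# pure-tone frequency, so the cue pitch rises as they approach it.
# All names and values here are illustrative, not taken from the study.
import math

def cue_frequency(participant_xy, target_xy,
                  f_far=400.0, f_near=1600.0, max_dist=10.0):
    """Return the cue frequency (Hz) for a given participant position.

    Linearly interpolates between f_far (at or beyond max_dist metres
    from the target) and f_near (at the target location).
    """
    dx = participant_xy[0] - target_xy[0]
    dy = participant_xy[1] - target_xy[1]
    dist = min(math.hypot(dx, dy), max_dist)
    # Closer to the target -> higher frequency.
    return f_far + (f_near - f_far) * (1.0 - dist / max_dist)

if __name__ == "__main__":
    target = (5.0, 5.0)
    for pos in [(0.0, 0.0), (3.0, 4.0), (4.5, 5.0), (5.0, 5.0)]:
        print(pos, round(cue_frequency(pos, target), 1), "Hz")

The actual mapping used in the study may well be nonlinear or parameterized differently; the point of the sketch is only that the cue frequency, and hence the stimulus whose changes participants must discriminate, varies coherently with self-motion.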

Source

PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11963732
DOI: http://dx.doi.org/10.3389/fnins.2025.1535759

Publication Analysis

Top Keywords

frequency discrimination: 16
visual input: 12
perceptual thresholds: 8
auditory cues: 8
discrimination thresholds: 8
multisensory integration: 8
sensory: 6
auditory: 6
visual: 5
visual nudging: 4

Similar Publications