Although real-life events are multisensory, how audiovisual objects are stored in working memory is an open question. At a perceptual level, evidence shows that both top-down and bottom-up attentional processes can play a role in multisensory interactions. To understand how attention and multisensory processes interact in working memory, we designed an audiovisual delayed match-to-sample task in which participants were presented with one or two audiovisual memory items, followed by an audiovisual probe. In different blocks, participants were instructed to either (a) attend to the auditory features, (b) attend to the visual features, or (c) attend to both auditory and visual features. Participants indicated whether the task-relevant features of the probe matched any of the task-relevant features or objects held in working memory. Behavioral results showed interference from task-irrelevant features, suggesting bottom-up integration of audiovisual features and their automatic encoding into working memory, irrespective of task relevance. Yet, event-related potential analyses revealed no evidence for active maintenance of these task-irrelevant features, although they clearly taxed greater attentional resources during recall. Notably, alpha oscillatory activity revealed that linking information between auditory and visual modalities imposed greater attentional demands at retrieval. Overall, these results offer critical insights into how, and at which processing stage, multisensory interactions occur in working memory.
Download full-text PDF | Source
---|---
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11843526 | PMC
http://dx.doi.org/10.1111/psyp.70018 | DOI Listing