Reward-predictive items capture attention even when task-irrelevant. While value-driven attention typically generalizes to stimuli sharing critical reward-associated features (e.g., red), recent evidence suggests an alternative generalization mechanism based on feature relationships (e.g., redder). Here, we investigated whether relational coding of reward-associated features operates across different learning contexts by manipulating search mode and target-distractor similarity. Results showed that singleton search training induced value-driven relational attention regardless of target-distractor similarity (Experiments 1a-1b). In contrast, feature search training produced value-driven relational attention only when targets and distractors were dissimilar, but not when they were similar (Experiments 2a-2c). These findings indicate that coarse selection training (singleton search or feature search among dissimilar items) promotes relational coding of reward-associated features, while fine selection (feature search among similar items) engages precise feature coding. The precision of target selection during reward learning thus critically determines value-driven attentional mechanisms.
| Download full-text PDF | Source |
|---|---|
| http://www.ncbi.nlm.nih.gov/pmc/articles/PMC12311181 | PMC |
| http://dx.doi.org/10.1038/s41539-025-00342-1 | DOI Listing |
NPJ Sci Learn
July 2025
Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou, China.
Acta Psychol (Amst)
August 2025
Behavioral Epidemiology, Institute of Clinical Psychology and Psychotherapy, Technische Universität Dresden, Germany; General and Experimental Psychology, Department of Psychology, Ludwig-Maximilians-Universität München, Munich, Germany. Electronic address:
Human visual attention is strongly influenced by rewards, which affect both top-down and bottom-up attentional processes. The value-driven attentional capture (VDAC) paradigm, introduced by Anderson et al. (2011b), has had a significant impact on the field of visual attention.
Nat Neurosci
June 2025
Department of Biological Sciences, Purdue University, West Lafayette, IN, USA.
Sensory perception requires the processing of stimuli from both sides of the body. Yet, how neurons bind stimulus information across the hemispheres to create a unified percept remains unknown. Here we perform large-scale recordings from neurons in the left and right primary somatosensory cortex (S1) in mice performing a task requiring active whisker touch to coordinate stimulus features across hemispheres.
Psychon Bull Rev
April 2025
Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou, China.
Attention is rapidly directed to stimuli associated with rewards in past experience, independent of current task goals and the physical salience of those stimuli. However, despite the robust attentional priority given to reward-associated features, studies often indicate negligible priority toward previously rewarded locations. Here, we propose a relational account of value-driven attention, a mechanism that relies on the spatial relationship between items to achieve value-guided selection.
J Exp Psychol Hum Percept Perform
March 2024
Centre for Human Brain Health, University of Birmingham.
Humans use selective attention to prioritize visual features, like color or shape, as well as discrete spatial locations, and these effects are sensitive to the experience of reward. Reward-associated features and locations are accordingly prioritized from early in the visual hierarchy. Attention is also sensitive to the establishment of visual objects: selection of one constituent object part often leads to prioritization of other locations on that object.