Older adults perform worse than young adults at identifying standard emotional expressions from facial photographs. However, it has been argued that age differences may be reduced when more realistic emotion-identification tasks are used. In two studies, we tested the role of context and multimodal presentation in age differences in emotion perception. Younger and older adults performed tasks of labeling complex emotions from dynamic stimuli. In Study 1, older adults were worse than younger adults at recognizing emotions from silent, decontextualized video clips of facial expressions but better than younger adults in a contextually rich multimodal film task. In Study 2, there was no age advantage in emotion perception in the contextually rich task when auditory cues were removed (the video clips were presented in the visual modality only). Across both studies, there was also evidence that crystallized ability may be more important than working memory capacity in decoding contextualized emotions. Overall, these results indicate that multimodal context particularly benefits older adults' ability to decode others' emotions.
DOI: http://dx.doi.org/10.1080/13825585.2025.2548224