Background: The development of automatic emotion recognition models from smartphone videos is a crucial step toward the dissemination of psychotherapeutic app interventions that encourage emotional expressions. Existing models focus mainly on the 6 basic emotions while neglecting other therapeutically relevant emotions. To support this research, we introduce the novel Stress Reduction Training Through the Recognition of Emotions Wizard-of-Oz (STREs WoZ) dataset, which contains facial videos of 16 distinct, therapeutically relevant emotions.
Objective: This study aimed to develop deep learning-based automatic facial emotion recognition (FER) models for binary (positive vs negative) and multiclass emotion classification tasks, assess the models' performance, and validate them by comparing the models with human observers.
Methods: The STREs WoZ dataset contains 14,412 facial videos of 63 individuals displaying the 16 emotions. The selfie-style videos were recorded during stress reduction training using front-facing smartphone cameras in a nonconstrained laboratory setting. Automatic FER models using both appearance and deep-learned features for binary and multiclass emotion classification were trained on the STREs WoZ dataset. The appearance features were based on the Facial Action Coding System and extracted with OpenFace. The deep-learned features were obtained through a ResNet50 model. For our deep learning models, we used the appearance features, the deep-learned features, and their concatenation as inputs. We used 3 recurrent neural network (RNN)-based architectures: RNN-convolution, RNN-attention, and RNN-average networks. For validation, 3 human observers were also trained in binary and multiclass emotion recognition. A test set of 3018 facial emotion videos of the 16 emotions was classified by both the automatic FER models and the human observers. Performance was assessed with unweighted average recall (UAR) and accuracy.
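The abstract names an RNN-attention network but does not give its architecture. As an illustration only, the sketch below shows the attention-pooling idea commonly used in such models: per-frame features from a video are scored, the scores are softmax-normalized, and the frames are averaged with those weights to yield one clip-level feature. The function name, the single learned score vector `w`, and the feature shapes are assumptions, not details from the study.

```python
import numpy as np

def attention_pool(frame_feats, w):
    """Attention pooling over per-frame features.

    frame_feats: array of shape (T, D), one D-dim feature per video frame.
    w: score vector of shape (D,), assumed learned elsewhere.
    Returns a single (D,) clip-level feature.
    """
    scores = frame_feats @ w                 # one scalar score per frame, shape (T,)
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()                 # attention weights sum to 1
    return weights @ frame_feats             # weighted average of frames, shape (D,)
```

With a zero score vector the weights are uniform, so the pooled feature reduces to a plain frame average — matching the intuition that the RNN-average network is the unweighted special case of attention pooling.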
Results: Models using appearance features outperformed both those using deep-learned features and those combining the two feature types in both tasks, with the attention network using appearance features emerging as the best-performing model. The attention network achieved a UAR of 92.9% in the binary classification task, and accuracy values ranged from 59.0% to 90.0% in the multiclass classification task. Human performance was comparable to that of the automatic FER model in the binary classification task, with a UAR of 91.0%, and superior in the multiclass classification task, with accuracy values ranging from 87.4% to 99.8%.
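The UAR values above are means of per-class recall, which weights every emotion class equally regardless of how many test videos it has — important here, since class sizes in a 16-emotion test set need not be balanced. A minimal sketch of the metric (function and variable names are illustrative, not from the study):

```python
import numpy as np

def unweighted_average_recall(y_true, y_pred, classes):
    """UAR: the mean of per-class recall, each class counted equally."""
    recalls = []
    for c in classes:
        mask = (y_true == c)                       # test items whose true label is c
        recalls.append((y_pred[mask] == c).mean()) # fraction of them predicted as c
    return float(np.mean(recalls))
```

This is equivalent to scikit-learn's `recall_score(..., average="macro")`; on a balanced test set it coincides with plain accuracy.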
Conclusions: Future studies are needed to enhance the performance of automatic FER models for practical use in psychotherapeutic apps. Nevertheless, this study represents an important first step toward advancing emotion-focused psychotherapeutic interventions via smartphone apps.
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC12268218
DOI: http://dx.doi.org/10.2196/68942
PLoS One
September 2025
Department of Psychology, University of Duisburg-Essen, Essen, Germany.
The susceptibility to emotional contagion has been psychometrically addressed by the self-reported Emotional Contagion Scale. With the present research, we validated a German adaptation of this scale and developed a brief mimicry version by selecting only the four items that explicitly address the overt subprocess of mimicry. Across three studies (N1 = 195, N2 = 442, N3 = 180), involving various external measures of empathy, general personality domains, emotion recognition, and other constructs, the total German Emotional Contagion Scale demonstrated sound convergent and discriminant validity.
Rheumatology (Oxford)
September 2025
Department of Rheumatology & Clinical Immunology, University Medical Centre Utrecht, Utrecht, The Netherlands.
Objectives: Many patients with systemic sclerosis (SSc) experience impaired hand function, yet the precise nature and impact of this impairment remains unclear. In this study, we explored the determinants of hand function impairment in SSc from a patient perspective and its impact on daily life. Additionally, we identified unmet care needs related to hand function impairment.
Prog Mol Biol Transl Sci
September 2025
Nanobiology and Nanozymology Research Laboratory, National Institute of Animal Biotechnology (NIAB), Opposite Journalist Colony, Near Gowlidoddy, Hyderabad, Telangana, India; Regional Centre for Biotechnology (RCB), Faridabad, Haryana, India. Electronic address:
Biosensors are rapidly emerging as a key tool in animal health management and are therefore gaining significant recognition in the global market. Wearable sensors, integrated with advanced biosensing technologies, provide highly specialized devices for measuring both individual and multiple physiological parameters of animals, as well as monitoring their environment. These sensors are not only precise and sensitive but also reliable, user-friendly, and capable of accelerating the monitoring process.
Rev Esc Enferm USP
September 2025
Universidade Federal do Triângulo Mineiro, Uberaba, MG, Brazil.
Objective: To evaluate the impact of an educational intervention for primary health care nurses on nursing care for women with signs of postpartum depression.
Method: Quasi-experimental, before-and-after study carried out with 14 primary health care nurses from a municipality, who participated in an educational intervention on nursing care for women with signs of postpartum depression. Qualitative data analysis was carried out before and after the intervention, using Bardin's thematic content analysis.
Ear Hear
September 2025
Department of Otorhinolaryngology, University Medical Center Groningen (UMCG), University of Groningen, Groningen, the Netherlands.
Objectives: Alexithymia is characterized by difficulties in identifying and describing one's own emotions. Alexithymia has previously been associated with deficits in the processing of emotional information at both behavioral and neurobiological levels, and some studies have shown elevated levels of alexithymic traits in adults with hearing loss. This explorative study investigated alexithymia in young and adolescent school-age children with hearing aids in relation to (1) a sample of age-matched children with normal hearing, (2) age, (3) hearing thresholds, and (4) vocal emotion recognition.