Category Ranking: 98%
Total Visits: 921
Avg Visit Duration: 2 minutes
Citations: 20

Article Abstract

Emotion recognition via EEG signals and facial analysis has become a key aspect of human-computer interaction and affective computing, enabling scientists to gain insight into human behavior. Classic emotion recognition methods usually rely on controlled stimuli, such as music and images, which limits their ecological validity and scope. This paper proposes the EmoTrans model, which uses the DEAP dataset to analyze physiological signals and facial video recordings. The dataset consists of EEG recordings from 32 participants who each viewed 40 one-minute movie clips, along with facial videos from 22 participants, supporting analysis of emotional states along four dimensions: valence, arousal, dominance, and liking. To strengthen the model's validity, expert validation in the form of a survey of psychologists was conducted. The model integrates features extracted from EEG signals in the time, frequency, and wavelet domains, as well as facial video data, to provide a comprehensive understanding of emotional states. The proposed EmoTrans architecture achieves accuracies of 89.3%, 87.8%, 88.9%, and 89.1% for arousal, valence, dominance, and liking, respectively, and a classification accuracy of 89% for emotions such as happiness, excitement, calmness, and distress, among others. The statistical significance of these performance improvements was confirmed using a paired t-test, which showed that EmoTrans significantly outperforms baseline models. The model was validated with machine learning and deep learning classifiers as well as leave-one-subject-out cross-validation (LOSO-CV). The proposed attention-based architecture effectively prioritizes the most relevant features from the EEG and facial data, pushing the boundaries of emotion classification and offering a more nuanced understanding of human emotions across different states.
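The validation protocol described in the abstract, leave-one-subject-out cross-validation followed by a paired t-test against a baseline, can be sketched as below. This is a minimal illustration only: the random features, logistic-regression classifier, and placeholder baseline scores are assumptions standing in for the paper's actual EmoTrans pipeline and extracted EEG/facial features.

```python
import numpy as np
from scipy import stats
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)

# Placeholder stand-in for extracted features: 32 subjects x 40 trials,
# 16 features per trial, binary high/low arousal labels.
n_subjects, n_trials, n_features = 32, 40, 16
X = rng.normal(size=(n_subjects * n_trials, n_features))
y = rng.integers(0, 2, size=n_subjects * n_trials)
groups = np.repeat(np.arange(n_subjects), n_trials)  # one group per subject

# LOSO-CV: each fold holds out every trial from exactly one subject,
# so performance reflects generalization to unseen individuals.
clf = LogisticRegression(max_iter=1000)
fold_acc = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"LOSO-CV mean accuracy: {fold_acc.mean():.3f} over {len(fold_acc)} folds")

# Paired t-test on per-fold accuracies of the model vs a baseline
# (baseline scores here are synthetic placeholders).
baseline_acc = rng.uniform(0.45, 0.55, size=len(fold_acc))
t_stat, p_value = stats.ttest_rel(fold_acc, baseline_acc)
print(f"paired t-test: t={t_stat:.2f}, p={p_value:.4f}")
```

The paired (rather than independent) t-test is the appropriate choice here because both models are scored on the identical set of held-out subjects, so per-fold scores are matched pairs.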


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC12216904
DOI: http://dx.doi.org/10.1038/s41598-025-98404-2

Publication Analysis

Top Keywords

emotion recognition (12)
eeg signals (12)
signals facial (12)
recognition eeg (8)
facial analysis (8)
expert validation (8)
facial video (8)
emotional states (8)
dominance liking (8)
facial (6)

Similar Publications

Beyond the methodological binary: coproduction as the third pillar of mental health science.

BMJ Ment Health

September 2025

Independent Researcher, Cardiff, Cardiff, UK

Background: Mental health research has long been structured around qualitative and quantitative methodologies, often marginalising experiential knowledge and reinforcing hierarchies of expertise. Although coproduction has gained traction as a participatory approach, its methodological status remains contested, leading to inconsistent practices and risks of tokenism.

Objective: This paper explores whether coproduction should be recognised not merely as a participatory ideal but as a third methodological pillar in mental health research, with distinct philosophical, ethical and practical foundations.


Choral harmony: the role of collective singing in ritual, cultural identity and cognitive-affective synchronisation in the age of AI.

Disabil Rehabil Assist Technol

September 2025

School of Drama, Film and Television, Shenyang Conservatory of Music, Shenyang, China.

This study examines how choral singing functions as a mechanism for sustaining ritual practice and reinforcing cultural identity. By integrating perspectives from musicology, social psychology, and cognitive science, it explores how collective vocal performance supports emotional attunement, group cohesion, and symbolic memory in culturally diverse contexts. A mixed-methods approach was applied, combining ethnographic observation, survey-based data, and cognitive measures with AI-informed frameworks such as voice emotion recognition and neural synchrony modeling.


The effect of face masks on confusion of emotional expressions.

PLoS One

September 2025

Department of Psychology & Sociology, Texas A&M University - Corpus Christi, Corpus Christi, Texas, United States of America.

While the use of personal protective equipment protects healthcare workers against transmissible disease, it also obscures the lower facial regions that are vital for transmitting emotion signals. Previous studies have found that face coverings can impair recognition of emotional expressions, particularly those that rely on signals from the lower regions of the face, such as disgust. Recent research on the individual differences that may influence expression recognition, such as emotional intelligence, has shown mixed results.


Brain ischemia is a major global cause of disability, frequently leading to psychoneurological issues. This study investigates the effects of 4-aminopyridine (4-AP) on anxiety, cognitive impairment, and potential underlying mechanisms in a mouse model of medial prefrontal cortex (mPFC) ischemia. Mice with mPFC ischemia were treated with normal saline (NS) or different doses of 4-AP (250, 500, and 1000 µg/kg) for 14 consecutive days.


As with other traumas, children and adolescents are especially sensitive and vulnerable to the effects of earthquakes. This study aimed to understand the earthquake experiences of adolescent survivors. It is a qualitative study that used the photovoice method.
