Writing over a century ago, Darwin hypothesized that vocal expression of emotion dates back to our earliest terrestrial ancestors. If this hypothesis is true, we should expect to find cross-species acoustic universals in emotional vocalizations. Studies suggest that acoustic attributes of aroused vocalizations are shared across many mammalian species, and that humans can use these attributes to infer emotional content. But do these acoustic attributes extend to non-mammalian vertebrates? In this study, we asked human participants to judge the emotional content of vocalizations of nine vertebrate species representing three different biological classes: Amphibia, Reptilia (non-aves and aves) and Mammalia. We found that humans are able to identify higher levels of arousal in vocalizations across all species. This result was consistent across different language groups (English, German and Mandarin native speakers), suggesting that this ability is biologically rooted in humans. Our findings indicate that humans use multiple acoustic parameters to infer relative arousal in vocalizations for each species, but mainly rely on fundamental frequency and spectral centre of gravity to identify higher arousal vocalizations across species. These results suggest that fundamental mechanisms of vocal emotional expression are shared among vertebrates and could represent a homologous signalling system.
Download full-text PDF | Source
---|---
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5543225 | PMC
http://dx.doi.org/10.1098/rspb.2017.0990 | DOI Listing
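The abstract above singles out fundamental frequency and spectral centre of gravity as the main cues listeners rely on across species. As a minimal, self-contained sketch of what those two parameters measure (not the authors' analysis pipeline, and using a synthetic tone rather than real vocalization recordings), the following NumPy snippet computes both for a single audio frame:

```python
import numpy as np

def spectral_centroid(frame, sr):
    """Spectral centre of gravity: amplitude-weighted mean frequency of one frame."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)
    total = spectrum.sum()
    return float(np.sum(freqs * spectrum) / total) if total > 0 else 0.0

def estimate_f0(frame, sr, fmin=50.0, fmax=2000.0):
    """Rough fundamental-frequency estimate from the autocorrelation peak."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lag_lo, lag_hi = int(sr / fmax), min(int(sr / fmin), len(ac) - 1)
    if lag_hi <= lag_lo:
        return 0.0
    peak = lag_lo + int(np.argmax(ac[lag_lo:lag_hi]))
    return sr / peak

# Usage: a synthetic 440 Hz tone stands in for a real vocalization frame.
sr = 22050
t = np.arange(0, 0.05, 1.0 / sr)
tone = np.sin(2 * np.pi * 440.0 * t)
print(estimate_f0(tone, sr), spectral_centroid(tone, sr))
```

In practice a dedicated pitch tracker (e.g., pYIN or Praat's autocorrelation method) would be preferred over this naive autocorrelation peak, but the sketch shows how the two cues are derived from the same short frame of audio.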
Trends Hear
August 2025
Department of Otorhinolaryngology/Head and Neck Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands.
Emotions can be communicated through visual and dynamic characteristics such as smiles and gestures, but also through auditory channels such as laughter, music, and human speech. Pupil dilation has become a notable marker for visual emotion processing; however, the pupil's sensitivity to emotional sounds, specifically speech, remains largely underexplored. This study investigated the processing of emotional pseudospeech, i.e., speech-like sentences devoid of semantic content.
Behav Brain Res
August 2025
Psychology and Neuroscience, Stanford University, United States.
Over a quarter of a century later, most rodent researchers know that specific types of rat ultrasonic vocalizations (USVs) appear to index distinct affective states endowed with arousal, value, and motivational force. Few know the story, however, of how we accidentally stumbled upon 50 kilohertz (50 kHz) USVs in the context of rat play by turning the wrong dial on a bat detector, which I recollect here. The tale of that mistake highlights the critical roles of serendipity, preparation, openness, persistence, and a supportive environment in scientific discovery.
Conscious Cogn
September 2025
Department of Education, University of Helsinki, Finland.
While reading narrative texts, readers' attention often fluctuates from the text (e.g., immersion) to text-unrelated thoughts.
Psychother Res
July 2025
Department of Psychology, Bar-Ilan University, Ramat-Gan, Israel.
Objective: Major depressive disorder (MDD) is associated with ineffective affect regulation. Vocal data can shed light on communication and expression during psychotherapy and provide high-resolution data for the study of affective arousal dynamics. Computerized vocal analyses were used to examine the extent to which intrapersonal and interpersonal vocal-arousal dynamics were linked to session outcomes, and whether a dampening, as compared to an amplifying, arousal trajectory within a session would moderate this association.
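To make the dampening-versus-amplification contrast concrete, here is a purely illustrative sketch (not the study's actual computerized vocal analysis): an arousal proxy sampled over a session is labelled by the sign of its fitted slope. The per-minute values are hypothetical placeholders.

```python
import numpy as np

def trajectory_label(arousal_per_minute):
    """Label a session's arousal trajectory by the slope of a straight-line fit."""
    minutes = np.arange(len(arousal_per_minute))
    slope = np.polyfit(minutes, arousal_per_minute, 1)[0]
    return ("dampening" if slope < 0 else "amplification"), slope

# Hypothetical per-minute arousal proxy (e.g., mean vocal intensity or f0).
session = np.array([0.80, 0.72, 0.75, 0.60, 0.52, 0.45, 0.40])
print(trajectory_label(session))
```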
Commun Biol
July 2025
MoMiLab Research Unit, IMT School for Advanced Studies Lucca, Lucca, Italy.
Sleep is characterized by relative disconnection from the external environment and prompt reversibility in response to salient stimuli. During non-rapid eye movement (NREM) sleep, reactive electroencephalographic (EEG) slow waves (K-complexes, KC) are thought to both suppress the processing of external stimuli and open 'sentinel' windows during which further relevant inputs may be tracked. However, the extent to which a stimulus's relevance modulates the KC-related response remains unclear.