
Article Abstract

Background: Impairment of higher language functions associated with natural spontaneous speech in multiple sclerosis (MS) remains underexplored.

Objectives: We present a fully automated method for discriminating patients with MS from healthy controls based on lexical and syntactic linguistic features.

Methods: We enrolled 120 individuals with MS with Expanded Disability Status Scale scores ranging from 1 to 6.5 and 120 age-, sex-, and education-matched healthy controls. Linguistic analysis was performed with fully automated methods based on automatic speech recognition and natural language processing techniques, using eight lexical and syntactic features extracted from spontaneous discourse. Fully automated annotations were compared with human annotations.

Results: Compared with healthy controls, lexical impairment in MS consisted of an increase in content words (p = 0.037), a decrease in function words (p = 0.007), and overuse of verbs at the expense of nouns (p = 0.047), while syntactic impairment manifested as shorter utterance length (p = 0.002) and a lower number of coordinate clauses (p < 0.001). The fully automated language analysis approach enabled discrimination between MS and controls with an area under the curve of 0.70. A significant relationship was detected between shorter utterance length and lower Symbol Digit Modalities Test score (r = 0.25, p = 0.008). Strong associations between the majority of automatically and manually computed features were observed (r > 0.88, p < 0.001).
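The area under the curve of 0.70 reported above is the standard ROC-AUC statistic. As a minimal illustration (not the paper's classifier), AUC can be computed directly as the probability that a randomly chosen patient receives a higher score than a randomly chosen control, with ties counted half; the labels and scores below are made-up toy data:

```python
# Hedged sketch: ROC-AUC as a pairwise-ranking probability.
# Toy data only; not the study's features or model.

def roc_auc(labels, scores):
    """AUC = P(score of a random positive > score of a random negative),
    counting ties as 0.5."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0]          # 1 = patient, 0 = control (toy)
scores = [0.9, 0.7, 0.4, 0.6, 0.3, 0.2]
print(roc_auc(labels, scores))        # 8 of 9 patient/control pairs ranked correctly
```

In practice a library routine such as scikit-learn's `roc_auc_score` would be used, but the pairwise definition makes the reported 0.70 easy to interpret: 70% of patient/control pairs are ranked correctly.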

Conclusion: Automated discourse analysis has the potential to provide an easy-to-implement and low-cost language-based biomarker of cognitive decline in MS for future clinical trials.
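To make the feature set concrete, the sketch below computes simple proxies for four of the measures named in the abstract (content-word and function-word proportions, verb/noun ratio, mean utterance length) from POS-tagged utterances. The tag set, feature definitions, and sample data are illustrative assumptions, not the paper's exact pipeline, which runs automatic speech recognition followed by NLP on spontaneous speech:

```python
# Hedged sketch of lexical/syntactic features over POS-tagged utterances.
# Closed-class (function-word) tags here are an assumption, loosely based
# on the Universal Dependencies tag set.
FUNCTION_TAGS = {"DET", "ADP", "PRON", "CCONJ", "SCONJ", "AUX", "PART"}

def lexical_features(utterances):
    """utterances: list of utterances, each a list of (token, tag) pairs."""
    tokens = [pair for utt in utterances for pair in utt]
    n = len(tokens)
    n_function = sum(1 for _, tag in tokens if tag in FUNCTION_TAGS)
    n_verbs = sum(1 for _, tag in tokens if tag == "VERB")
    n_nouns = sum(1 for _, tag in tokens if tag == "NOUN")
    return {
        "content_word_ratio": (n - n_function) / n,
        "function_word_ratio": n_function / n,
        "verb_noun_ratio": n_verbs / max(n_nouns, 1),
        "mean_utterance_length": n / len(utterances),
    }

# Toy tagged sample: two utterances.
sample = [
    [("the", "DET"), ("dog", "NOUN"), ("ran", "VERB")],
    [("she", "PRON"), ("ran", "VERB"), ("and", "CCONJ"), ("jumped", "VERB")],
]
print(lexical_features(sample))
```

In a real pipeline the (token, tag) pairs would come from a POS tagger such as spaCy applied to ASR transcripts, and coordinate-clause counts would require a syntactic parse rather than tag counts alone.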


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10293520
DOI: http://dx.doi.org/10.1177/17562864231180719

Publication Analysis

Top Keywords

lexical syntactic (12)
fully automated (12)
healthy controls (12)
natural language (8)
language processing (8)
multiple sclerosis (8)
lexical (4)
syntactic deficits (4)
deficits analyzed (4)
automated (4)

Similar Publications

Purpose: Speech disfluencies are common in individuals who do not stutter, with estimates suggesting a typical rate of six per 100 words. Factors such as language ability, processing load, planning difficulty, and communication strategy influence disfluency. Recent work has indicated that bilinguals may produce more disfluencies than monolinguals, but the factors underlying disfluency in bilingual children are poorly understood.


Large language models (LLMs) can emulate many aspects of human cognition and have been heralded as a potential paradigm shift. They are proficient in chat-based conversation, but little is known about their ability to simulate spoken conversation. We investigated whether LLMs can simulate spoken human conversation.


Narrative skills involve retelling or generating stories, reflecting cognitive and communication development. This use of language is decontextualized and requires a fluent interplay of various components. Autistic children often demonstrate atypical language development and restricted communication tailored to specific needs.


Neuro-cognitive development of semantic and syntactic bootstrapping in 7- to 9-year-old children.

Cortex

July 2025

Department of Psychology and Human Development, Peabody College, Vanderbilt University, Nashville, TN, USA.

We examined longitudinal relations of brain and behavior assessing semantic and syntactic language bootstrapping in children from ages 7 to 10.5 years. This study is a direct extension of our earlier investigation of 5- to 7-year-old children (Wagley & Booth, 2021).


The human brain must add information to the acoustic speech signal in order to understand language. Many accounts propose that the prosodic structure of utterances (including their syllabic rhythm and speech melody), in combination with stored lexical knowledge, cues and interacts with higher-order abstract semantic and syntactic information. While cortical rhythms, particularly in the delta and theta band, synchronize to quasi-rhythmic low-level acoustic speech features, it remains unclear how the human brain encodes abstract speech properties in neural rhythms in the absence of an acoustic signal.
