Category Ranking: 98%
Total Visits: 921
Avg Visit Duration: 2 minutes
Citations: 20

Article Abstract

Sign language (SL) is the language of speech- and hearing-impaired individuals. Hand gestures are the primary modality of SL, used by speech- and hearing-challenged people to communicate among themselves and with hearing persons. Hand gesture detection now plays a vital role and is widely employed in numerous applications worldwide; such systems can support communication between humans and machines by assisting these groups of people. Machine learning (ML) is a subdivision of artificial intelligence (AI) that concentrates on developing methods that learn from data. The main challenge in hand gesture detection is that machines do not directly understand human language, so a standard medium is required to facilitate communication between humans and machines. Hand gesture recognition (GR) serves as this medium, enabling commands for computer interaction that specifically benefit hearing-impaired and elderly individuals. This study proposes a Gesture Recognition for Hearing Impaired People Using an Ensemble of Deep Learning Models with Improving Beluga Whale Optimization (GRHIP-EDLIBWO) model. The main intention of the GRHIP-EDLIBWO framework is to serve as a valuable tool for developing accessible communication systems for hearing-impaired individuals. To accomplish this, the GRHIP-EDLIBWO method first performs image preprocessing with a Sobel filter (SF) to enhance edge detection and extract critical gesture features. For feature extraction, a squeeze-and-excitation capsule network (SE-CapsNet) captures spatial hierarchies and complex relationships within gesture patterns. In addition, an ensemble of classifiers comprising a bidirectional gated recurrent unit (BiGRU), a variational autoencoder (VAE), and a bidirectional long short-term memory (BiLSTM) network is employed. Finally, the improved beluga whale optimization (IBWO) method is applied to tune the hyperparameters of the three ensemble models. To achieve a robust classification result with the GRHIP-EDLIBWO approach, extensive simulations are conducted on an Indian SL (ISL) dataset. Performance validation of the GRHIP-EDLIBWO approach showed a superior accuracy of 98.72% over existing models.
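
The abstract describes a multi-stage pipeline: Sobel-filter preprocessing, SE-CapsNet feature extraction, a three-classifier ensemble (BiGRU, VAE, BiLSTM), and IBWO hyperparameter tuning. The paper's implementation is not reproduced here; the sketch below only illustrates two generic building blocks of such a pipeline under stated assumptions, namely Sobel edge enhancement of a gesture frame and soft-voting fusion of three classifiers' class probabilities. The image size, the 35-class count, and the function names are illustrative placeholders, not the authors' code.

```python
# Illustrative sketch only: Sobel preprocessing + soft-voting ensemble fusion.
# Assumptions (not from the paper): 64x64 grayscale frames, 35 gesture classes,
# and that the three classifiers each expose a per-class probability vector.
import numpy as np
import cv2  # OpenCV provides cv2.Sobel for gradient-based edge enhancement


def sobel_preprocess(frame: np.ndarray) -> np.ndarray:
    """Return a Sobel gradient-magnitude edge map, normalized to [0, 1]."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY) if frame.ndim == 3 else frame
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)  # horizontal gradient
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)  # vertical gradient
    magnitude = np.sqrt(gx ** 2 + gy ** 2)
    return magnitude / (magnitude.max() + 1e-8)


def soft_vote(prob_bigru: np.ndarray,
              prob_vae: np.ndarray,
              prob_bilstm: np.ndarray,
              weights=(1 / 3, 1 / 3, 1 / 3)) -> int:
    """Fuse three per-class probability vectors by weighted averaging
    and return the index of the winning gesture class."""
    stacked = np.stack([prob_bigru, prob_vae, prob_bilstm])
    fused = np.average(stacked, axis=0, weights=weights)
    return int(np.argmax(fused))


if __name__ == "__main__":
    # Dummy 64x64 frame and dummy probabilities for 35 hypothetical ISL classes.
    frame = (np.random.rand(64, 64) * 255).astype(np.uint8)
    edges = sobel_preprocess(frame)
    rng = np.random.default_rng(0)
    probs = [rng.dirichlet(np.ones(35)) for _ in range(3)]
    print("edge map shape:", edges.shape, "predicted class:", soft_vote(*probs))
```

A weighted soft vote is one common way to fuse heterogeneous classifiers; the actual fusion rule and the IBWO-driven hyperparameter search used in the GRHIP-EDLIBWO model may differ.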

Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC12215471 (PMC)
http://dx.doi.org/10.1038/s41598-025-06680-9 (DOI Listing)

Publication Analysis

Top Keywords

hand gesture (20)
gesture recognition (12)
beluga whale (12)
gesture detection (12)
gesture (9)
recognition hearing (8)
hearing impaired (8)
impaired people (8)
people ensemble (8)
ensemble deep (8)

Similar Publications

Tool use is a complex motor planning problem. Prior research suggests that planning to use tools involves resolving competition between different tool-related action representations. We therefore reasoned that competition may also be exacerbated with tools for which the motions of the tool and the hand are incongruent (e.


Deliberate synchronization of speech and gesture: Effects of neurodiversity and development.

Lang Cogn

December 2024

Donders Center for Brain, Cognition, and Behaviour, Radboud University Nijmegen, Heyendaalseweg 135, 6525 AJ Nijmegen, The Netherlands.

The production of speech and gesture is exquisitely temporally coordinated. In autistic individuals, speech-gesture synchrony during spontaneous discourse is disrupted. To evaluate whether this asynchrony reflects motor coordination versus language production processes, the current study examined performed hand movements during speech in youth with autism spectrum disorder (ASD) compared to neurotypical youth.


Speech is the primary form of communication; still, there are people whose hearing or speaking abilities are impaired. Communication presents a significant hurdle for people with such impairments. Sign Languages (SLs) are the natural languages of the Deaf and their primary means of communication.


Gesture encoding in human left precentral gyrus neuronal ensembles.

Commun Biol

August 2025

Robert J. and Nancy D. Carney Institute for Brain Science, Brown University, Providence, RI, USA.

Understanding the cortical activity patterns driving dexterous upper limb motion has the potential to benefit a broad clinical population living with limited mobility through the development of novel brain-computer interface (BCI) technology. The present study examines the activity of ensembles of motor cortical neurons recorded using microelectrode arrays in the dominant hemisphere of two BrainGate clinical trial participants with cervical spinal cord injury as they attempted to perform a set of 48 different hand gestures. Although each participant displayed a unique organization of their respective neural latent spaces, it was possible to achieve classification accuracies of ~70% for all 48 gestures (and ~90% for sets of 10).


This study presents a real-time hand tracking and collision detection system for immersive mixed-reality boxing training on Apple Vision Pro (Apple Inc., Cupertino, CA, USA). Leveraging the device's advanced spatial computing capabilities, this research addresses the limitations of traditional fitness applications, whose visual-only hand tracking lacks the precision required for technique-based sports like boxing.
