A Powered Prosthetic Hand With Vision System for Enhancing the Anthropopathic Grasp

Category Ranking: 98%
Total Visits: 921
Avg Visit Duration: 2 minutes
Citations: 20

Article Abstract

The anthropomorphic grasping capability of prosthetic hands is critical for enhancing user experience and functional efficiency. Existing prosthetic hands relying on brain-computer interfaces (BCI) and electromyography (EMG) face limitations in achieving natural grasping due to insufficient gesture adaptability and intent recognition. While vision systems enhance object perception, they lack dynamic human-like gesture control during grasping. To address these challenges, we propose a vision-powered prosthetic hand system that integrates two innovations. Spatial Geometry-based Gesture Mapping (SG-GM) dynamically models finger joint angles as polynomial functions of hand-object distance, derived from geometric features of human grasping sequences. These functions enable continuous anthropomorphic gesture transitions, mimicking natural hand movements. Motion Trajectory Regression-based Grasping Intent Estimation (MTR-GIE) predicts user intent in multi-object environments by regressing wrist trajectories and spatially segmenting candidate objects. Experiments with eight daily objects demonstrated high anthropomorphism (similarity coefficient $R^{2} = 0.911$, root mean squared error $\mathrm{RMSE} = 2.47^{\circ}$), rapid execution ($3.07 \pm 0.41$ s), and robust success rates (95.43% single-object; 88.75% multi-object). The MTR-GIE achieved 94.35% intent estimation accuracy under varying object spacing. This work pioneers vision-driven dynamic gesture synthesis for prosthetics, eliminating dependency on invasive sensors and advancing real-world usability.
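Both components reduce to familiar regression machinery, so a compact sketch may help. The Python snippet below is a minimal illustration of the two ideas as the abstract describes them, not the authors' implementation: the function names (fit_gesture_mapping, estimate_intent), the polynomial degrees, and the synthetic data are all assumptions made here for demonstration.

```python
import numpy as np

# Hedged sketch of SG-GM and MTR-GIE as described in the abstract.
# Names, polynomial degrees, and data are illustrative, not the paper's code.

def fit_gesture_mapping(distances, joint_angles, degree=3):
    """SG-GM idea: fit one polynomial per finger joint so the joint angle
    becomes a continuous function of hand-object distance, learned from a
    recorded human grasping sequence.

    distances:    (N,) hand-object distances along one approach
    joint_angles: (N, J) corresponding joint angles in degrees
    """
    return [np.polynomial.Polynomial.fit(distances, joint_angles[:, j], degree)
            for j in range(joint_angles.shape[1])]

def gesture_at(polys, d):
    """Evaluate all joint angles at distance d: a smooth, anthropomorphic
    gesture transition as the hand closes in on the object."""
    return np.array([p(d) for p in polys])

def estimate_intent(wrist_xy, objects_xy, steps_ahead=10):
    """MTR-GIE idea: regress the recent wrist trajectory, extrapolate it
    forward, and pick the nearest candidate object as the intended target.

    wrist_xy:   (T, 2) recent wrist positions
    objects_xy: (K, 2) candidate object positions
    """
    t = np.arange(len(wrist_xy))
    # One quadratic per coordinate; extrapolate `steps_ahead` samples forward.
    polys = [np.polynomial.Polynomial.fit(t, wrist_xy[:, k], 2) for k in range(2)]
    future = np.array([p(len(wrist_xy) - 1 + steps_ahead) for p in polys])
    return int(np.argmin(np.linalg.norm(objects_xy - future, axis=1)))

# --- toy demonstration with synthetic data ---
rng = np.random.default_rng(0)
d = np.linspace(0.30, 0.02, 50)                          # approach: 30 cm -> 2 cm
flex = 80.0 * (1 - d / 0.30)[:, None] * np.ones((1, 4))  # 4 joints flex as hand nears
polys = fit_gesture_mapping(d, flex + rng.normal(0, 1.0, flex.shape))
print("joint angles at 10 cm:", gesture_at(polys, 0.10))

wrist = np.stack([np.linspace(0, 0.5, 20), np.linspace(0, 0.2, 20)], axis=1)
objects = np.array([[0.9, 0.4], [0.9, -0.2], [0.2, 0.8]])
print("predicted target object:", estimate_intent(wrist, objects))
```

The structure mirrors the abstract's division of labor: SG-GM turns a single scalar (hand-object distance) into a full joint configuration for continuous gesture transitions, while MTR-GIE reduces intent estimation to nearest-object selection against an extrapolated wrist path.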

Source
DOI: http://dx.doi.org/10.1109/TNSRE.2025.3567392

Publication Analysis

Top Keywords

prosthetic hand: 8
prosthetic hands: 8
intent estimation: 8
grasping: 5
gesture: 5
powered prosthetic: 4
hand vision: 4
vision system: 4
system enhancing: 4
enhancing anthropopathic: 4
