Category Ranking: 98% | Total Visits: 921 | Avg Visit Duration: 2 minutes | Citations: 20

Article Abstract

Meniscoplasty is a common surgical procedure for treating meniscus tears. The operation poses key challenges, including a limited visual field, a narrow operating space, and difficulty in controlling the resection range. This study therefore developed an arthroscopic robotic system capable of autonomous meniscus resection to achieve better surgical outcomes. To address the limited visual field, the study used preoperative and intraoperative meniscus point clouds for surgical navigation and proposed a novel cross-modal point cloud registration framework. After registration, the robotic system automatically generated a resection path that preserves the crescent shape of the remaining meniscus, based on an improved Rapidly Exploring Random Tree (RRT) path-planning algorithm. Meanwhile, a Remote Center of Motion (RCM) constraint was introduced during robot motion to enhance safety. In this study, the mean squared error of the preoperative-intraoperative meniscus point cloud registration was only 0.1964 mm, which meets surgical accuracy requirements. Experiments validated the robot's autonomous operation capabilities: the robot successfully completed motion planning and autonomous execution, demonstrating the reliability of the robotic system.
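The abstract does not detail the cross-modal registration framework, but the rigid-alignment step at the core of any point cloud registration (and the MSE metric reported above) can be illustrated with a minimal Kabsch/SVD sketch. This is a generic illustration under assumed names, not the authors' method:

```python
import numpy as np

def rigid_align(source: np.ndarray, target: np.ndarray):
    """Estimate the rigid transform (R, t) mapping paired points
    `source` onto `target` (both shape (N, 3)) via the Kabsch/SVD method."""
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

def registration_mse(source, target, R, t):
    """Mean squared error (in squared input units) after alignment."""
    aligned = source @ R.T + t
    return float(np.mean(np.sum((aligned - target) ** 2, axis=1)))
```

In the noise-free case the recovered transform is exact; with real preoperative/intraoperative clouds the residual MSE quantifies registration accuracy, as in the figure quoted above.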


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC12109240
DOI: http://dx.doi.org/10.3390/bioengineering12050539

Publication Analysis

Top Keywords

robotic system (16); point cloud (12); arthroscopic robotic (8); autonomous operation (8); limited visual (8); meniscus point (8); cloud registration (8); meniscus (5); system (4); system meniscoplasty (4)

Similar Publications

This study was conducted to investigate the techniques and complications of enlarged uterine extraction during minimally invasive surgery for uterine malignancy. The electronic medical record was queried for patients with uterine malignancy and an enlarged uterus (≥ 250 g) who underwent primary hysterectomy with a laparoscopic or robotic approach. Statistical analysis was performed using Fisher's exact test for categorical variables and the Kruskal-Wallis test for continuous variables.


A soft micron accuracy robot design and clinical validation for retinal surgery.

Microsyst Nanoeng

September 2025

Department of Ophthalmology, Key Laboratory of Precision Medicine for Eye Diseases of Zhejiang Province, Center for Rehabilitation Medicine, Zhejiang Provincial People's Hospital (Affiliated People's Hospital, Hangzhou Medical College), Hangzhou, 314408, China.

Retinal surgery is one of the most delicate and complex operations, approaching or even exceeding the physiological limits of the human hand. Robots have demonstrated the ability to filter hand tremor and apply motion scaling, which holds promise for microsurgery. Here, we present a novel soft micron accuracy robot (SMAR) for retinal surgery and achieve a more precise and safer operation.
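Tremor filtering and motion scaling can be sketched as a first-order low-pass filter combined with a constant scale factor. This is a generic illustration of the concept, not the SMAR controller; all names and parameter values are assumed:

```python
import numpy as np

def scale_and_filter(hand_motion: np.ndarray, scale: float = 0.1,
                     alpha: float = 0.2) -> np.ndarray:
    """Down-scale surgeon hand displacements and suppress tremor with a
    first-order low-pass (exponential moving average) filter.
    hand_motion: (N, 3) incremental displacements; scale: motion-scaling
    factor; alpha: filter coefficient (smaller = stronger smoothing)."""
    out = np.zeros_like(hand_motion, dtype=float)
    state = np.zeros(hand_motion.shape[1])
    for i, step in enumerate(hand_motion):
        state = alpha * step + (1.0 - alpha) * state  # low-pass update
        out[i] = scale * state                        # motion scaling
    return out
```

A steady hand motion passes through attenuated only by the scale factor, while high-frequency tremor is additionally suppressed by the filter.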


For space missions such as extraterrestrial sample collection, robotic rover exploration, and astronaut landings, the complex terrain and diverse gravitational environments make ground-based micro-low-gravity experimental systems essential for testing and validating spacecraft performance as well as supporting astronaut training. The suspended gravity unloading (SGO) system is a key device commonly used to simulate micro-low-gravity environments. However, the SGO system faces challenges due to model uncertainty and external disturbances, which limit improvements in control accuracy.


Single camera estimation of microswimmer depth with a convolutional network.

J R Soc Interface

September 2025

Institute of Intelligent Systems and Robotics, Sorbonne Université, Paris, Île-de-France, France.

A number of techniques have been developed to measure the three-dimensional trajectories of protists, which require special experimental set-ups, such as a pair of orthogonal cameras. On the other hand, machine learning techniques have been used to estimate the vertical position of spherical particles from the defocus pattern, but they require the acquisition of a labelled dataset with finely spaced vertical positions. Here, we describe a simple way to make a dataset of images labelled with vertical position from a single 5 min movie, based on a tilted slide set-up.
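The tilted-slide labelling trick reduces to simple geometry: with the slide inclined at a known angle, an object's vertical position grows linearly with its horizontal position in the image. A hedged sketch of that labelling step, with function name, units, and parameters chosen for illustration rather than taken from the paper:

```python
import numpy as np

def depth_labels(x_positions_um: np.ndarray, tilt_deg: float,
                 x_ref_um: float = 0.0) -> np.ndarray:
    """Assign a vertical-position label to each detection on a tilted slide.
    Because the slide is inclined at tilt_deg, an object's depth grows
    linearly with its horizontal distance from the reference line x_ref_um:
        z = (x - x_ref) * tan(tilt)."""
    tilt = np.deg2rad(tilt_deg)
    return (np.asarray(x_positions_um, dtype=float) - x_ref_um) * np.tan(tilt)
```

Every frame of the movie then yields many (defocus image, depth) pairs without any mechanical z-scanning.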
