Cervical cancer (CC) is the fourth most common malignant tumor among women worldwide. Constructing a high-accuracy deep convolutional neural network (DCNN) for cervical cancer screening and diagnosis is important for successful prevention of the disease. In this work, we propose a robust DCNN for cervical cancer screening using whole-slide images (WSI) of ThinPrep cytologic test (TCT) slides from 211 cervical cancer and 189 normal patients. We used an active learning strategy to improve the efficiency and accuracy of image labeling. The sensitivity, specificity, and accuracy of the best model were 96.21%, 98.95%, and 97.50%, respectively, for CC patient identification. Our results also demonstrate that the active learning strategy is superior to a traditional supervised learning strategy in reducing cost and improving image labeling quality. The related data and source code are freely available at https://github.com/hqyone/cancer_rcnn.
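The reported metrics follow from standard confusion-matrix definitions. A minimal sketch below illustrates the arithmetic; the per-patient counts are hypothetical values chosen only to be consistent with the cohort sizes (211 cancer, 189 normal), since the abstract does not report the raw confusion matrix.

```python
def screening_metrics(tp, fn, tn, fp):
    """Return (sensitivity, specificity, accuracy) from confusion counts.

    tp/fn: cancer patients correctly flagged / missed
    tn/fp: normal patients correctly cleared / falsely flagged
    """
    sensitivity = tp / (tp + fn)                # true-positive rate
    specificity = tn / (tn + fp)                # true-negative rate
    accuracy = (tp + tn) / (tp + fn + tn + fp)  # overall fraction correct
    return sensitivity, specificity, accuracy

# Hypothetical counts: 203 of 211 cancer patients flagged,
# 187 of 189 normal patients cleared.
sens, spec, acc = screening_metrics(tp=203, fn=8, tn=187, fp=2)
print(f"sensitivity={sens:.2%} specificity={spec:.2%} accuracy={acc:.2%}")
```

With these illustrative counts the accuracy works out to exactly 390/400 = 97.50%, matching the reported headline figure; the sensitivity and specificity land near the reported 96.21% and 98.95%.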
Download full-text PDF | Source
---|---
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10034408 | PMC
http://dx.doi.org/10.3389/fbinf.2023.1101667 | DOI Listing
Nat Med
September 2025
Emerging Technology, Research Prioritization and Support Unit, Department of Research for Health, World Health Organization, Geneva, Switzerland.
Clinical trials are essential to advancing cancer control, yet access and participation remain unequal globally. The World Health Organization (WHO) established the International Clinical Trials Registry Platform (ICTRP) to enable a complete view of interventional clinical research for all those involved in healthcare decision-making and to identify actionable goals for equitable participation at the global level. A review of 89,069 global cancer clinical trials registered in the WHO ICTRP between 1999 and December 2022 revealed a cancer clinical trial landscape dominated by high-income countries and focused on pharmacological interventions, with multinational collaboration limited to only 3% of recruiting trials.
Womens Health Issues
September 2025
Tufts University School of Medicine/Tufts Medicine, Boston, Massachusetts. Electronic address:
Background: More than 20% of cervical cancers are diagnosed in women older than 65 years. Guidelines recommend screening exit at age 65 for average-risk patients only if certain criteria are met, yet most women aged 64-66 years in the United States are inadequately screened. In this mixed methods study, we explored clinician knowledge of exit criteria.
Objectives: Cervical cancer is a serious threat to women's life and health and has a high mortality rate. Colposcopy is an important method for early clinical cervical cancer screening, but the traditional vaginal dilator causes discomfort and is cumbersome to operate. This study therefore aims to design an intelligent vaginal dilatation system to automate colposcopy and enhance patient comfort.
JMIR Res Protoc
September 2025
Moores Cancer Center, University of California, San Diego, La Jolla, CA, United States.
Background: Cancer screening nonadherence persists among adults who are deaf, deafblind, and hard of hearing (DDBHH). These barriers span individual, clinician, and health care system levels, contributing to difficulties understanding cancer information, accessing screening services, and following treatment directives. Critical communication barriers include ineffective patient-physician communication, limited access to American Sign Language (ASL) cancer information, misconceptions about medical procedures, insurance navigation difficulties, and intersectional barriers for multiply marginalized individuals.
Oral Oncol
September 2025
Department of Oral Mucosa, Shanghai Stomatological Hospital & School of Stomatology, Fudan University, Shanghai, China; Shanghai Key Laboratory of Craniomaxillofacial Development and Diseases, Fudan University, Shanghai, China. Electronic address: