In cancer pathology diagnosis, analysis of Whole Slide Images (WSIs) faces challenges such as invalid data, tissue features that vary across magnifications, and numerous hard samples. Multiple Instance Learning (MIL) is a powerful tool for weakly supervised classification in WSI-based pathology diagnosis, but existing MIL frameworks cannot tackle all of these issues simultaneously. To address these challenges, we propose an integrated recognition framework comprising three complementary components: a preprocessing selection method, an Efficient Feature Pyramid Network (EFPN) model for multi-instance learning, and a Similarity Focal Loss. The preprocessing selection method accurately identifies representative image patches, reducing interference from invalid data and improving the efficiency of subsequent model training. The EFPN model, inspired by pathologists' diagnostic process, captures tissue features at different scales by constructing a multi-scale feature pyramid, strengthening the model's ability to recognize tumor tissue. The Similarity Focal Loss further improves discriminative power and generalization by focusing training on hard samples and emphasizing classification-boundary information. Test accuracy for binary tumor classification reached 93.58% on CAMELYON16 and 84.74% and 99.91% on two private datasets, respectively, outperforming existing techniques in each case.
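The abstract does not give the exact formulation of the Similarity Focal Loss. For orientation only, the sketch below is a generic focal-loss-style binary objective in PyTorch that down-weights easy examples so training concentrates on hard samples; it is not the authors' loss, and the hyperparameters `gamma` and `alpha` are the standard focal-loss parameters, assumed here for illustration.

```python
import torch
import torch.nn.functional as F

def focal_style_loss(logits: torch.Tensor,
                     targets: torch.Tensor,
                     gamma: float = 2.0,
                     alpha: float = 0.25) -> torch.Tensor:
    """Generic focal-loss-style binary objective (sketch).

    Not the paper's Similarity Focal Loss; the abstract does not specify
    its formula. This only illustrates the general idea of emphasizing
    hard samples near the classification boundary.
    """
    # Per-instance binary cross-entropy, kept unreduced so it can be reweighted.
    bce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    # p_t: probability the model assigns to the true class of each instance.
    probs = torch.sigmoid(logits)
    p_t = probs * targets + (1.0 - probs) * (1.0 - targets)
    # Class-balancing weight and the (1 - p_t)^gamma modulating factor,
    # which suppresses the contribution of easy, well-classified samples.
    alpha_t = alpha * targets + (1.0 - alpha) * (1.0 - targets)
    return (alpha_t * (1.0 - p_t) ** gamma * bce).mean()
```

In a MIL setting such as the one described above, a loss of this shape would typically be applied to bag-level (slide-level) predictions, so that gradients concentrate on ambiguous slides rather than on the many easily classified ones.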
Download full-text PDF | Source
---|---
http://dx.doi.org/10.1007/s11517-025-03341-x | DOI Listing