Continual Learning (CL) is recognized as a storage-efficient and privacy-preserving approach for learning from sequentially arriving medical sites. However, most existing CL methods assume that each site is fully labeled, which is impractical due to budget and expertise constraints. This paper studies Semi-Supervised Continual Learning (SSCL), in which partially labeled sites arrive over time, each delivering only limited labeled data while the majority remains unlabeled. In this setting, it is challenging to effectively utilize unlabeled data under dynamic cross-site domain gaps, leading to intractable model forgetting on such unlabeled data. To address this problem, we introduce a novel Disentanglement-guided Consolidation and Consistency Training (DC2T) framework, which is rooted in an Online Semi-Supervised representation Disentanglement (OSSD) perspective that extracts content representations of partially labeled data from sites arriving over time. These content representations are then consolidated for site invariance and calibrated for style robustness, in order to alleviate forgetting even in the absence of ground truth. Specifically, for invariance across previous sites, we retain historical content representations when learning on a new site via a Content-inspired Parameter Consolidation (CPC) method that prevents altering the model parameters crucial for content preservation. For robustness against style variation, we develop a Style-induced Consistency Training (SCT) scheme that enforces segmentation consistency under style-related perturbations to recalibrate content encoding. We extensively evaluate our method on fundus and cardiac image segmentation, demonstrating its advantage over existing SSCL methods in alleviating forgetting on unlabeled data.
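The abstract names two mechanisms, CPC and SCT, but the paper's implementation is not reproduced on this page. As a rough, hypothetical illustration only, the PyTorch sketch below pairs an EWC-style quadratic parameter penalty (one plausible reading of "parameter consolidation") with a prediction-consistency loss under a toy style perturbation (one plausible reading of "style-induced consistency"). Every function name, the perturbation form, and the loss weighting here are assumptions for illustration, not the authors' method.

```python
# Hypothetical sketch: NOT the DC2T implementation, only an illustration of
# the two loss terms the abstract describes (consolidation + consistency).
import torch
import torch.nn.functional as F

def consolidation_penalty(model, old_params, importance, lam=1.0):
    """EWC-style quadratic penalty: discourage changing parameters deemed
    important on previous sites. `old_params` and `importance` are dicts of
    tensors snapshotted after training on the prior site (assumed inputs)."""
    loss = torch.zeros((), device=next(model.parameters()).device)
    for name, p in model.named_parameters():
        if name in importance:
            loss = loss + (importance[name] * (p - old_params[name]) ** 2).sum()
    return lam * loss

def style_perturb(x, alpha=0.3):
    """Toy style perturbation: randomly rescale/shift per-channel image
    statistics, standing in for the style-related perturbations the paper
    refers to (the actual perturbation in the paper may differ)."""
    mu = x.mean(dim=(2, 3), keepdim=True)
    sigma = x.std(dim=(2, 3), keepdim=True) + 1e-6
    scale = 1.0 + alpha * (2.0 * torch.rand_like(mu) - 1.0)
    shift = alpha * (2.0 * torch.rand_like(mu) - 1.0) * sigma
    return (x - mu) / sigma * (sigma * scale) + (mu + shift)

def consistency_loss(model, x_unlabeled):
    """Enforce that segmentation predictions agree between a clean and a
    style-perturbed view of the same unlabeled batch."""
    with torch.no_grad():
        p_clean = F.softmax(model(x_unlabeled), dim=1)
    p_pert = F.softmax(model(style_perturb(x_unlabeled)), dim=1)
    return F.mse_loss(p_pert, p_clean)
```

In a training loop, these two terms would typically be added to a supervised segmentation loss computed on the small labeled fraction of the current site, with the consolidation weight controlling the stability/plasticity trade-off across sites.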
DOI: http://dx.doi.org/10.1109/TMI.2024.3469528