
Session-Guided Attention in Continuous Learning With Few Samples. | LitMetric

Category Ranking: 98%
Total Visits: 921
Avg Visit Duration: 2 minutes
Citations: 20

Article Abstract

Few-shot class-incremental learning (FSCIL) aims to learn from a sequence of incremental data sessions with a limited number of samples in each class. The main issues it encounters are the risk of forgetting previously learned data when introducing new data classes, as well as being unable to adapt the old model to new data due to limited training samples. Existing state-of-the-art solutions normally utilize pre-trained models with fixed backbone parameters to avoid forgetting old knowledge. While this strategy preserves previously learned features, the fixed nature of the backbone limits the model's ability to learn optimal representations for unseen classes, which compromises performance on new class increments. In this paper, we propose a novel SEssion-Guided Attention framework (SEGA) to tackle this challenge. SEGA exploits the class relationships within each incremental session by assessing how test samples relate to class prototypes. This allows accurate incremental session identification for test data, leading to more precise classification. In addition, an attention module is introduced for each incremental session to further refine the features extracted by the fixed backbone. Once the session of a test image is determined, its feature is refined with the corresponding attention module to better cluster the sample within the selected session. Our approach adopts the fixed backbone strategy to avoid forgetting old knowledge while still adapting to novel data. Experimental results on three FSCIL datasets consistently demonstrate the superior adaptability of the proposed SEGA framework in FSCIL tasks. The code is available at: https://github.com/zichengpan/SEGA.
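The inference pipeline the abstract describes (identify the test sample's session via prototype similarity, then refine the fixed-backbone feature with that session's attention module before nearest-prototype classification) can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name `sega_predict`, the use of a simple channel-wise attention vector per session, and cosine-similarity session selection are all assumptions; the actual method is in the linked GitHub repository.

```python
import numpy as np

def l2_normalize(x, axis=-1, eps=1e-12):
    # Normalize vectors so that dot products become cosine similarities.
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

def sega_predict(feature, session_prototypes, session_attention):
    """Sketch of session-guided inference (hypothetical API).

    feature:            1-D backbone feature of the test sample.
    session_prototypes: list of (classes_in_session, dim) prototype arrays,
                        one array per incremental session.
    session_attention:  list of per-session attention vectors (dim,);
                        a stand-in for the paper's attention modules.
    Returns (session index, class index within that session).
    """
    f = l2_normalize(feature)
    # Session identification: choose the session whose best-matching
    # class prototype is most similar to the raw backbone feature.
    best_sims = [float((l2_normalize(p) @ f).max()) for p in session_prototypes]
    s = int(np.argmax(best_sims))
    # Refine the fixed-backbone feature with the selected session's
    # attention weights (here: simple channel-wise reweighting).
    refined = l2_normalize(feature * session_attention[s])
    # Classify within the selected session by nearest prototype.
    sims = l2_normalize(session_prototypes[s]) @ refined
    return s, int(np.argmax(sims))
```

Because the backbone feature itself is never updated, old-session prototypes remain valid across increments; only the lightweight per-session attention adapts to new data, which is the trade-off the abstract highlights.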


Source
http://dx.doi.org/10.1109/TIP.2025.3559463 (DOI Listing)

Publication Analysis

Top Keywords

fixed backbone: 12
incremental session: 12
session-guided attention: 8
avoid forgetting: 8
forgetting knowledge: 8
attention module: 8
data: 6
session: 5
attention continuous: 4
continuous learning: 4
