Category Ranking: 98%
Total Visits: 921
Avg Visit Duration: 2 minutes
Citations: 20

Article Abstract

Background: Accurate segmentation of gastric tumors from CT scans provides useful image information for guiding the diagnosis and treatment of gastric cancer. However, automated gastric tumor segmentation from 3D CT images faces several challenges. The large variation in anisotropic spatial resolution limits the ability of 3D convolutional neural networks (CNNs) to learn features from different views. The background texture of gastric tumors is complex, and their size, shape and intensity distribution are highly variable, which makes it difficult for deep learning methods to capture tumor boundaries. In particular, while multi-center datasets increase sample size and representativeness, they suffer from inter-center heterogeneity.

Methods: In this study, we propose a new cross-center 3D tumor segmentation method named Hierarchical Class-Aware Domain Adaptive Network (HCA-DAN). It combines a new 3D neural network, AsTr, which efficiently bridges an anisotropic convolutional neural network and a Transformer to extract multi-scale context features from CT images with anisotropic resolution, with a hierarchical class-aware domain alignment (HCADA) module that adaptively aligns these multi-scale context features across two domains by integrating a class attention map with class-specific information. We evaluate the proposed method on an in-house CT image dataset collected from four medical centers and validate its segmentation performance in both in-center and cross-center test scenarios.
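
The following is a minimal, illustrative sketch (in PyTorch, not the authors' released code) of the class-aware alignment idea described above: a class attention map derived from per-voxel class predictions re-weights encoder features, and the pooled, class-weighted features pass through a gradient-reversal layer into a small domain classifier so that source and target features become harder to distinguish. All module names, tensor shapes, and hyperparameters here are assumptions for illustration.

```python
import torch
import torch.nn as nn


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; reverses (and scales) gradients backward."""

    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_out):
        return -ctx.lam * grad_out, None


class ClassAwareAligner(nn.Module):
    """Pools features under each class's attention map and classifies the domain."""

    def __init__(self, channels: int, num_classes: int, lam: float = 1.0):
        super().__init__()
        self.lam = lam
        self.domain_head = nn.Sequential(
            nn.Linear(channels * num_classes, 64),
            nn.ReLU(),
            nn.Linear(64, 2),  # source vs. target logits
        )

    def forward(self, feats: torch.Tensor, class_logits: torch.Tensor) -> torch.Tensor:
        # feats:        (B, C, D, H, W) context features from the segmentation encoder
        # class_logits: (B, K, D, H, W) per-voxel class predictions (K classes)
        attn = torch.softmax(class_logits, dim=1)                  # class attention map
        per_class = torch.einsum("bcdhw,bkdhw->bkc", feats, attn)  # class-weighted pooling
        per_class = per_class / (attn.sum(dim=(2, 3, 4)).unsqueeze(-1) + 1e-6)
        flat = GradReverse.apply(per_class.flatten(1), self.lam)   # adversarial to encoder
        return self.domain_head(flat)                              # (B, 2) domain logits


# Usage sketch: train with cross-entropy on domain labels (0 = source, 1 = target)
# alongside the usual segmentation loss on the labeled source domain.
aligner = ClassAwareAligner(channels=32, num_classes=2)
feats = torch.randn(1, 32, 16, 64, 64)
class_logits = torch.randn(1, 2, 16, 64, 64)
domain_logits = aligner(feats, class_logits)  # shape (1, 2)
```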

Results: Our baseline segmentation network (i.e., AsTr) achieves the best results among the compared 3D segmentation models, with a mean Dice similarity coefficient (DSC) of 59.26%, 55.97%, 48.83% and 67.28% in four in-center test tasks, and a DSC of 56.42%, 55.94%, 46.54% and 60.62% in four cross-center test tasks. In addition, the proposed cross-center segmentation network (i.e., HCA-DAN) outperforms other unsupervised domain adaptation methods, with a DSC of 58.36%, 56.72%, 49.25%, and 62.20% in the four cross-center test tasks.
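
For reference, the Dice similarity coefficient reported above measures the overlap between a predicted mask A and a ground-truth mask B as DSC = 2|A ∩ B| / (|A| + |B|). A minimal NumPy sketch (illustrative, not the authors' evaluation script) follows.

```python
import numpy as np


def dice_coefficient(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """DSC between two binary masks of the same shape, in [0, 1]."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)


# Example: if each mask covers 100 voxels and 50 of them coincide,
# DSC = 2 * 50 / (100 + 100) = 0.5; a reported 59.26% corresponds to ~0.5926.
```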

Conclusions: Comprehensive experimental results demonstrate that the proposed method outperforms the compared methods on this multi-center database and shows promise for routine clinical workflows.

Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11107051 (PMC)
http://dx.doi.org/10.1186/s40644-024-00711-w (DOI Listing)

Publication Analysis

Top Keywords

hierarchical class-aware: 12
class-aware domain: 12
gastric tumor: 12
tumor segmentation: 12
cross-center test: 12
domain adaptive: 8
adaptive network: 8
segmentation: 8
segmentation images: 8
network hca-dan: 8

Similar Publications

In the Imbalanced Multivariate Time Series Classification (ImMTSC) task, minority-class instances typically correspond to critical events, such as system faults in power grids or abnormal health occurrences in medical monitoring. Despite being rare and random, these events are highly significant. The dynamic spatial-temporal relationships between minority-class instances and other instances make them more prone to interference from neighboring instances during classification.

In optical remote sensing image object detection, discontinuous boundaries often limit detection accuracy, particularly at high Intersection over Union (IoU) thresholds. This paper addresses this issue by proposing the Spatial Adaptive Angle-Aware (SA3) Network. The SA3 Network employs a hierarchical refinement approach, consisting of coarse regression, fine regression, and precise tuning, to optimize the angle parameters of rotated bounding boxes.
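
As context for the IoU thresholds mentioned above, here is a minimal sketch of the standard axis-aligned Intersection over Union (illustrative only; the rotated boxes refined by SA3 require polygon intersection, which is not shown here).

```python
def iou(box_a, box_b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2) with x1 < x2, y1 < y2."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


# Example: iou((0, 0, 2, 2), (1, 1, 3, 3)) == 1/7 ≈ 0.143
```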

Article Synopsis
  • Accurate segmentation of gastric tumors from CT scans is crucial for effective diagnosis and treatment of gastric cancer, but it faces challenges like varying resolution and complex tumor characteristics.
  • The study introduces a new segmentation method called Hierarchical Class-Aware Domain Adaptive Network (HCA-DAN), which combines a 3D neural network and a Transformer to effectively extract features from 3D CT images while addressing cross-center data variations.
  • Results show that AsTr achieves the best mean Dice similarity coefficients among the compared 3D segmentation models, and that HCA-DAN outperforms other unsupervised domain adaptation methods in cross-center tests, indicating promising performance in accurately identifying gastric tumors.