GRN+: a simplified generative reinforcement network for tissue layer analysis in 3D ultrasound images for chronic low-back pain.

Category Ranking: 98%
Total Visits: 921
Avg Visit Duration: 2 minutes
Citations: 20

Article Abstract

Purpose: 3D ultrasound delivers high-resolution, real-time images of soft tissues, which are essential for pain research. However, manually distinguishing various tissues for quantitative analysis is labor-intensive. We aimed to automate multilayer segmentation in 3D ultrasound volumes using minimal annotated data by developing generative reinforcement network plus (GRN+), a semi-supervised multi-model framework.

Approach: GRN+ integrates a ResNet-based generator and a U-Net segmentation model. Through a method called segmentation-guided enhancement (SGE), the generator produces new images under the guidance of the segmentation model, with the generator's weights adjusted according to the segmentation loss gradient. To prevent gradient explosion and ensure stable training, a two-stage backpropagation strategy was implemented: the first stage propagates the segmentation loss through both the generator and the segmentation model, whereas the second stage optimizes the segmentation model alone, refining mask prediction using the generated images.
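
For illustration, the two-stage update can be sketched in PyTorch-style code. This is not the authors' implementation: the module and optimizer names (generator, segmenter, gen_opt, seg_opt) and the use of a soft Dice loss as the "segmentation loss" are assumptions made for the example.

    # Hypothetical sketch of segmentation-guided enhancement (SGE) with the
    # two-stage backpropagation described above; not the published GRN+ code.
    import torch

    def soft_dice_loss(pred, target, eps=1e-6):
        # Soft Dice loss over (N, C, D, H, W) probability maps; assumed here as
        # the "segmentation loss" -- the abstract does not name the exact loss.
        inter = (pred * target).sum(dim=(2, 3, 4))
        union = pred.sum(dim=(2, 3, 4)) + target.sum(dim=(2, 3, 4))
        return 1.0 - ((2.0 * inter + eps) / (union + eps)).mean()

    def sge_step(generator, segmenter, gen_opt, seg_opt, volume, mask):
        # Stage 1: the segmentation loss is backpropagated through BOTH the
        # ResNet-based generator and the U-Net segmenter, so the generator's
        # weights are adjusted by the segmentation loss gradient.
        gen_opt.zero_grad()
        seg_opt.zero_grad()
        enhanced = generator(volume)
        stage1_loss = soft_dice_loss(torch.sigmoid(segmenter(enhanced)), mask)
        stage1_loss.backward()
        gen_opt.step()
        seg_opt.step()

        # Stage 2: the generated image is produced without tracking generator
        # gradients, and only the segmenter is optimized, which keeps training
        # stable and refines mask prediction on the generated images.
        seg_opt.zero_grad()
        with torch.no_grad():
            enhanced = generator(volume)
        stage2_loss = soft_dice_loss(torch.sigmoid(segmenter(enhanced)), mask)
        stage2_loss.backward()
        seg_opt.step()
        return stage1_loss.item(), stage2_loss.item()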

Results: Tested on 69 fully annotated 3D ultrasound scans from 29 subjects with six manually labeled tissue layers, GRN+ outperformed all other semi-supervised methods in terms of the Dice coefficient using only 5% labeled data, despite not using unlabeled data for unsupervised training. In addition, when applied to fully annotated datasets, GRN+ with SGE achieved a 2.16% higher Dice coefficient while incurring lower computational costs compared to other models.
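
The Dice coefficient reported above is the standard overlap measure 2|A∩B| / (|A| + |B|). The short NumPy snippet below is an illustrative sketch of how it is typically computed for a pair of binary masks; it is not taken from the paper's evaluation code.

    # Illustrative Dice coefficient for binary masks (not GRN+ evaluation code).
    import numpy as np

    def dice_coefficient(pred, target):
        pred = pred.astype(bool)
        target = target.astype(bool)
        denom = pred.sum() + target.sum()
        if denom == 0:
            return 1.0  # both masks empty: treat as perfect agreement
        return 2.0 * np.logical_and(pred, target).sum() / denom

    # Two overlapping toy 3D masks: Dice = 2*16 / (32 + 32) = 0.5
    a = np.zeros((4, 4, 4), dtype=bool); a[:2] = True
    b = np.zeros((4, 4, 4), dtype=bool); b[1:3] = True
    print(dice_coefficient(a, b))  # 0.5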

Conclusions: GRN+ provides accurate tissue segmentation while reducing both computational expense and dependence on extensive annotations, making it an effective tool for 3D ultrasound analysis in patients with chronic low-back pain.

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC12310559
DOI: http://dx.doi.org/10.1117/1.JMI.12.4.044001

Publication Analysis

Top Keywords

segmentation model: 16
generative reinforcement: 8
reinforcement network: 8
segmentation: 8
segmentation loss: 8
fully annotated: 8
dice coefficient: 8
grn+: 6
ultrasound: 5
grn+ simplified: 4
