Introduction: The Royal Australian and New Zealand College of Radiologists (RANZCR) has developed a reporting template to assist in the categorisation of COVID-19 findings and the level of COVID-19 infection on chest X-ray (CXR) images. Whilst CXRs are reported by radiologists, radiographers are often the first to assess them and have the potential to support immediate triaging of patients with COVID-19. However, inter-reader concordance in the use of this reporting template remains underexplored.
Methods: Seventy CXR examinations spanning the four categories of the RANZCR CXR COVID-19 reporting template were used for the study: 'typical' (for COVID-19) (n = 30), 'indeterminate' (for COVID-19) (n = 20), 'other diagnoses favoured' (n = 10) and 'normal' (n = 10). These images were independently categorised using the RANZCR reporting template by three cohorts of readers: 12 radiologists, 13 registered radiographers and 12 final-year radiography students. A weighted kappa (κ) was used to evaluate inter-reader agreement within and between the cohorts of readers.
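As a brief illustration of the agreement statistic referred to above, the sketch below computes a weighted kappa between two hypothetical readers' template assignments using scikit-learn. The example ratings, the ordinal coding of the four categories and the choice of linear weights are assumptions for demonstration only; the abstract does not specify the weighting scheme or software used in the study.

```python
# Minimal sketch: weighted kappa between two hypothetical readers.
# The ratings, category coding and linear weighting are illustrative
# assumptions, not values taken from the study.
from sklearn.metrics import cohen_kappa_score

# Hypothetical ordinal coding of the RANZCR template categories:
# 0 = normal, 1 = other diagnoses favoured, 2 = indeterminate, 3 = typical
reader_a = [3, 3, 2, 1, 0, 3, 2, 2, 1, 0]
reader_b = [3, 2, 2, 1, 0, 3, 3, 2, 0, 0]

# Weighted kappa penalises disagreements in proportion to how far apart
# the assigned categories are on the ordinal scale.
kappa = cohen_kappa_score(reader_a, reader_b, weights="linear")
print(f"Weighted kappa: {kappa:.2f}")
```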
Results: Radiologists demonstrated fair (κ = 0.32) to substantial (κ = 0.77) inter-reader agreement, and their overall inter-reader agreement was moderate (κ = 0.56). Registered radiographers demonstrated no (κ = -0.01) to moderate (κ = 0.59) agreement, and their overall agreement was fair (κ = 0.31). Final-year radiography students demonstrated slight (κ = 0.004) to substantial (κ = 0.80) agreement, with moderate (κ = 0.47) overall agreement.
Conclusion: There are wide variations in the classification of CXRs using the RANZCR reporting template. Overall, radiologists exhibit superior concordance in CXR categorisation using the COVID-19 reporting template. Radiographers demonstrate wide variability, highlighting the need for enhanced education and training to standardise the triaging of patients undergoing CXR imaging for COVID-19 symptoms.
DOI: http://dx.doi.org/10.1016/j.jmir.2025.101911