Background: The formulation of expert opinion guidelines has several sources of bias that may adversely affect their quality. To minimize bias, guideline creators must use rigorous methodology. There has been no appraisal of the methodologic quality of basic critical care echocardiography (BCCE) training/education guidelines.
Research Question: What is the methodologic quality of expert guidelines/recommendations on BCCE training?
Study Design And Methods: The review was performed by a multidisciplinary team including intensive care specialists, a hospital scientist, a trainee, a nurse sonographer, and a public health expert. Four databases (PubMed, OVID-Embase, Clarivate Analytics Web of Science, and Google Scholar) were searched on July 31, 2020, to identify guidelines on BCCE training/education. Each guideline was assessed subjectively for the degree of detail of its recommendations and assessed objectively using the AGREE-II critical appraisal tool for clinical practice guidelines to generate a scaled domain score. A score ≥ 75% in every domain was the cutoff for a guideline to be used without modification.
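For readers unfamiliar with AGREE-II scoring, the sketch below illustrates how a scaled domain score is typically derived and how the ≥ 75% cutoff would be applied. The formula follows the published AGREE-II methodology rather than anything stated in this abstract, and the appraiser counts and item ratings are hypothetical.

# Hypothetical illustration of an AGREE-II scaled domain score (items rated 1-7 by each appraiser)
# and the >= 75% cutoff used in this review. Not taken from the study's data.
def scaled_domain_score(ratings_per_appraiser):
    """ratings_per_appraiser: list of per-appraiser lists of 1-7 item ratings for one domain."""
    n_appraisers = len(ratings_per_appraiser)
    n_items = len(ratings_per_appraiser[0])
    obtained = sum(sum(r) for r in ratings_per_appraiser)
    max_possible = 7 * n_items * n_appraisers
    min_possible = 1 * n_items * n_appraisers
    return 100 * (obtained - min_possible) / (max_possible - min_possible)

# Example: three hypothetical appraisers rating a four-item domain.
ratings = [[6, 7, 5, 6], [5, 6, 6, 7], [7, 6, 5, 6]]
score = scaled_domain_score(ratings)
print(f"Scaled domain score: {score:.1f}% -> "
      f"{'meets the >=75% cutoff' if score >= 75 else 'below the cutoff'}")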
Results: From 4,288 abstracts screened, 24 guidelines met the inclusion criteria. Very few guidelines made clear recommendations regarding introductory courses: physics (n = 6 [25%]), instrumentation (n = 5 [20.8%]), image acquisition theory (n = 6 [25%]), course curriculum (n = 5 [20.8%]), pre-course/post-course tests (n = 1 [4.2%]), minimum course duration (n = 6 [25%]), or trainer qualifications (n = 5 [20.8%]). Very few provided clear recommendations for longitudinal competence programs: clinically indicated scans (n = 8 [33.3%]), logbook (n = 14 [58.3%]), image storage (n = 9 [37.5%]), formative assessment (n = 6 [25%]), minimum scan numbers (n = 14 [58.3%]), image acquisition competence (n = 3 [12.5%]), image interpretation competence (n = 2 [8.3%]), and credentialing/certification (n = 3 [12.5%]). Five guidelines (20.8%) attained a scaled overall AGREE-II score ≥ 75%. One guideline (4.2%) attained scores ≥ 75% in every domain.
Interpretation: The methodologic appraisal of BCCE-training guidelines showed widespread deficiencies in guideline formulation processes. The impact of these deficiencies on the validity of the recommendations requires further evaluation in longitudinal studies.
DOI: http://dx.doi.org/10.1016/j.chest.2021.02.020