Weber Ankle Fracture Classification System Yields Greatest Interobserver and Intraobserver Reliability Over AO/OTA and Lauge-Hansen Classification Systems Under Time Constraints in an Asian Population

Category Ranking: 98%
Total Visits: 921
Avg Visit Duration: 2 minutes
Citations: 20

Article Abstract

No previous studies have evaluated the intra- and interobserver reliability between the Weber, Lauge-Hansen, and AO Foundation/Orthopaedic Trauma Association (AO/OTA) classification systems under time constraints. This study compares the interobserver and intraobserver reliability of the aforementioned classification systems under simulated time constraints. Anteroposterior and lateral radiographs of ankle malleolar fractures from 80 consecutive patients from 2015 to 2016 were classified by 2 independent observers according to Weber, Lauge-Hansen, and AO/OTA. Classifications were conducted over 4 successive weeks under timed (25 seconds) and untimed conditions, with 1-week gaps between each classification. Cohen's kappa and percentage agreement were calculated. Cohen's kappa for interobserver agreement ranged from 0.67 to 0.67 and 0.59 to 0.73 for untimed and timed classifications for Weber; 0.38 to 0.47 and 0.44 to 0.50 for Lauge-Hansen; 0.28 to 0.49 and 0.13 to 0.37 for AO/OTA. Intraobserver agreement ranged from 0.83 to 0.85 and 0.78 to 0.79 for untimed and timed classifications for Weber; 0.46 to 0.65 and 0.59 to 0.73 for Lauge-Hansen; 0.42 to 0.63 and 0.40 to 0.51 for AO/OTA. Based on Landis and Koch's benchmark scale, there was substantial agreement in the inter- and intraobserver variables for Weber; moderate agreement in inter- and intraobserver variables for Lauge-Hansen; fair and moderate agreement in inter- and intraobserver variables respectively for AO/OTA. Interobserver and intraobserver reliability was greatest for Weber, followed by Lauge-Hansen and AO/OTA. Time constraint did not have a statistically significant effect on the reliability of classifications. We recommend concurrent use of the Weber and Lauge-Hansen systems, since they demonstrate the greatest reliability and reproducibility, and confer better understanding of the fracture type, respectively.
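The Cohen's kappa statistic used throughout the abstract measures agreement between two observers beyond what chance alone would produce. A minimal sketch of the calculation is below; the observer ratings are hypothetical illustrations (Weber grades A/B/C), not data from the study.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labeling the same items.

    kappa = (p_observed - p_expected) / (1 - p_expected), where
    p_expected is the chance-agreement probability from each
    rater's marginal label frequencies.
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labeled the same.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's label distribution.
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    p_expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical example: two observers grading 10 ankle fractures A/B/C.
obs1 = ["B", "B", "A", "C", "B", "A", "B", "C", "B", "A"]
obs2 = ["B", "B", "A", "C", "A", "A", "B", "C", "B", "B"]
print(round(cohens_kappa(obs1, obs2), 2))  # → 0.68
```

On Landis and Koch's scale cited in the abstract, a kappa of 0.68 would fall in the "substantial" band (0.61 to 0.80), the range the study reports for Weber.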

Source: http://dx.doi.org/10.1053/j.jfas.2022.12.004

Publication Analysis

Top Keywords (frequency)

weber lauge-hansen: 16
interobserver intraobserver: 12
intraobserver reliability: 12
classification systems: 12
time constraints: 12
agreement inter-: 12
inter- intraobserver: 12
intraobserver variables: 12
weber: 8
lauge-hansen: 8
