
Midwives' visual interpretation of intrapartum cardiotocographs: intra- and inter-observer agreement

Category Ranking: 98%
Total Visits: 921
Avg Visit Duration: 2 minutes
Citations: 20

Article Abstract

Aim: This paper reports an examination of intra- and inter-observer agreement in midwives' visual interpretation of intrapartum cardiotocographs (CTGs).

Background: The issue of intra- and inter-observer agreement in CTG interpretation has serious implications for the validity of electronic fetal heart rate monitoring and for subsequent decisions on intrapartum management. However, no studies were found that assessed intra- and inter-observer agreement in midwives' interpretations of CTG tracings.

Methods: Twenty-eight midwives independently interpreted three intrapartum CTG tracings on two separate occasions using a self-administered Cardiotocograph Interpretation Skills Test. Inter-rater agreement in interpretation was assessed by cross-tabulating the two sets of raw data obtained at time 1 and time 2 and computing Cohen's Kappa (kappa). Intra-rater agreement was assessed by computing kappa for each rater with the two sets of raw data (time 1 and time 2) obtained from each individual. The data were collected in 2000.
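The agreement statistic used throughout the study is Cohen's kappa, which corrects observed agreement for the agreement expected by chance. A minimal sketch of that computation is below; the tracing classifications shown are hypothetical illustrative data, not the study's raw ratings.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two sets of categorical ratings of the same items:
    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and
    p_e is chance agreement from the raters' marginal frequencies."""
    assert len(ratings_a) == len(ratings_b) and ratings_a
    n = len(ratings_a)
    # Observed agreement: proportion of items classified identically.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: product of each category's marginal proportions.
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: one midwife's classifications of ten CTG
# tracings at time 1 and time 2 (intra-rater agreement).
t1 = ["normal", "suspicious", "pathological", "normal", "suspicious",
      "normal", "pathological", "suspicious", "normal", "normal"]
t2 = ["normal", "suspicious", "pathological", "suspicious", "suspicious",
      "normal", "pathological", "normal", "normal", "normal"]
print(round(cohens_kappa(t1, t2), 2))  # → 0.68
```

Here 8 of 10 classifications agree (p_o = 0.80) while chance alone would produce p_e = 0.38, giving kappa ≈ 0.68, in the "fair to good" band used in the Results below.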

Results: Overall intra-rater agreement ranged from 'fair to good' (kappa = 0.48) to 'excellent' (kappa = 0.92). Raters' classifications altered in 18% (n = 5) of cases for the normal tracing, in 29% (n = 8) for the suspicious tracing and in 11% (n = 3) for the pathological tracing. Inter-rater agreement was fair to good, with kappa statistics ranging from 0.65 to 0.74. Agreement was highest in the classification of decelerations (kappa = 0.79) and lowest in the assessment of baseline variability (kappa = 0.50). Overall inter-rater agreement was highest for the suspicious tracing (kappa = 0.77, excellent) and lowest for the normal tracing (kappa = 0.54, fair to good).

Conclusion: Inter- and intra-observer variability are intrinsic characteristics of the interpretation of intrapartum CTGs. Levels of agreement revealed degrees of variation that expose room for improvement. Efforts are needed to reduce inter- and intra-observer variation in interpretation of intrapartum CTG tracings. In addition, research should focus on the development and evaluation of non-invasive, low observer variability methods of intrapartum assessment of fetal well-being. The subjectivity of CTG interpretation and inconsistencies in interpretation should also be considered in intrapartum management, clinical audit and in medico-legal settings.

Source: http://dx.doi.org/10.1111/j.1365-2648.2005.03575.x

Publication Analysis

Top Keywords

interpretation intrapartum (16)
intra- inter-observer (16)
inter-observer agreement (16)
inter-rater agreement (12)
agreement (11)
interpretation (10)
kappa (10)
midwives' visual (8)
visual interpretation (8)
intrapartum (8)
