
Registered report protocol: Factors associated with inter-rater agreement in grant peer review | LitMetric

Category Ranking: 98%
Total Visits: 921
Avg Visit Duration: 2 minutes
Citations: 20

Article Abstract

Grant peer review processes are pivotal in allocating substantial research funding, yet concerns about their reliability persist, primarily due to low inter-rater agreement. This study aims to examine factors associated with agreement among peer reviewers in grant evaluations, leveraging data from 134,991 reviews across four Norwegian research funders. Using a cross-classified linear regression model, we will explore the relationship between inter-rater agreement and multiple factors, including reviewer similarity, experience, expertise, research area, application characteristics, review depth, and temporal trends. Our findings are expected to shed light on whether similarity between reviewers (gender, age), their experience, or expertise correlates with higher agreement. Additionally, we investigate whether characteristics of the applications, such as funding amount, research area, or variability in project size, affect agreement levels. By analyzing applications from diverse disciplines and funding schemes, this study aims to provide a comprehensive understanding of the drivers of inter-rater agreement and their implications for grant peer review reliability. The results will inform improvements to peer review processes, enhancing the fairness and validity of funding decisions. All data and analysis scripts will be publicly available, ensuring transparency and reproducibility.
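The notion of inter-rater agreement at the heart of the abstract can be illustrated with a minimal sketch: the mean absolute difference between every pair of reviewer scores for an application. The data, application identifiers, and 1–7 scoring scale below are hypothetical assumptions for illustration only, not taken from the study, which fits a cross-classified regression model rather than this simple statistic.

```python
from itertools import combinations
from statistics import mean

# Hypothetical review scores: application id -> scores from its reviewers
# (a 1-7 scale is assumed here; the study's actual scales may differ)
reviews = {
    "app-001": [5, 6, 5],   # reviewers largely agree
    "app-002": [2, 6, 4],   # reviewers diverge
    "app-003": [7, 7, 6],
}

def pairwise_disagreement(scores):
    """Mean absolute difference across all reviewer pairs for one application.

    Lower values indicate higher inter-rater agreement.
    """
    return mean(abs(a - b) for a, b in combinations(scores, 2))

for app, scores in reviews.items():
    print(f"{app}: disagreement = {pairwise_disagreement(scores):.2f}")
```

A factor analysis like the one described would then relate such per-application agreement values to reviewer and application characteristics; this sketch only shows how a raw agreement signal can be derived from the scores themselves.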


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC12124745
PLOS: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0322696

Publication Analysis

Top Keywords

inter-rater agreement (16)
peer review (16)
grant peer (12)
factors associated (8)
review processes (8)
study aims (8)
experience expertise (8)
agreement (7)
peer (5)
review (5)
