J Clin Epidemiol. 1997 Nov;50(11):1189-95.

How reliable is peer review? An examination of operating grant proposals simultaneously submitted to two similar peer review systems.

Author information: Heart and Stroke Foundation of Ontario, Toronto, Canada.

Abstract

To determine the level of agreement and correlation between two similar but separate peer review systems, proposals simultaneously submitted during the same funding year to two agencies using the same scoring system were identified and analyzed (n = 248). There was a direct linear relationship between the scores of the two agencies (r = 0.592, p < 0.001). Raw agreement within whole-digit ranges was moderate (53%), but Cohen's kappa indicated that agreement beyond chance was only fair (kappa = 0.29, 95% CI 0.198-0.382). When proposals were arbitrarily categorized as "clearly fundable" (score ≥ 3.0 on a 0-5 scale) or "not clearly fundable" (score < 3.0), raw agreement was 73% and agreement beyond chance was moderate (kappa = 0.444, 95% CI 0.382-0.552). In cases where the agencies disagreed on the fundability of a project, the difference in scores was greater than in cases where they agreed. In a subsample of 128 pairs, variables describing the application and the applicant (i.e., principal investigator) were coded, but none explained inter-agency agreement on the "fundability" of proposals.
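The kappa statistic reported above measures agreement corrected for chance: observed agreement minus the agreement expected from the raters' marginal rates, scaled by the maximum possible chance-corrected agreement. A minimal sketch of that calculation, using a hypothetical 2x2 contingency table (the counts below are illustrative, chosen only to total n = 248; they are not the study's actual data):

```python
# Illustrative Cohen's kappa computation for two raters (agencies) on a
# binary "fundable" / "not fundable" classification. The contingency
# counts are hypothetical, not taken from the paper.

def cohens_kappa(table):
    """Cohen's kappa from a square contingency table (rows = rater A,
    columns = rater B): (observed - expected) / (1 - expected)."""
    n = sum(sum(row) for row in table)
    # Observed agreement: proportion of cases on the diagonal.
    observed = sum(table[i][i] for i in range(len(table))) / n
    # Expected agreement: product of the raters' marginal proportions,
    # summed over categories.
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    expected = sum(r * c for r, c in zip(row_totals, col_totals)) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical counts (rows: agency A fundable/not; cols: agency B).
table = [[90, 30],
         [37, 91]]
print(round(cohens_kappa(table), 3))
```

With these made-up counts, raw agreement is (90 + 91) / 248, about 73%, while kappa is substantially lower (about 0.46), illustrating how the chance correction discounts agreement that the marginal funding rates alone would produce.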

PMID: 9393374 [Indexed for MEDLINE]
