Examiners and content and site: Oh My! A national organization's investigation of score variation in large-scale performance assessments

Adv Health Sci Educ Theory Pract. 2015 Aug;20(3):581-94. doi: 10.1007/s10459-014-9547-z. Epub 2014 Aug 28.

Abstract

Examiner effects and content specificity are two well-known sources of construct-irrelevant variance that present great challenges in performance-based assessments. National medical organizations that are responsible for large-scale performance-based assessments face an additional challenge, as they must administer qualification examinations to physician candidates at several locations and institutions. This study explores the impact of site location as a source of score variation in a large-scale national assessment used to measure the readiness of internationally educated physician candidates for residency programs. Data from the Medical Council of Canada's National Assessment Collaboration were analyzed using Hierarchical Linear Modeling and Rasch analyses. Consistent with previous research, problematic variance due to examiner effects and content specificity was found. Additionally, site location was identified as a potential source of construct-irrelevant variance in examination scores.

MeSH terms

  • Bias*
  • Clinical Competence* / statistics & numerical data
  • Educational Measurement / standards*
  • Female
  • Humans
  • Male
  • Models, Statistical
  • Physicians*