The validity and reliability of the sixth-year internal medicine examination administered at the King Abdulaziz University Medical College

BMC Med Educ. 2015 Feb 1;15:10. doi: 10.1186/s12909-015-0295-4.

Abstract

Background: Examinations are an essential component of the assessment of medical students' knowledge and skills during their clinical years of study. This paper provides a retrospective analysis of validity evidence for the internal medicine component of the written and clinical examinations administered in 2012 and 2013 at King Abdulaziz University's Faculty of Medicine.

Methods: Students' scores for the clinical and written exams were obtained. Four faculty members (two senior and two junior) rated the exam questions, including MCQ and OSCE items, for evidence of content validity on a 1-5 scale for each item. Cronbach's alpha was used to measure internal consistency reliability, and correlations were used to examine associations between different forms of assessment and between groups of students.
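(For reference, Cronbach's alpha is the standard internal-consistency coefficient; the expression below is the usual formulation and is not necessarily the exact computation the authors performed. For a test of k items,

\[
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^2_{Y_i}}{\sigma^2_X}\right),
\]

where \(\sigma^2_{Y_i}\) is the variance of scores on item i and \(\sigma^2_X\) is the variance of the total test score.)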

Results: A total of 824 students completed the internal medicine course and took the exam. The numbers of rated questions were 320 for the MCQ and 46 for the OSCE. Significant correlations were found between the MCQ section, the OSCE section, and the continuous assessment marks, which comprise 20 long-case presentations during the course; participation in daily rounds, clinical sessions, and tutorials; the performance of simple procedures, such as IV cannulation and ABG extraction; and the student log book. Although the OSCE was reliable for the two groups that took the final clinical OSCE, the long- and short-case clinical exams were not reliable across the two groups that took the oral clinical exams. The correlation analysis showed a significant linear association between the raters' judgments of content validity for both the MCQ and OSCE (r = .219, P < .001 and r = .678, P < .001, respectively) and of internal structure validity (r = .241, P < .001 and r = .368, P = .023, respectively). Reliability measured using Cronbach's alpha was greater for the assessments administered in 2013.

Conclusion: The pattern of relationships between the MCQ and OSCE scores provides evidence of the validity of these measures for evaluating knowledge and clinical skills in internal medicine. The OSCE is more reliable than the short- and long-case clinical exams and requires less effort on the part of examiners and patients.

MeSH terms

  • Competency-Based Education / organization & administration
  • Education, Medical / organization & administration*
  • Educational Measurement*
  • Female
  • Humans
  • Internal Medicine / education*
  • Male
  • Reproducibility of Results
  • Retrospective Studies
  • Saudi Arabia