A computer-based OSCE station to measure competence in evidence-based medicine skills in medical students

Acad Med. 2002 Nov;77(11):1157-8. doi: 10.1097/00001888-200211000-00022.

Abstract

Objective: To create a feasible, valid, and reliable tool to measure third-year medical students' skills in evidence-based medicine (EBM).

Description: EBM skills (asking clinical questions, finding appropriate medical information resources, and appraising and applying the evidence to patients) involve higher-order critical thinking abilities and are essential to being a competent physician. Students at our institution must pass a required OSCE at the end of their third year. As part of this exam, we developed a new 20-minute computer-based station to assess students' EBM skills. Using a specific case scenario, we asked the students to (1) ask a question using the population/intervention/comparison/outcome (PICO) framework; (2) generate appropriate search terms, given a specific question; and (3) select an appropriate abstract to answer a given question and state why two other abstracts were not appropriate. Before the assessment, we determined grading and passing criteria for each of the three components and for the station overall. Of the 140 students who completed the station, the percentages who passed the three components were 71%, 81%, and 49%, respectively; only 29% passed all three parts. Preliminary analysis of the station's psychometric properties shows very good to excellent interrater reliability, with 65%, 67%, and 94% agreement on the scoring of the components and kappas of 0.64, 0.82, and 0.94, respectively.
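The interrater statistics above can be computed directly from paired rater scores. The following minimal sketch (in Python; the pass/fail labels and the ten-student sample are hypothetical illustrations, not the study's data) shows how two-rater percent agreement and Cohen's kappa are obtained for a single component.

    # Two-rater percent agreement and Cohen's kappa for one station
    # component. The rater labels below are hypothetical pass/fail
    # scores for ten students, not the study's data.
    from collections import Counter

    def percent_agreement(r1, r2):
        """Fraction of students both raters scored identically."""
        return sum(a == b for a, b in zip(r1, r2)) / len(r1)

    def cohens_kappa(r1, r2):
        """Kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
        agreement and p_e is the agreement expected by chance from
        each rater's marginal label frequencies."""
        n = len(r1)
        p_o = percent_agreement(r1, r2)
        c1, c2 = Counter(r1), Counter(r2)
        labels = set(r1) | set(r2)
        p_e = sum((c1[lab] / n) * (c2[lab] / n) for lab in labels)
        return (p_o - p_e) / (1 - p_e)

    # Hypothetical scores: "P" = pass, "F" = fail
    rater1 = ["P", "P", "F", "P", "F", "P", "P", "F", "P", "P"]
    rater2 = ["P", "P", "F", "P", "P", "P", "P", "F", "F", "P"]
    print(f"agreement = {percent_agreement(rater1, rater2):.2f}")  # 0.80
    print(f"kappa     = {cohens_kappa(rater1, rater2):.2f}")       # 0.52

Because kappa discounts the agreement two raters would reach by guessing from their own base rates, it can be markedly lower than raw percent agreement, which is consistent with the pattern reported above (e.g., 65% agreement with a kappa of 0.64).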

Discussion: Although there are many curricula for teaching EBM concepts, there are few tools that measure whether students can competently apply their EBM skills. Our pilot station appears to be an innovative and promising tool for measuring several EBM skills independently. Because it is computer-based, the station is relatively simple to administer, grade, and evaluate. While preliminary data show good interrater reliability with a single case, future work will include further reliability testing and assessment of different types of cases. We will also use the results of this assessment to drive continuous improvement of our EBM curriculum. The students who completed this pilot station had not received an extensive formal EBM curriculum; future cohorts will. We also will explore whether scores on our station correlate with those on other OSCE stations that assess critical thinking skills, or with students' clinical grades or overall class standing. We hope to test two hypotheses: (1) the skills used in EBM are useful and valid measures of critical thinking abilities in learners, and (2) tools such as ours help measure these essential competencies.

MeSH terms

  • Clinical Competence*
  • Computers*
  • Curriculum
  • Evidence-Based Medicine / education*
  • Humans
  • Psychometrics
  • Students, Medical*