Pediatr Rheumatol Online J. 2019 Feb 8;17(1):5. doi: 10.1186/s12969-019-0308-7.

The pediatric rheumatology objective structured clinical examination: progressing from a homegrown effort toward a reliable and valid national formative assessment.

Author information

1. University of Colorado School of Medicine, Children's Hospital of Colorado, 13123 E 16th Avenue, Box 311, Aurora, CO, 80045, USA. megan.curran@childrenscolorado.org.
2. Ann & Robert H. Lurie Children's Hospital of Chicago, 225 E. Chicago Avenue, Box 50, Chicago, IL, 60611, USA.
3. Division of Rheumatology, Perelman School of Medicine, University of Pennsylvania, Children's Hospital of Philadelphia, 3401 Civic Center Blvd., Philadelphia, PA, 19104, USA.

Abstract

BACKGROUND:

Of 37 pediatric rheumatology fellowship training programs in the United States, many have three or fewer fellows at a given time, making large-scale assessment of fellow performance difficult. An objective structured clinical examination (OSCE) is a scenario-based simulation method that assesses individual performance, thus indirectly measuring training program effectiveness. This study describes the development and implementation of two national pediatric rheumatology OSCEs and methods used for programmatic improvement.

METHODS:

OSCEs for pediatric rheumatology fellows were held in 2009 and 2011 during national rheumatology meetings using scenarios and assessment forms originally developed by a fellowship program director. The seven scenarios tested medical knowledge, physical exam and interpersonal skills. Pediatric rheumatologist evaluators assessed fellows' performance using checklists and gave immediate feedback. Program directors were sent summaries of their fellows' performances. Fellows evaluated the OSCE, providing organizational and scenario improvement suggestions. Programmatic changes to the 2011 OSCE were based on 2009 performance data and program evaluation feedback.

RESULTS:

Twenty-two fellows participated in 2009 and 19 in 2011. Performance scores in similar scenarios did not change considerably over the two iterations. In 2009, 85.7% of participants reported a desire to change clinical behavior. Assessors' 2009 program evaluation data prompted changes in rating scales and removal of invalid or unreliable assessments. Negative evaluation data about individual stations decreased from 60% in 2009 to 15.4% in 2011. Fellows' ratings of the experience's overall value were similar in 2009 and 2011. Average experience ratings were lower among fellows who proposed scenario-specific improvements and higher among those who recommended organizational improvements.

CONCLUSIONS:

The 2011 examination exhibited programmatic improvement via reduction in fellows' scenario-specific negative feedback. Fellows' overall satisfaction did not change. Further work in scenario selection, assessment validation and inter-rater reliability will improve future pediatric rheumatology OSCEs.

KEYWORDS:

Assessment; Fellowship education; Medical education; Objective structured clinical examination; Program evaluation; Simulation

PMID: 30736800
PMCID: PMC6367759
DOI: 10.1186/s12969-019-0308-7
[Indexed for MEDLINE]
