Essential steps in developing best practices to assess reflective skill: A comparison of two rubrics

Med Teach. 2016;38(1):75-81. doi: 10.3109/0142159X.2015.1034662. Epub 2015 Apr 29.

Abstract

Purpose: Medical education lacks best practices for evaluating reflective writing skill. Reflection assessment rubrics include the holistic, reflection theory-based Reflection-on-Action and the analytic REFLECT developed from both reflection and narrative-medicine literatures. To help educators move toward best practices, we evaluated these rubrics to determine (1) rater requirements; (2) score comparability; and (3) response to an intervention.

Methods: One hundred forty-nine third-year medical students wrote reflections in response to identical prompts. Trained raters used each rubric to score 56 reflections, half written with structured guidelines and half without. We used Pearson's correlation coefficient to compare the rubrics' overall scores and independent t-tests to compare structured and unstructured reflections.
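The two analyses named above, Pearson's correlation between the rubrics' overall levels and an independent t-test on structured versus unstructured reflections, can be sketched with the standard library alone. The score arrays here are hypothetical illustrations, not the study's data, and the pooled-variance Student's t is an assumption about which t-test variant was used.

```python
# Stdlib-only sketch of the two analyses: Pearson's r and an independent t-test.
# All score arrays below are hypothetical; the study scored 56 reflections.
import math
from statistics import mean

def pearson_r(x, y):
    # Pearson's correlation: covariance over the product of standard deviations
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def t_independent(a, b):
    # Student's independent t with pooled variance (equal-variance assumption)
    na, nb = len(a), len(b)
    ma, mb = mean(a), mean(b)
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical overall rubric levels for the same set of reflections
roa_scores = [3, 4, 2, 5, 3, 4, 2, 3]
reflect_scores = [3, 3, 2, 4, 3, 5, 2, 3]
print(round(pearson_r(roa_scores, reflect_scores), 2))

# Hypothetical scores for reflections written with vs. without guidelines
structured = [4, 5, 4, 3, 5, 4]
unstructured = [3, 2, 3, 4, 2, 3]
print(round(t_independent(structured, unstructured), 2))
```

A positive t statistic here would indicate the structured group scored higher on average; in practice the p-value would come from the t distribution with n_a + n_b - 2 degrees of freedom.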

Results: Reflection-on-Action training required two hours; two raters attained an interrater reliability of 0.91. REFLECT training required six hours; three raters achieved an interrater reliability of 0.84. The correlation between overall rubric scores was 0.53. Students given structured guidelines scored significantly higher (p < 0.05) on both rubrics.

Conclusions: Reflection-on-Action and REFLECT offer distinct educational benefits and training challenges. Reflection-on-Action may be preferred for measuring overall quality of reflection given its ease of use. Training on REFLECT takes longer, but it yields detailed data on multiple dimensions of reflection that faculty can reference when providing feedback.

MeSH terms

  • Education, Medical / organization & administration*
  • Education, Medical / standards
  • Educational Measurement / methods*
  • Educational Measurement / standards*
  • Humans
  • Observer Variation
  • Psychometrics
  • Reproducibility of Results
  • Writing / standards*