Acad Emerg Med. 2011 Oct;18 Suppl 2:S110-20. doi: 10.1111/j.1553-2712.2011.01160.x.

Systems-based practice: Summary of the 2010 Council of Emergency Medicine Residency Directors Academic Assembly Consensus Workgroup--teaching and evaluating the difficult-to-teach competencies.

Author information

Department of Emergency Medicine, NorthShore University HealthSystem Research Institute (HD), NorthShore University HealthSystem, Evanston, IL, USA.



OBJECTIVES: The development of robust Accreditation Council for Graduate Medical Education (ACGME) systems-based practice (SBP) training and validated evaluation tools has been generally challenging for emergency medicine (EM) residency programs. The purpose of this paper is to report the results of a consensus workgroup session of the 2010 Council of Emergency Medicine Residency Directors (CORD) Academic Assembly with the following objectives: 1) to discuss current and preferred local and regional methods for teaching and assessing SBP and 2) to develop consensus within the CORD community, using the modified Delphi method, with respect to EM-specific SBP domains and to link these domains to specific SBP educational and evaluative methods.


METHODS: Consensus was developed using a modified Delphi method. Previously described taxonomy generation methodology was used to create an SBP taxonomy of EM domain-specific knowledge, skills, and attitudes (KSA). The process consisted of three steps: 1) an 11-question preconference survey, 2) a vetting process conducted at the 2010 CORD Academic Assembly, and 3) the development and ranking of domain-specific SBP educational activities and evaluation criteria for the specialty of EM.


RESULTS: Rank-order lists were created for preferred SBP education and evaluation methods. Expert modeling, informal small group discussion, and formal small group activities were considered the optimal methods for teaching SBP. Kruskal-Wallis testing showed that these top three items were rated significantly higher than self-directed learning projects and lectures (p = 0.0317). Post hoc permutation testing showed that expert modeling was rated significantly higher than formal small group activity (adjusted p = 0.028). Direct observation was the preferred evaluation method. Multiple barriers to training and evaluation were identified. We developed a consensus taxonomy of the domains felt to be most essential to, and reflective of, the practice of EM: multitasking, disposition, and patient safety. Learning formats linked to these domains were created, and specific examples of local best practices were collected. Domain-specific anchors of observable actions were created for the three domains.


CONCLUSIONS: This consensus process resulted in the development of a taxonomy of EM-specific domains for teaching SBP and of observable tasks for evaluating it. The concept of SBP is interlinked with the other general competencies and difficult to separate from them. Rather than developing specific SBP evaluation tools to measure the competency directly, SBP competency evaluation should be considered one element of a coordinated effort to teach and evaluate the six ACGME general competencies.

[Indexed for MEDLINE]