J Gen Intern Med. Nov 2008; 23(11): 1804–1808.
Published online Sep 4, 2008. doi: 10.1007/s11606-008-0766-y
PMCID: PMC2585665

Impact of an Evidence-Based Medicine Curriculum on Resident Use of Electronic Resources: A Randomized Controlled Study

Abstract

Background

Evidence-based medicine (EBM) is widely taught in residency, but evidence for effectiveness of EBM teaching on changing residents’ behavior is limited.

Objective

To investigate the impact of an EBM curriculum on residents’ use of evidence-based resources in a simulated clinical experience.

Design/Participants

Fifty medicine residents randomized to an EBM teaching or control group.

Measurements

A validated test of EBM knowledge (the Fresno test) was administered before and after the intervention. Post-intervention, residents twice completed a Web-based, multiple-choice instrument (15 items) comprising clinical vignettes, first without and then with access to electronic resources. Use of electronic resources was tracked using ProxyPlus software. Within-group pre–post differences and between-group post-test differences were examined.

Results

There was more improvement in EBM knowledge (100-point scale) for the intervention group than for the control group (mean score increase 22 vs. 12, P = 0.012). In the simulated clinical experience, the most commonly accessed resources were Ovid (accessed by 71% of residents) and InfoPOEMs (62%) for the EBM group and UpToDate (67%) and MDConsult (58%) for the control group. Residents in the EBM group were more likely than control residents to use evidence-based resources. Performance on clinical vignettes was similar between the groups both at baseline (P = 0.19) and with access to information resources (P = 0.89).

Conclusions

EBM teaching improved EBM knowledge and increased use of evidence-based resources by residents, but did not improve performance on Web-based clinical vignettes. Future studies will need to examine the impact of EBM teaching on clinical outcomes.

KEY WORDS: evidence-based medicine (EBM), changing residents’ behavior, EBM curriculum

INTRODUCTION

The ability to locate, appraise, and assimilate evidence from scientific studies is an essential skill for physicians, and has been defined as a core competency for graduate medical education by the Accreditation Council for Graduate Medical Education.1 As a result, there is increasing emphasis on teaching evidence-based medicine (EBM) at all levels of medical training.

Most medical schools and residency programs teach EBM in some form: the majority of US internal medicine residency programs have journal clubs2 and many offer a freestanding EBM curriculum.3 Despite widespread teaching of EBM, evidence for the effectiveness of EBM teaching is limited. A number of studies have examined whether EBM teaching can improve knowledge, attitudes, or critical appraisal skills,4 but data on more relevant outcomes, such as actual EBM use in clinical practice or impact on patient care, are limited.5 The few studies that have focused on these outcomes are limited by study design or by the use of self-reported outcome measures, which are subject to recall bias.6,7

EBM teaching has traditionally focused on critical appraisal of original articles. But critical appraisal is a time-consuming process, and studies show that most clinicians fail to look up answers to clinical questions due to time constraints.8 With the recent development of evidence-based synopses and summary resources, EBM experts are increasingly emphasizing the use of these pre-appraised resources as a practical tool to support evidence-based clinical practice.9

In this study, we developed an EBM curriculum that focuses on critical appraisal of original articles as well as use of evidence-based summary resources. We assessed the impact of this EBM curriculum on EBM knowledge and use of evidence-based resources. We also sought to determine if EBM teaching improves clinical performance as measured by scores on clinical vignettes. We hypothesized that residents in an EBM teaching group would have higher scores on a test of EBM knowledge than residents in a control group, use evidence-based resources more frequently, and have higher performance on clinical vignettes.

METHODS

We conducted a randomized controlled study of EBM teaching for internal medicine residents at the Robert Wood Johnson University Hospital, a teaching hospital of the Robert Wood Johnson Medical School, New Brunswick, NJ, USA. All second-year and third-year medicine residents (n = 50) were randomized to an EBM teaching group or a control group using a computer-generated randomization scheme, stratified by level of training. Residents assigned to EBM teaching participated in six 2-hour workshop sessions (12 hours total) during an elective month. The control group was exposed to some EBM teaching and principles through weekly resident journal clubs, but there was no formal or explicit curriculum. The institutional review board approved this study.
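To make the allocation procedure concrete, below is a minimal sketch of a computer-generated randomization stratified by training level. This is an illustration only, written in Python; the function name, roster, and seed are invented and are not the study's actual code.

```python
import random

def stratified_randomize(residents, seed=2007):
    """residents: list of (resident_id, pgy_level) tuples.
    Returns a dict mapping resident_id -> 'EBM' or 'control'."""
    rng = random.Random(seed)  # fixed seed makes the allocation reproducible
    assignment = {}
    for level in sorted({pgy for _, pgy in residents}):
        stratum = [rid for rid, pgy in residents if pgy == level]
        rng.shuffle(stratum)
        # Alternate assignment within each shuffled stratum so that
        # the two arms stay balanced at each training level.
        for i, rid in enumerate(stratum):
            assignment[rid] = "EBM" if i % 2 == 0 else "control"
    return assignment

# Example with a made-up roster of PGY-2 and PGY-3 residents
roster = [("R01", 2), ("R02", 2), ("R03", 3), ("R04", 3)]
print(stratified_randomize(roster))
# e.g. {'R01': 'control', 'R02': 'EBM', 'R04': 'EBM', 'R03': 'control'}
```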

The EBM curriculum was based on published studies and guides for EBM teaching.10–15 The first session focused on formulating clinical questions using the patient, intervention, comparison, outcome (PICO) method.15 During Session 2, medical librarians introduced residents to electronic evidence-based synopses and summary resources, including Ovid’s Evidence-Based Medicine Reviews (Cochrane Database of Systematic Reviews, ACP Journal Club, Database of Abstracts of Reviews of Effects), DynaMed, FirstConsult, BMJ Clinical Evidence, and InfoPOEMs. These resources were all accessible via the university library Website, with a few resources (DynaMed, FirstConsult, InfoPOEMs) grouped under a link called “point of care resources.” The remaining sessions focused on the topics of therapy, prevention, diagnosis, and prognosis. Each of these sessions was led by a resident who identified clinical questions from actual patient encounters and performed literature searches, using both bibliographic databases and evidence-based summary resources, to find articles that addressed the question. Residents then presented their findings with a critical appraisal of the original articles.

A validated test of EBM knowledge (the Fresno test of EBM) was administered before and after the intervention to both the EBM teaching and control groups.16 This written test, scored 0–100 with higher scores indicating better knowledge, measures knowledge of question formulation, search strategies, and critical appraisal. The pre-test was administered to all residents at the beginning of the academic year. Residents in the EBM teaching group completed the post-test on the last day of the EBM course. The post-test was administered to residents in the control group during the second half of the academic year.

In order to capture residents’ use of information resources in real time, we developed a simulated clinical experience by creating Web-based clinical vignettes. Clinical vignettes are a valid tool for measuring the quality of processes of care in clinical practice,17 and have been shown to correlate with assessment by unannounced standardized patients, an accepted “gold standard.”18 We selected 15 multiple-choice questions based on clinical vignettes from the Medical Knowledge Self-Assessment Program (MKSAP) questions CD.19 Study investigators (DM, LW, SK) chose questions to represent realistic but difficult clinical encounters and to allow an even distribution of questions on diagnosis (3 items), harm (2 items), prevention (3 items), prognosis (3 items), and treatment (4 items).

At the end of the academic year, all residents sequentially completed the 15 clinical vignettes in a computer lab, first without access to information resources (administration A) and then with access (administration B). Residents were not able to go back and change their previous answers. Use of electronic resources was tracked using ProxyPlus software (Fortech, Ltd., Litomysl, Czech Republic), which generated a list of all Websites each resident accessed. Residents were not aware that we were tracking their use of the Internet.

Electronic resources were categorized a priori into three groups. The first category consisted of evidence-based resources, which included bibliographic databases (PubMed, Medline) and select evidence-based summary resources that met criteria as defined by Haynes.9 A summary resource was categorized as an evidence-based resource if it specified and used an explicit process to select, include, and critically appraise all relevant literature to generate summaries and recommendations for clinical practice.9 These resources included ACP Journal Club, the Cochrane Database of Systematic Reviews, the Database of Abstracts of Reviews of Effects, FirstConsult, BMJ Clinical Evidence, InfoPOEMs, and DynaMed. The second category consisted of review resources, defined as resources that may have evidence-based content but do not use an explicit process to ensure that all relevant literature is selected, appraised, and included.9 These included electronic texts, AccessMedicine, eMedicine, MDConsult, StatRef!, and UpToDate. The third category consisted of non-medical resources, such as Google. The software we used to track Internet use could not distinguish among the various Ovid resources (Medline, ACP Journal Club, Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effects); because these were all either bibliographic databases or evidence-based synopses, they were categorized as evidence-based resources and counted as a single resource.
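As an illustration of how such an a priori categorization can be applied to proxy logs, here is a minimal sketch in Python. The hostnames, function name, and matching rules are assumptions for illustration; they are not the actual study mapping or the ProxyPlus log format.

```python
from urllib.parse import urlparse

# Illustrative host lists -- not the study's actual mapping.
EVIDENCE_BASED = {"gateway.ovid.com", "pubmed.gov", "dynamed.com",
                  "firstconsult.com", "clinicalevidence.bmj.com",
                  "infopoems.com"}
REVIEW = {"uptodate.com", "mdconsult.com", "emedicine.com",
          "accessmedicine.com", "statref.com"}

def categorize(url: str) -> str:
    """Map a logged URL to one of the three a priori categories."""
    host = urlparse(url).netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    if any(host == d or host.endswith("." + d) for d in EVIDENCE_BASED):
        return "evidence-based"
    if any(host == d or host.endswith("." + d) for d in REVIEW):
        return "review"
    return "non-medical"

print(categorize("http://www.uptodate.com/contents/search"))  # review
print(categorize("https://www.google.com/search?q=angina"))   # non-medical
```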

The primary outcome of this study was the frequency of evidence-based resource use. Secondary outcomes were performance on administration B of the clinical vignettes and post-test score on the Fresno test of EBM. Outcome assessors were blinded to resident assignment.

Differences between groups were compared with the use of Student’s t-test, the chi-square test, or Fisher’s exact test, as appropriate. Group means for variables with normal and non-normal distributions were compared with the use of Student’s t-test and the Mann–Whitney U test, respectively. Pearson correlation was used to examine the relationship between clinical vignette scores and time since the EBM course was taken.
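For concreteness, a minimal sketch of these analyses using Python with SciPy follows; all data values are invented for illustration and do not come from the study.

```python
from scipy import stats

# Invented numbers purely for illustration -- not the study data.
ebm_vignette  = [10, 11, 9, 12, 10, 11]  # administration B scores, EBM group
ctrl_vignette = [10, 9, 11, 10, 10, 9]   # administration B scores, control group

# Normally distributed scores: Student's t-test
t_stat, p_t = stats.ttest_ind(ebm_vignette, ctrl_vignette)

# Skewed counts of evidence-based resource use: Mann-Whitney U test
ebm_counts  = [5, 7, 4, 6, 5, 8]
ctrl_counts = [1, 0, 2, 1, 1, 3]
u_stat, p_u = stats.mannwhitneyu(ebm_counts, ctrl_counts,
                                 alternative="two-sided")

# Vignette score vs. weeks since the EBM course: Pearson correlation
weeks_since_course = [4, 12, 20, 28, 36, 44]
r, p_r = stats.pearsonr(weeks_since_course, ebm_vignette)

print(f"t-test P={p_t:.3f}; Mann-Whitney P={p_u:.3f}; "
      f"Pearson r={r:.2f} (P={p_r:.2f})")
```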

RESULTS

Twenty-five residents were randomized to the EBM teaching group, and 25 to the control group. Table 1 shows baseline characteristics of study participants. There were no significant differences between the groups. Baseline performance on the Fresno test of EBM was similar between the groups (mean 43.6 (SD = 15.1) for the EBM teaching group versus 40.3 (SD = 13.4) for the control group, P = 0.43). Post-test scores improved for both groups, but were significantly higher for the EBM teaching group (mean score increase 22 (SD = 13.8) for the teaching group vs. 12 (SD = 12.2) for the control group, P = 0.012).

Table 1
Participant Characteristics

At the end of the academic year, all 50 residents completed the Web-based clinical vignettes. Data were unavailable for four residents in the intervention group due to technical problems; these residents were excluded from the analysis. On average, residents in the EBM teaching group completed administration A of the test in 13:49 (SD = 3:40) minutes and residents in the control group completed it in 14:46 (SD = 3:04) minutes (P = 0.33). Administration B took an average of 27:04 (SD = 12:25) minutes for the EBM teaching group and 31:47 (SD = 11:04) minutes for the control group (P = 0.18). The within-group differences between administration A and B times were both significant (P < 0.0001).

All but one resident (in the control group) accessed electronic resources to complete administration B. Table 2 shows the electronic resources ever accessed by residents in each group. The most commonly accessed resources for the EBM teaching group were Ovid (accessed by 71% of residents) and InfoPOEMs (62%). The majority of the control group accessed UpToDate (67%) and MDConsult (58%). Non-medical resources, such as Google, were accessed frequently by residents in both groups (62% in the EBM group versus 50% in the control group, P = 0.43).

Table 2
Electronic Resources Ever Accessed by Residents

We examined resident use of resources based on the categories defined at the beginning of the study (Table 3). There were no differences in the number of times any resource was accessed for the 15 vignettes: median 17 (range 6–24) in the EBM teaching group and 16 (range 8–23) in the control group (P = 0.71). However, the median number of times any evidence-based resource was accessed was 5 in the EBM teaching group compared with 1 in the control group (P = 0.002). Use of review resources was higher in the control group (median 12 in the control group versus 8 in the EBM teaching group, P = 0.004). There was no significant difference in use of non-medical resources (P = 0.43).

Table 3
Median Frequency of Electronic Resource Use, by Category of Resource

Baseline performance (administration A) on the Web-based clinical vignettes (scored 0–15) was similar between the groups (mean score 8.9 (SD = 1.8) in the EBM teaching group versus 8.2 (SD = 2.2) in the control group, P = 0.19). With access to information resources, scores on administration B improved for both groups, but there was no significant difference between the groups (mean score 10.1 (SD = 2.2) in the EBM teaching group vs. 10.0 (SD = 2.2) in the control group, P = 0.89). In addition, for residents in the EBM teaching group, there was no significant correlation between administration B score and time since EBM teaching (r = 0.38, P = 0.71).

DISCUSSION

This study shows that an EBM curriculum focusing on critical appraisal skills and use of evidence-based resources significantly improved EBM knowledge and changed resident use of information resources. While performance on Web-based clinical vignettes was similar between the groups, residents in the EBM teaching group were more likely to use evidence-based resources.

To our knowledge, this is the first randomized controlled study to assess the impact of EBM teaching on residents’ use of information resources by directly tracking their use of the Internet. Prior studies were not randomized, assessed change in self-reported reading habits, or measured only the number of times residents used Medline, a limited assessment of electronic resource use.4,20 Our findings are consistent with observations from prior non-randomized studies showing that EBM teaching can improve EBM knowledge and increase use of evidence-based resources.

A surprising finding in our study was the frequent use of non-medical resources, such as Google, in both groups. Use of non-medical search engines by clinicians has been reported previously,21 but their frequency of use in clinical practice has not been studied. In this study, a high proportion of residents (56%) used non-medical search engines at least once. Given their ease of access and wide availability, search engines such as Google are an attractive resource and can often be an efficient way to locate an original study or information on new or rare products. In addition, a recent study investigated the utility of Google as a diagnostic aid and found that Google searches led to the correct diagnosis in 58% of cases.22 We were not able to assess the exact content of the information our residents found through these sources, but the frequent use of non-medical search engines by residents does raise some concerns, because information on such Websites is not critically appraised or filtered in any systematic manner, and the content of many evidence-based synopses and summary resources (most of which require subscription) is not available through Google.

Prior studies have suggested that EBM teaching may have a positive impact on patient care. A recent study found that an EBM training course was associated with improved rates of evidence-based therapy use, but it was limited by its before/after study design.6 Other studies have assessed the impact of real-time EBM rounds on patient care, but these studies lacked a control group, and the impact of EBM rounds was assessed by learner-perceived change in medical management rather than by actual patient outcomes.23,24

This study assessed the impact of EBM teaching on a simulated clinical experience using Web-based clinical vignettes. In our study, EBM teaching did not improve performance on the clinical vignettes. Several factors likely account for this finding. First, we chose questions that were thought to be difficult; if a resident did not understand a question, having access to resources may not have improved his or her score. This may be reflected in the lack of a dramatic score improvement between administrations A and B of the clinical vignettes. Second, although residents in the EBM teaching group did access evidence-based resources more frequently than the control group, the majority of their searches remained in review resources. This, plus the small number of questions administered, likely limited our ability to detect small changes in test scores. In addition, currently available evidence-based resources have limitations: bibliographic databases such as PubMed and Medline are difficult to navigate, and current evidence-based synopses and summary sources have small databases that cover limited content. Finally, because we wanted the clinical vignettes to mirror real practice, we selected questions that represented realistic clinical encounters, where there may not be evidence-based data to support clinical decisions or where answers may be found in multiple resources and not solely in an evidence-based resource. In fact, while resources such as UpToDate and eMedicine may not consistently use an explicit evidence-based review process, they often provide recommendations for patient management similar to those of evidence-based resources.

This study has additional limitations. First, this is a single-institution study with a limited number of participants, which limits generalizability. Second, because the Fresno test of EBM knowledge was administered at the conclusion of the EBM course, it is not possible to know whether the improved knowledge displayed by the EBM teaching group was sustained through the year. Third, the EBM curriculum introduced residents to some but not all available evidence-based summary resources (such as the Physicians’ Information and Education Resource (PIER) by the American College of Physicians). It is possible that use of other resources may have affected performance on the clinical vignettes. Finally, we assessed the impact of EBM teaching on clinical care by measuring performance on clinical vignettes rather than in actual patient encounters.

Our study has several strengths. First, this was a randomized controlled study. Second, because our study participants were not aware that we were tracking their use of the Internet, we believe our findings reflect their actual use of information resources in real practice. In addition, using Web-based clinical vignettes allowed us to assess individual search patterns without influence from other healthcare team members. In real clinical practice, residents almost always deliver care as part of a team (with faculty preceptors, ward attendings, or consultants); thus the use of actual patient encounters would not have allowed us to measure a resident’s individual search strategy.

In summary, EBM teaching can significantly improve EBM knowledge and increase use of evidence-based resources by residents. Future studies are needed to clarify the impact of EBM teaching on clinical performance and patient outcomes.

Acknowledgement

We thank John Ellingsworth for developing a computerized survey to administer the clinical vignettes, Robert Cupryk for library orientation to residents, and Fengzhi Fan for his assistance with tracking residents’ Internet use. Finally, we thank the internal medicine residents for their participation in this study.

Conflicts of Interest None disclosed.

Funding This study was funded in part by the Division of General Internal Medicine, Robert Wood Johnson Medical School.

References

1. Accreditation Council for Graduate Medical Education. General Competencies. (http://www.acgme.org/outcome/comp/compFull.asp). Accessed 8/13/08.
2. Sidorov J. How are internal medicine residency journal clubs organized, and what makes them successful? Arch Intern Med. 1995;155:1193–7. [PubMed]
3. Green ML. Evidence-based medicine training in internal medicine residency programs: a national survey. J Gen Intern Med. 2000;15:129–33. [PMC free article] [PubMed]
4. Coomarasamy A, Khan KS. What is the evidence that postgraduate teaching in evidence based medicine changes anything? A systematic review. BMJ. 2004;329:1017–21. [PMC free article] [PubMed]
5. Hatala R, Guyatt G. Evaluating the teaching of evidence-based medicine. JAMA. 2002;288:1110–2. [PubMed]
6. Straus SE, Ball C, Balcombe N, Sheldon J, McAlister FA. Teaching evidence-based medicine skills can change practice in a community hospital. J Gen Intern Med. 2005;20:340–3. [PMC free article] [PubMed]
7. Linzer M, Brown JT, Frazier LM, Delong ER, Siegel WC. Impact of a medical journal club on house-staff reading habits, knowledge, and critical appraisal skills. A randomized controlled trial. JAMA. 1988;260:2537–41. [PubMed]
8. Green ML, Ciampi MA, Ellis PJ. Residents’ medical information needs in clinic: are they being met? Am J Med. 2000;109:218–23. [PubMed]
9. Haynes RB. Of studies, syntheses, synopses, and systems: the “4S” evolution of services for finding current best evidence. ACP J Club. 2001;134:A11–A13. [PubMed]
10. Guyatt G, Rennie D, eds. Users’ guides to the medical literature. Chicago: AMA Press; 2002.
11. Green ML. Graduate medical education training in clinical epidemiology, critical appraisal, and evidence-based medicine: a critical review of curricula. Acad Med. 1999;74:686–94. [PubMed]
12. Green ML, Ellis PJ. Impact of an evidence-based medicine curriculum based on adult learning theory. J Gen Intern Med. 1997;12:742–50. [PMC free article] [PubMed]
13. Ross R, Verdieck A. Introducing an evidence-based medicine curriculum into a family practice residency: is it effective? Acad Med. 2003;78:412–7. [PubMed]
14. Smith CA, Ganschow PS, Reilly BM, Evans AT, McNutt RA, Osei A, et al. Teaching residents evidence-based medicine skills: a controlled trial of effectiveness and assessment of durability. J Gen Intern Med. 2000;15:710–5. [PMC free article] [PubMed]
15. Richardson WS, Wilson MC, Nishikawa J, Hayward RSA. The well-built clinical question: a key to evidence-based decisions. ACP J Club. 1995;123:A12–3. [PubMed]
16. Ramos KD, Schafer S, Tracz SM. Validation of the Fresno test of competence in evidence based medicine. BMJ. 2003;326:319–21. [PMC free article] [PubMed]
17. Norcini JJ, Swanson DB, Gross LJ, Webster GD. Reliability, validity, and efficiency of multiple choice question and patient management problem item formats in assessment of clinical competence. Med Educ. 1985;19:238–47. [PubMed]
18. Peabody JW, Luck J, Glassman P, Jain S, Hansen J, Spell M, et al. Measuring the quality of physician practice by using clinical vignettes: a prospective validation study. Ann Intern Med. 2004;141:771–80. [PubMed]
19. American College of Physicians. Prep for Boards 2, an enhancement to MKSAP. Philadelphia: American College of Physicians; 2005.
20. Cabell CH, Schardt C, Sanders L, Corey GR, Keitz SA. Resident utilization of information technology: a randomized trial of clinical question formation. J Gen Intern Med. 2001;16:838–44. [PMC free article] [PubMed]
21. Greenwald R. And a diagnostic test was performed. N Engl J Med. 2005;353:2089–90. [PubMed]
22. Tang H, Ng JHK. Googling for a diagnosis: use of Google as a diagnostic aid: Internet based study. BMJ. 2006;333:1143–5. [PMC free article] [PubMed]
23. McGinn T, Seltz M, Korenstein D. A method for real-time, evidence-based general medical attending rounds. Acad Med. 2002;77:1150–2. [PubMed]
24. Sackett DL, Straus SE. Finding and applying evidence during clinical rounds. The “Evidence Cart.” JAMA. 1998;280:336–8. [PubMed]
