J R Soc Med. Oct 1, 2008; 101(10): 493–500.
PMCID: PMC2586873

Tips for teaching evidence-based medicine in a clinical setting: lessons from adult learning theory. Part one

Kausik Das, Consultant Obstetrician & Gynaecologist (Locum),1 Sadia Malick, Honorary Clinical Lecturer,2 and Khalid S Khan, Professor of Obstetrics-Gynaecology and Clinical Epidemiology, Director WHO Collaborating Centre for Research Synthesis in Reproductive Health, Director of R&D and Honorary Consultant Obstetrician-Gynaecologist3

Summary

Evidence-based medicine (EBM) is an indispensable tool in clinical practice. Yet the teaching of EBM to trainee clinicians is, at best, patchy and fragmented. Clinically integrated teaching of EBM is more likely to bring about changes in skills, attitudes and behaviour. Provision of evidence-based health care is the most ethical way to practise, as it integrates up-to-date, patient-oriented research into the clinical decision-making process, thereby improving patients' outcomes. In this article we aim to dispel the myth that EBM is an academic and statistical exercise removed from practice, by providing practical tips for teaching the minimum skills required to ask questions and to identify and critically appraise the evidence, and by presenting an approach to teaching EBM within the existing clinical and educational training infrastructure.

Introduction

Evidence-based medicine (EBM), defined as the conscientious, explicit and judicious use of current best evidence in making decisions about the care of individual patients,1 provides clinicians with a means to integrate continuing education and practice improvement into the day-to-day work of their professional lives. Formally, it involves acquiring, systematically reviewing, appraising and applying research findings to aid the delivery of optimum clinical care to patients. It does not exclude clinical expertise but actively encourages the incorporation of research evidence into decision-making. Clinicians have always done this; the difference is that we now have much better tools (computers, the internet) with which to harness the evidence. EBM is increasingly one of the major driving forces in the NHS, with an impact on resource allocation, delivery and provision of healthcare, policy-making and research.2 Education and training, both postgraduate and continuing, therefore need to develop methods of teaching EBM to trainees.

Despite various criticisms, EBM provides us a way of keeping our practice up to date in the uncertain world of medicine.3 Not everybody needs to appraise evidence from scratch, but we all invariably need some skills in EBM to remain up-to-date with new research.4 The need to develop a curriculum outlining the minimum standard requirements for training health professionals in EBM is well recognized.5 The challenge is to engage the healthcare profession in learning EBM and making it part of routine clinical practice.

To teach EBM to trainees, it is crucial to keep the following principles of adult learning theory in mind – the trainees are responsible adults, and need to be taught as such:6

  • Adults will commit to learning when the goals and objectives are considered realistic and important to them. Application in the ‘real world’ is important and relevant to the adult learner's personal and professional needs.
  • Adult learners need to see that the professional development learning and their day-to-day activities are related and relevant.
  • Adult learners need direct, concrete experiences in which they apply the learning in real work.
  • Adult learning engages the ego. Professional development must be structured to provide support from peers and to reduce the fear of judgement during learning.
  • Adults need to receive feedback on how they are doing and the results of their efforts. Opportunities must be built into professional development activities that allow the learner to practice what they have learned and to receive structured, helpful feedback.
  • Adults need to participate in small group activities during learning to move them beyond understanding to application, analysis, synthesis and evaluation. Small group activities provide an opportunity to share, reflect and generalize learning experiences.
  • Adult learners come to learning with a wide range of previous experiences, knowledge, self-direction, interests and competencies. This diversity must be accommodated in the professional development planning.
  • Transfer of learning for adults is not automatic and must be facilitated. Coaching and other kinds of follow-up support are needed to help adult learners transfer learning into daily practice so that it is sustained.

There is empirical evidence that the outcomes of teaching EBM differ markedly between undergraduates and postgraduates, with smaller gains in knowledge among the postgraduates. Adult learning theory suggests that the determinants of learning in the two groups are different: postgraduate learning tends to be driven by self-motivation and relevance to clinical practice, whereas undergraduate learning is generally driven by external factors such as curriculum and examinations.7

Empirical and theoretical research suggests that there is a hierarchy of teaching and learning activities in terms of their educational effectiveness:

  • Level 1: interactive and clinically integrated activities
  • Level 2(a): interactive but classroom-based activities
  • Level 2(b): didactic but clinically integrated activities
  • Level 3: didactic, classroom or standalone teaching.8

Systematic reviews of teaching in EBM have shown that Level 3 activities improve only the knowledge base; it is the clinically integrated Level 1 activities that bring about changes in skills, attitudes and behaviour.9 Accordingly, in this two-part series we suggest a step-by-step approach to integrating the teaching of EBM into everyday clinical practice and training (Box 1).

Box 1: Some general tips for clinically integrating EBM teaching

  • Create an EBM oriented clinical environment
  • Emphasize the need for ‘EBM literacy’ in trainees' appraisals and evaluate it at the time of assessment
  • Role models are essential: unless trainees see their role models use EBM in practice, they are unlikely to value it as clinically important
  • Trainees' day-to-day clinical work-based education should refer to relevant trials and systematic reviews to show how research integrates into clinical practice; EBM teaching may therefore need to focus as much on teachers as on trainees
  • Create learning opportunities (e.g. journal clubs, case discussions, morbidity/mortality meetings) to engage trainees in communal EBM activity

Step 1: formulating a clinical question

The first step in using EBM is identification of a knowledge gap, followed by formulation of a clinical question. Box 3 provides a series of tips for teaching question-framing skills to trainees. The traditional method of bedside teaching involves exploring trainees' knowledge and exposing any deficiencies.

Box 3: Tips for teaching question framing

  • If trainees do not know the answers to questions, do not blame and shame: instead, use these knowledge gaps as learning opportunities
  • Encourage trainees to use the PICO structure to formulate questions
  • Select topics for question framing where you already know that good evidence exists to address the question (e.g. steroids for preterm birth, magnesium sulphate for pre-eclampsia / eclampsia)
  • Frame questions in any clinical setting: ward round, outpatient clinics, delivery suite or in theatre, and use educational prescriptions as a routine part of the round
  • Always close the learning loop by asking trainees to provide answers to the questions generated, in the not-too-distant future

More compassionate teachers provide ready-made answers to fill the knowledge gaps before moving to the next patient. This makes matters worse: passive learning, or spoon-feeding of knowledge, is short-lived, with the information being lost soon after the ward round. How can we break this cycle of inefficient teaching and learning? Start by treating knowledge gaps as learning opportunities. To have a lasting impact, senior clinicians should act as facilitators, encouraging trainees to learn actively. They should consciously foster a clinical environment in which the exposure of a knowledge gap is a cue for initiating the EBM steps outlined in Box 2. As the principles of adult learning theory suggest, this should be done sensitively, so that trainees, as adult learners, see it as a step towards their own professional development rather than as a judgemental or belittling exercise.

Box 2: Elements of EBM for clinical integration

  • Formulating clinical questions (identifying knowledge gap)
  • Tracking down the best evidence with which to answer that question (with input from clinical librarian if necessary)
  • Critically appraising that evidence for its validity (closeness to the truth), impact (size of the effect), reliability (precision) and applicability (potential for improving outcomes in clinical practice)
  • Integrating the findings of critical appraisal with clinical judgement taking into account clinical circumstances, choices and values of individual patients
  • Bringing about change – implementing the evidence from research into practice by executing the preceding four steps and seeking ways to improve evidence-based practice through audit

The first step in evidence-based practice is to recognize that identification of a knowledge gap should lead the trainee to actively seek answers. Initiating this process is not easy. The teacher needs to help construct an answerable question: without a clearly focused question, it is extremely difficult to find clinically useful answers to help with patient care. Questions arise regularly in busy clinical practice but are not always followed up, because of clinical commitments or time constraints. Teachers need an approach that rapidly documents the questions and allocates them to trainees to work on in their own time.

Figure 1 shows the PICO (Population, Intervention, Comparison, Outcome) structure, a simple tool for formulating a clearly focused clinical question. The same tool can also be used to build the search strategy, as described in the next section.

Figure 1
PICO Structure

PICO structures can be used as educational prescriptions, quickly handed to trainees during a ward round for them to follow up when an opportunity arises. For example, post-call rounds are a rich environment for generating clinical questions, but trainees are usually too exhausted during these rounds for this type of exercise. In those circumstances educational prescriptions can be used to record the question for ‘filling’ at a later date. An educational prescription not only specifies the clinical problem that generated the question, it also states the question in all of its key elements. It sets a time frame (taking into account the urgency of the clinical problem) and specifies who is responsible for answering it. A sample educational prescription is available at http://www.cebm.utoronto.ca/doc/edupres.pdf.10
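
To make the structure concrete for trainees, a PICO question can be captured as a simple record. The sketch below (Python, with an invented example on antenatal steroids for preterm birth; the class name and wording are ours, not part of any established EBM tool) shows how the four elements assemble into an answerable question.

```python
from dataclasses import dataclass

@dataclass
class PICOQuestion:
    """A clinical question broken into its PICO elements."""
    population: str
    intervention: str
    comparison: str
    outcome: str

    def as_question(self) -> str:
        # Assemble the four elements into a single focused question
        return (f"In {self.population}, does {self.intervention} "
                f"compared with {self.comparison} affect {self.outcome}?")

# Invented example: antenatal steroids for preterm birth
q = PICOQuestion(
    population="women at risk of preterm birth",
    intervention="antenatal corticosteroids",
    comparison="placebo or no treatment",
    outcome="the incidence of neonatal respiratory distress syndrome",
)
print(q.as_question())
```

Writing the question out in this form makes each missing element obvious, which is the point of the PICO discipline.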

Step 2: looking for evidence / searching for answers

Once trainees learn how to formulate a question, they need to learn what to search for, and how and importantly where to search for it. As we have seen in the principles of adult learning theory, trainees may come with varying degrees of prior experience of searching the evidence base. Searching for evidence is an art which trainees need to learn in order to get the optimum result by best using the limited time available. Teaching this art to trainees should be individualized, as those with prior experience may only need limited guidance whereas the less experienced need more in-depth teaching. Moreover, more experienced trainees should be encouraged to guide and help their peers in small group activities.

What to look for

Trainees should be encouraged to find evidence summaries or guidelines issued by respected professional organizations, prepared after consideration of the highest-quality evidence available. Where such guidelines are not available, trainees should be advised to look for relevant systematic reviews before turning to primary research. A systematic review focuses on a single question and identifies, selects, appraises and synthesizes all the randomized controlled trials relevant to that question; systematic reviews are considered the highest level of evidence. Where there are no guidelines, evidence summaries or systematic reviews, individual studies should be sought, working down the evidence levels (Table 1).

Table 1
Classification of evidence levels. Adapted from RCOG guidelines.11

How to look for it

Looking into the available evidence without an effective search strategy is extremely laborious and time-consuming. From the PICO structure, a number of terms can be selected for the search under each separate heading. There are two types of terms: natural-language terms and terms from a controlled vocabulary. The latter are used in some databases to describe or index the articles they register; in the Medline database, for example, these index terms are called Medical Subject Headings (MeSH). The selected terms can then be combined with ‘OR’ or ‘AND’ – in this context called Boolean operators. OR should be used to combine synonyms (terms with similar or related meanings), while AND should be used to combine the headings (Figure 2).
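
The OR-within-headings, AND-between-headings rule can be illustrated in a few lines of code. The fragment below is our own sketch, with invented search terms; it assembles a query string of the kind shown in Figure 2.

```python
def build_query(headings):
    """Combine synonyms with OR within each PICO heading,
    then join the headings together with AND."""
    clauses = ["(" + " OR ".join(terms) + ")" for terms in headings]
    return " AND ".join(clauses)

# Invented terms for a question on steroids and preterm birth
query = build_query([
    ["preterm birth", "premature labour"],
    ["corticosteroids", "betamethasone", "dexamethasone"],
    ["respiratory distress syndrome"],
])
print(query)
# (preterm birth OR premature labour) AND (corticosteroids OR
# betamethasone OR dexamethasone) AND (respiratory distress syndrome)
```

The same string, with controlled-vocabulary terms substituted where the database supports them, can be pasted into most bibliographic search interfaces.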

Figure 2
Formulating a search strategy from PICO

Where to search

Systematic reviews can be found in databases such as the Cochrane Library and Medline. The Cochrane Database of Systematic Reviews (CDSR) is a collection of systematic reviews produced by the Cochrane Collaboration with explicit methodology and editorial procedures.

PubMed is another widely used search engine; without the proper skills, however, searching PubMed can be a frustrating experience. Trainees should be encouraged to use the Clinical Queries section for more rewarding and time-efficient results; this is a short cut built into the database, with a filter dedicated to finding systematic reviews. Box 4 provides a variety of search tips for trainees.

Box 4: Tips for teaching searching for the evidence

  • Encourage trainees to contact the local medical librarian at an early stage
  • Rather than telling them what evidence exists, ask them to find relevant guidelines, systematic reviews or high-quality primary studies using the internet (including the RCOG website), the Cochrane Library, the Reproductive Health Library and PubMed
  • Ask trainees to write down a search strategy, using free-text and controlled-vocabulary terms relevant to the question combined with Boolean operators; encourage them to define the strategy using the PICO structure
  • Ask trainees to document their searches and the yield
  • Ask trainees to select relevant citations and retrieve the papers
  • Always close the learning loop: searching skills can usefully be assessed during the case-based discussions (CbD) that trainees undertake as part of their assessment
  • Acknowledge their effort and congratulate them on their progress

Step 3: critically appraise the evidence

Critical appraisal is the first step in transferring research knowledge into practice. It involves systematically examining research evidence to assess its validity, results, relevance, impact and applicability before using it to inform a decision. Randomized controlled trials and systematic reviews are the highest levels of evidence but they are not automatically of good quality and should always be appraised critically.

There is a misconception that critical appraisal is too mathematical and remote from clinical practice. One of the main challenges of teaching EBM is that it involves basic principles of epidemiology and statistics, both repellent to many doctors.3 But it need not be so: the facilitator should be aware of this attitude and encourage trainees to overcome their inhibitions. From adult learning theory we know that adults commit to learning when the goals and objectives are considered realistic and important to them. Trainees, as adult learners, need to see that the learning and their day-to-day activities are related, and that the learning is relevant. Setting up a journal club with the aim of presenting critically appraised topics or articles can be a great help at this stage. The experience of watching colleagues critically appraise a report with confidence may act as a strong incentive, not only to learn so that they can present a paper themselves, but also to see that critical appraisal is an achievable aim.

Three broad issues need to be considered when appraising research articles:

  1. Are the results valid?
  2. What do the results show?
  3. Are the results relevant?

Are the results of the study valid?

There are numerous quality checklists available but the following questions are the essential ones to ask of any randomized controlled trial:

  • Is the assignment of patients randomized?
  • Is the randomization concealed?
  • Aside from the experimental intervention, are the groups treated equally?
  • Are patients, health workers and study personnel ‘blind’ to treatment?
  • Are patients analysed in the groups to which they were randomized?
  • Is follow-up complete?

The questions to consider for systematic reviews are:

  • Types of articles reviewed
  • Databases searched
  • Specific criteria used for searching
  • Completeness of search
  • Inclusion/exclusion criteria for the studies
  • Inclusion of published and unpublished data
  • Homogeneity assessment

A poorly conducted study that scores badly on these questions loses its validity, and its results may not be worth considering. Trainees should be encouraged to go through a quality-assessment checklist first and look at the results only if the answers are satisfactory.

What are the results?

How large is the treatment effect?

The chance of a particular outcome occurring in an individual is called the event risk (the event may be something good or bad). By comparing the event risk in the experimental group with that in the control group, the relative risk can be calculated. A relative risk of 1 means the incidence is the same in both groups. If the intervention is expected to lead to more of the outcome measured (e.g. resolution of symptoms), the relative risk will be more than 1; if it is expected to lead to less of the outcome measured (e.g. a complication), the relative risk will be less than 1. The effectiveness of a therapy can thus be expressed as a relative risk.
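
As a worked example with invented numbers, the relative risk can be computed directly from the event counts in the two groups:

```python
def relative_risk(a, b, c, d):
    """Relative risk from a two-by-two table:
    a = events, b = non-events in the experimental group;
    c = events, d = non-events in the control group."""
    risk_experimental = a / (a + b)
    risk_control = c / (c + d)
    return risk_experimental / risk_control

# Invented trial: 20/100 events with treatment vs 40/100 with control
rr = relative_risk(20, 80, 40, 60)
print(rr)  # 0.5: the treatment halves the risk of the event
```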

How precise is the estimate of treatment effect?

There will always be some doubt about the result (the best estimate), because a trial studies only a sample of the population. The confidence interval indicates the range of doubt around the best estimate. Results are often presented as P values, which describe the probability that a result at least as extreme would have occurred by chance alone. If P<0.05, the result is described as statistically significant: it is unlikely to have arisen by chance.

‘Odds’ refers to the ratio of the number of people experiencing an event to the number not experiencing it. The odds ratio compares the odds for people in the experimental group with those in the control group; a value greater than 1 indicates that the event is more likely with the intervention than with the control. If a 95% confidence interval is calculated, statistical significance is assumed when the interval does not include 1.
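
The odds ratio and its approximate 95% confidence interval can be obtained from the same kind of two-by-two counts; the sketch below uses the standard large-sample method on the log scale, with invented numbers.

```python
import math

def odds_ratio_with_ci(a, b, c, d, z=1.96):
    """Odds ratio from a two-by-two table (a, b = events/non-events in the
    experimental group; c, d = the same in the control group), with an
    approximate 95% confidence interval via the log-odds standard error."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Invented trial: 20/100 events with treatment vs 40/100 with control
or_, lower, upper = odds_ratio_with_ci(20, 80, 40, 60)
print(f"OR = {or_:.2f}, 95% CI {lower:.2f} to {upper:.2f}")
# Here the interval excludes 1, so the difference is statistically significant
```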

Table 2a shows a two-by-two table used to compare experimental and control groups' outcomes, while Table 2b provides details of the statistical calculations which can be performed using the data in Table 2a.

Table 2a
A two-by-two table
Table 2b
Statistics and derived formulae from the two-by-two table

Converting research results into clinically understandable concepts

Number needed to treat (NNT) is a popular measure of the effectiveness of an intervention, as it is much easier to comprehend than most statistical descriptions. NNT can be calculated from raw data using a formula, from published odds ratios (using tables), or from the relative risk reduction and expected prevalence (using a nomogram). The simplest route is via the absolute risk reduction (ARR), which can be derived easily from a two-by-two table such as Table 2a, using the formula NNT = 1/ARR.

A 95% confidence interval for the NNT indicates that 19 times out of 20 the ‘true’ value will lie within the specified range.
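
With invented counts (20/100 events on treatment versus 40/100 on control), the NNT = 1/ARR formula given above can be applied directly to a two-by-two table:

```python
def number_needed_to_treat(a, b, c, d):
    """NNT from a two-by-two table (a, b = events/non-events with treatment;
    c, d = events/non-events with control), assuming the event is harmful
    so that fewer events with treatment is a benefit."""
    control_event_rate = c / (c + d)
    experimental_event_rate = a / (a + b)
    arr = control_event_rate - experimental_event_rate  # absolute risk reduction
    return 1 / arr

# Invented trial: 20/100 events with treatment vs 40/100 with control
nnt = number_needed_to_treat(20, 80, 40, 60)
print(round(nnt))  # 5: treat five patients to prevent one event
```

An NNT of 5 is far easier to discuss with a patient than the relative risk or odds ratio from which it is derived.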

Making sense of meta-analysis in a systematic review: Forest plot

In a systematic review, the meta-analysis is usually presented as a Forest plot (Figure 3), which can be readily interpreted using these concepts. A Forest plot allows readers to see at a glance the information from the individual studies that went into the meta-analysis. It provides a simple visual representation of the variation between the results of the studies, as well as an estimate of the overall result of all the studies together.13

Figure 3
An example of a Forest plot: the effect of steroids in preventing respiratory distress syndrome (RDS). Adapted from Roberts & Dalziel.12

The left-most column lists the studies (identified by the name of the first author and the year of publication), while the next two columns present the number of events in the intervention and control groups for each included study. In a typical Forest plot, the result (point estimate) of each included study is shown as a square, usually presented as a relative risk (RR) or odds ratio (OR). A horizontal line runs through each square to show its confidence interval, most commonly a 95% confidence interval.

The combined estimate from the meta-analysis and its confidence interval appear at the bottom, represented as a diamond. The centre of the diamond represents the combined point estimate, and its horizontal tips represent the confidence interval. The vertical line in the middle is the line of no effect. If a point estimate lies on the line of no effect (i.e. relative risk 1), that study has shown no difference in treatment effect between the two comparison groups. If the diamond – the combined result – crosses the line of no effect, the result is not statistically significant (i.e. the treatment has not been shown to be more effective than the control). Significance is achieved if the diamond is clear of the line of no effect. The horizontal axis at the bottom is the scale on which the overall treatment effect is measured.
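
The diamond summarizes a pooled estimate of the kind sketched below: a minimal fixed-effect (inverse-variance) meta-analysis on the log relative-risk scale, with invented study results. Real meta-analyses, such as those in the Cochrane Library, use more elaborate methods, so this is an illustration of the principle only.

```python
import math

def pooled_estimate(studies, z=1.96):
    """Fixed-effect (inverse-variance) pooling on the log scale.
    Each study is a (log_relative_risk, standard_error) pair."""
    weights = [1 / se ** 2 for _, se in studies]
    pooled = sum(w * lr for (lr, _), w in zip(studies, weights)) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    return pooled, pooled - z * se_pooled, pooled + z * se_pooled

# Invented log relative risks and standard errors for three trials
est, lower, upper = pooled_estimate([(-0.5, 0.2), (-0.3, 0.3), (-0.6, 0.25)])
# Back-transform to the relative-risk scale for reporting
print(f"pooled RR {math.exp(est):.2f}, "
      f"95% CI {math.exp(lower):.2f} to {math.exp(upper):.2f}")
```

Because the whole pooled interval lies below a log relative risk of 0 (i.e. an RR of 1), the diamond for these invented trials would sit clear of the line of no effect.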

Are the results relevant?

A study may be of good quality and its results significant, but are they really relevant: do they apply to the patient or population under consideration? Could the population sample, or the practice covered by the review, differ from the local population or practice? The senior clinician should always ask trainees this question, to evaluate the applicability of the research findings to the clinical care of an individual patient or to the local population.

Conclusion

In this article we have suggested some practical points to facilitate teaching of the first three steps of EBM (i.e. formulating a clinical question, searching for evidence and critical appraisal; Box 5). We have shown that it is possible to teach trainees the elements of EBM within the existing training and educational infrastructure by thoughtful application of the principles of adult learning theory. In the second article of this series, we shall explore strategies that may be helpful in teaching steps 4 and 5 of EBM – the clinical integration of evidence into everyday clinical practice.

Box 5: Tips for teaching critical appraisal

  1. Break elements of critical appraisal into three steps:
    1. Is the study valid?
    2. Are the results significant?
    3. Are the findings relevant?
  2. Break the habit of looking into the result before assessing quality of the research
  3. Help the trainee learn to work with figures
  4. Emphasize that rather than being a mathematical or statistical exercise, it is actually a method of bridging the gap between research and practice
  5. Allow sufficient time for the trainee to grasp the statistical concepts
  6. Always ask trainees to calculate NNT in order to derive a clinically understandable message
  7. Always close the learning loop; use the journal club as a platform for critical appraisal
  8. Avoid information overload
  9. Acknowledge their effort and congratulate them on their progress.

Practice points

  1. Every healthcare professional should have a basic understanding of evidence based medicine
  2. EBM should be part of trainees' regular educational activities
  3. Clinically integrated teaching of EBM is more likely to bring about changes in skill, attitude and behaviour
  4. Principles of adult learning should be applied to the teaching of EBM to trainees
  5. With a focused approach and proper guidance it is possible to teach and learn EBM even in a busy clinical setting
  6. Learning loops should always be closed for an optimum outcome.

Footnotes

DECLARATIONS —

Competing interests None declared

Funding None

Ethical approval Not required

Guarantor KD

Contributorship KD, SM and KSK jointly conceived the project. All three contributed to drafting the papers, with KD contributing the majority of Part 1 and SM the majority of Part 2. KSK supervised the project

Acknowledgements

None

References

1. Sackett DL, Rosenberg WM, Gray JA, Haynes RB, Richardson WS. Evidence based medicine: what it is and what it isn't. BMJ. 1996;312:71–2.
2. NHS Executive. Promoting clinical effectiveness: a framework for action in and through the NHS. Leeds: Department of Health; 1996.
3. Del Mar C, Glasziou P, Mayer D. Teaching evidence based medicine. BMJ. 2004;329:989–90.
4. Guyatt GH, Meade MO, Jaeschke RZ, Cook DJ, Haynes RB. Practitioners of evidence based care. BMJ. 2000;320:954–5.
5. Dawes M, Summerskill W, Glasziou P, et al. Sicily statement on evidence-based practice. BMC Medical Education. 2005;5:1.
6. Speck M. Best practice in professional development for sustained educational change. ERS Spectrum. 1996;Spring:33–41.
7. Knowles MS, Downie CM, Basford P. Teaching and assessing in clinical practice. London: University of Greenwich; 1998. pp. 23–38.
8. Khan KS, Coomarasamy A. A hierarchy of effective teaching and learning to acquire competence in evidenced-based medicine. BMC Medical Education. 2006;6:59.
9. Coomarasamy A, Khan KS. What is the evidence that postgraduate teaching in evidence-based medicine changes anything? A systematic review. BMJ. 2004;329:1017–9.
10. Educational prescription: http://www.cebm.utoronto.ca/doc/edupres.pdf
11. Royal College of Obstetricians & Gynaecologists Green Top Guidelines. Classification of evidence levels. London: RCOG;
12. Roberts D, Dalziel S. Antenatal corticosteroids for accelerating fetal lung maturation for women at risk of preterm birth. Cochrane Database of Systematic Reviews. 2006;3:CD004454.
13. Lewis S, Clarke M. Forest plots: trying to see the wood and the trees. BMJ. 2001;322:1479–80.

Articles from Journal of the Royal Society of Medicine are provided here courtesy of Royal Society of Medicine Press