BMJ. Jan 22, 2000; 320(7229): 224–230.

Evaluation of the effectiveness of an educational intervention for general practitioners in adolescent health care: randomised controlled trial

L A Sanci, fellow in adolescent health,a C M M Coffey, epidemiologist,a F C M Veit, adolescent physician,a M Carr-Gregg, director of training and education,a G C Patton, director,a N Day, principal research fellow,b and G Bowes, professorial fellowa



Objective

To evaluate the effectiveness of an educational intervention in adolescent health designed for general practitioners in accordance with evidence based practice in continuing medical education.


Design

Randomised controlled trial with baseline testing and follow up at seven and 13 months.


Setting

Local communities in metropolitan Melbourne, Australia.


Participants

108 self selected general practitioners.


Intervention

A multifaceted educational programme for 2.5 hours a week over six weeks on the principles of adolescent health care, followed six weeks later by a two hour session of case discussion and debriefing.

Outcome measures

Objective ratings of consultations with standardised adolescent patients recorded on videotape. Questionnaires completed by the general practitioners were used to measure their knowledge, skill, and self perceived competency, satisfaction with the programme, and self reported change in practice.


Results

103 of 108 (95%) doctors completed all phases of the intervention and evaluation protocol. The intervention group showed significantly greater improvements in all outcomes than the control group at the seven month follow up except for the rapport and satisfaction rating by the standardised patients. 104 (96%) participants found the programme appropriate and relevant. At the 13 month follow up most improvements were sustained, the confidentiality rating by the standardised patients decreased slightly, and the objective assessment of competence further improved. 106 (98%) participants reported a change in practice attributable to the intervention.


Conclusions

General practitioners were willing to complete continuing medical education in adolescent health care and its evaluation. The design of the intervention using evidence based educational strategies proved an effective and quick way to achieve sustainable and large improvements in knowledge, skill, and self perceived competency.

Key messages

  • Firm evidence shows that limitations in doctors' confidence, knowledge, and skills in adolescent health contribute to barriers in delivering health care to youth
  • Evidence based strategies in continuing medical education were used in the design of a training programme to address the needs of doctors and youth
  • The programme covered adolescent development, consultation and communication skills, health risk screening, health promotion, risk assessment of depression and suicide, and issues in management of psychosocial health risk including interdisciplinary approaches to care
  • Most interested doctors attended and completed the 15 hour training programme over six weeks and the evaluation protocol covering 13 months
  • Doctors completing the training showed substantially greater gains in knowledge, clinical skills, and self perceived competency than the controls; these gains were sustained at 12 months and further improved in the objective measure of clinical competence in conducting a psychosocial interview


The patterns of health need in youth have changed noticeably over the past three decades. Studies in the United Kingdom, North America, and Australia have shown that young people experience barriers to health services.1–5 With the increase in a range of youth health problems, such as depression, eating disorders, drug and alcohol use, unplanned pregnancy, chronic illness, and suicide, there is a need to improve the accessibility and quality of health services to youth.3,6

In the Australian healthcare system general practitioners provide the most accessible primary health care for adolescents.7 Yet Veit et al surveyed 1000 Victorian general practitioners and found that 80% reported inadequate undergraduate training in consultation skills and psychosocial diseases in adolescents and 87% wanted continuing medical education in these areas.4,8 These findings agreed with comparable overseas studies.9–11

Evidence based strategies in helping doctors learn and change practice are at the forefront of the design of continuing medical education.12–14 In response to the identified gap in training an evidence based educational intervention was designed to improve the knowledge, skill, and self perceived competency of general practitioners in adolescent health. We conducted a randomised controlled trial to evaluate the intervention, with follow up at seven and 13 months after the baseline assessment.

Participants and methods

The divisions of general practice are regional organisations that survey the needs of, and provide education for, general practitioners in their zone. There are 15 divisions in metropolitan Melbourne. Advertisements inviting participation in our trial were placed in 14 of the 15 divisional and state college newsletters and mailed individually to all division members. The course was free, and continuing medical education points were available. Respondents were sent details of the intervention and the evaluation protocol and asked to return a signed consent form. Divisions and doctors were excluded if they had previously received a course in adolescent health from this institution.


Consenting doctors were grouped into eight geographical clusters by practice location to minimise contamination and to maximise efficiency of the delivery of the intervention. Clusters (classes) of similar size were randomised to intervention or control by an independent researcher.


The box details the objectives, content, and instructional design of the multifaceted intervention. A panel comprising young people, general practitioners, college education and quality assurance staff, adolescent health experts, and a state youth and family government officer gave advice on the design.15 The curriculum included evidence based primary and secondary educational strategies such as role play with feedback, modelling practice with opinion leaders, and the use of checklists.12,16 The six week programme was delivered concurrently by LS, starting one month after baseline testing (see figure on website).

Goals, content, and instructional design of intervention in principles of adolescent health care for general practitioners

Intervention goals

  • To improve general practitioners' knowledge, skill, and attitudes in the generic concepts of adolescent health to effectively gain rapport with young people, screen them for health risk, and provide health promotion and appropriate management plans
  • To increase awareness of the barriers their practices may pose for youth access and how these may be overcome
  • To understand how other services can contribute to the management of young people and how to access these in their locality

Intervention content (weekly topics)

  • Understanding adolescent development, concerns, and current morbidities, the nature of general practice, and yourself
  • Locating other youth health services and understanding how they work, and medicolegal and ethical issues in dealing with minors
  • Communication and consultation skills and health risk screening
  • Risk assessment of depression and suicide
  • Detection and initial management of eating disorders

Instructional design

 Needs analysis

  • From previous surveys and informally at start of workshops

 Primary educational strategy

Workshops for 2.5 hours weekly for six weeks

  • Debriefing from previous session
  • Brief didactic overviews
  • Group problem based activities and discussion
  • Modelling of interview skills by opinion leaders on instructional video
  • Role play and feedback practice sessions with adolescent actors
  • Activities set to practise in intervening week
  • Individual feedback on precourse evaluation video

Course book

  • Goals, objectives, course requirements, and notes
  • Suggested further reading
  • Class or home activities with rationale for each

Resource book

  • Reading material expanding on workshop sessions

 Practice reinforcing and enabling strategies

  • Adolescent assessment chart for patient audit
  • Logbook for reflection on experience with the patients audited
  • Self assembled list of adolescent health services in local community
  • Availability of tutor (LS) by phone for professional support between workshops
  • Refresher session for group discussion of experiences in practice (six weeks after course)


Table 1 summarises the instruments used in the evaluation. Parallel strategies of objective and self reported ratings of knowledge, skill, and competency were used to ensure findings were consistent.17,18 Participants' satisfaction with the course and their self reported change in practice were evaluated at 13 months. Any other training or education obtained in adolescent health or related areas was noted.

Table 1
Evaluation measures, their content, inter item reliability, and intraclass correlation within randomisation groups estimated at baseline

Clinical skills

Seven female drama students were trained to simulate a depressed 15 year old exhibiting health risk behaviour. Case details and performances were standardised according to published protocols19–21 and varied for each testing period. Doctors were given 30 minutes to interview the patient in a consulting room at this institution. An unattended camera recorded the consultation on videotape.

The standardised patients were trained in the use of a validated rating chart21 assessing their own rapport and satisfaction and discussion about confidentiality. These were completed after the interview while still in role. They were blind to the intervention status of the doctors, and no doctor had the same patient for successive interviews.

Two independent observers, blind to participants' status, assessed the taped consultations in the three testing periods. A doctor in adolescent health coded three items in the scale relating to medical decision making. A trained non-medical researcher assessed all other items. The chart was developed from two validated instruments for assessment of adolescent consultations21 and general practice consultations.22,23 Marks for both competency and content of the health risk assessment were summarised into a percentage score. The same observers were used in all three testing periods.

Self perceived competency

Two questionnaires were developed for the doctors to rate both their comfort and their knowledge or skill with process issues, including the clinical approach to adolescents and their families and with substantive issues of depression, suicide risk assessment, alcohol and drug issues, eating disorders, sexual history taking, and sexual abuse. Doctors also rated their consultation with the standardised patient on a validated chart,21 itemising their self perceived knowledge and skill.


Knowledge

Knowledge was assessed with short answer and multiple choice items developed to reflect the workshop topics. The items were pretested and refined for contextual and content validity. The course tutor, blind to group status, awarded a summary score.


Statistical analysis was performed with Stata (Stata, Texas), with the individual as the unit of analysis. Factor analysis with varimax rotation was used to identify two domains within the comfort and self perceived knowledge or skill items: process and substantive issues. The internal consistency of all scales was estimated with Cronbach's α. Reproducibility within and between raters was estimated with one way analysis of variance, as was the intraclass correlation of baseline scores within each teaching group.
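The two scale diagnostics described above, internal consistency (Cronbach's α) and the one way analysis of variance intraclass correlation, can be sketched as follows. This is an illustration only: the trial's analysis was performed in Stata, and the function names and data here are hypothetical.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of scale totals
    return k / (k - 1) * (1 - item_vars / total_var)

def icc_oneway(groups):
    """One way ANOVA intraclass correlation from a list of per-group score arrays."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    k = len(groups)                               # number of groups (classes)
    n = sum(len(g) for g in groups)               # total subjects
    grand = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    # Adjusted average group size for unbalanced groups
    n0 = (n - sum(len(g) ** 2 for g in groups) / n) / (k - 1)
    return (ms_between - ms_within) / (ms_between + (n0 - 1) * ms_within)
```

In this framework an α near or above 0.7 is conventionally taken as satisfactory internal consistency, and a low intraclass correlation of baseline scores within teaching classes indicates that clustering by class contributed little variation at baseline.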

The effect of this intervention was evaluated by regression of gain scores (score at seven month follow up minus baseline score) on the intervention status, with adjustment for baseline and potential confounding variables. Robust standard errors were used to allow for randomisation by cluster. The sustainability of outcome changes in the intervention group between the assessments at seven months and 13 months was evaluated with paired t tests.
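The analysis described above, a gain-score regression on intervention status with standard errors robust to cluster randomisation, followed by paired t tests for sustainability, can be sketched in Python. The data below are simulated purely for illustration (the trial used Stata), and all variable names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(0)
n = 108

df = pd.DataFrame({
    "cluster": rng.integers(0, 8, n),     # eight geographical classes
    "baseline": rng.normal(50.0, 10.0, n),
})
# Randomisation was by cluster: here, even clusters -> intervention
df["intervention"] = (df["cluster"] % 2 == 0).astype(int)
# Simulated gain score (seven month score minus baseline): a 10 point
# intervention effect plus individual noise
df["gain"] = 10.0 * df["intervention"] + rng.normal(0.0, 5.0, n)

# Regression of gain on intervention status, adjusted for baseline,
# with cluster-robust (sandwich) standard errors
model = smf.ols("gain ~ intervention + baseline", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["cluster"]})
print(model.params["intervention"], model.bse["intervention"])

# Paired t test for sustainability: seven v 13 month scores,
# intervention group only
seven = rng.normal(70.0, 8.0, 54)
thirteen = seven + rng.normal(1.0, 4.0, 54)
t_stat, p_value = stats.ttest_rel(thirteen, seven)
print(t_stat, p_value)
```

The cluster-robust covariance is the key design choice: because whole classes, not individuals, were randomised, outcomes within a class may be correlated, and naive standard errors would overstate precision.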



Results

Newsletters and mailed advertisements to 2415 general practitioners resulted in 264 expressions of interest. Overall, 139 doctors gave written consent to be randomised. Attrition after notification of study status left 55 (73%) doctors in the intervention group and 53 (83%) in the control group, with an average of 13.5 (12 to 15) doctors in each class.

The age and country of graduation of the doctors in this study were similar to the national workforce of general practitioners.24,25 Female doctors were overrepresented (50% in this study versus 19% and 33% in the other reports). Table 2 describes the randomisation groups. There was imbalance in age, gender, languages other than English spoken, average weekly hours of consulting, types of practice, and college examinations.

Table 2
Demographic characteristics of general practitioners by intervention group. Numbers are percentages


One doctor dropped out of the intervention group. Overall, 44 doctors attended all six tutorials, eight missed one, and two missed three. In total, 103 of 108 (95%) participants at baseline completed the entire evaluation protocol (see website).


The evaluation scales showed satisfactory internal consistency and low association with class membership (table 1). Satisfactory interrater agreement was achieved on the competency scale (n=70, r=0.70). The intrarater consistency for both medical and non-medical raters was also satisfactory (n=20, r=0.80 and 0.91 respectively).

Effect of the intervention

Table 3 describes the baseline measures and the effect of the intervention at the seven month follow up. All analyses were adjusted for age, gender, languages other than English, average weekly hours of consulting, practice type, and college examinations. Doctors reporting education in related areas during follow up (67% control (34 of 51), 41% intervention (22 of 54)) were identified. The difference analysis was adjusted for this extraneous training and baseline score, although the extraneous training did not affect any outcomes. The study groups were similar on all measures at baseline. The intervention group showed significantly greater improvements than the control group at the seven month follow up in all outcomes except the rapport rating by the standardised patients.

Table 3
Multiple regression analyses of baseline and difference in scores on continuous outcome measures evaluating success of educational intervention at seven month follow up. Models include gender, age group, language other than English, type of practice, ...

The contextual validity and applicability of the course were assessed by 48 of 53 doctors and rated positively by 46 (96%).

Follow up of the intervention group at 13 months

The intervention effect was sustained in most measures and further improved in the independent rater's assessment of competence (table 4). The crude rating of the confidentiality discussion by the standardised patients deteriorated at the 13 month assessment but remained significantly greater than at baseline. Overall, 98% of the participants reported a change in practice, which they attributed to the intervention.

Table 4
Evaluation of change in unadjusted percentage scores for intervention group (n=54) from baseline to seven month follow up and from 7 month to 13 month follow up using paired t tests. Values are mean (95% CI) unless stated otherwise


Discussion

A course in adolescent health for six sessions designed with evidence based strategies in doctor education brought substantial gains in knowledge, skills, and self perceived competency of the intervention group of doctors compared with the control group, except for the rapport and satisfaction rating by the standardised patients. The changes were generally sustained over 12 months and further improved in the independent observer's rating of competence. Almost all participants reported a change in actual practice since the intervention.

These results are better than those reported in a review of 99 randomised controlled trials evaluating continuing medical education published from 1974 to 1995.12 Although over 60% of those trials had positive outcomes, the effects were small to moderate and usually in only one or two outcome measures. In keeping with the recommendations of this review we adopted a rigorous design, clearly defined our target population, and used multiple methods for evaluating competence. Perhaps more importantly, the intervention design incorporated three further elements: the use of evidence based educational strategies, a comprehensive preliminary needs analysis, and content validity of the curriculum ensured by the involvement of both young people and doctors.

The participants clearly represented a highly motivated group of doctors. This self selection bias was unavoidable but reflected the reality that only interested doctors would desire special skills in this domain and conforms to the adult learning principle of providing education where there is a self perceived need and desire for training.12,26,27 We therefore established that the intervention is effective with motivated doctors. It is generally accepted that doctors with an interest in a topic would already have high levels of knowledge and skill, with little scope for improvement. This was not the case in our study. Baseline measures were often low and improvements were large, confirming the need for professional development in adolescent health. The retention rate was excellent and possibly due, in part, to the role of a doctor in the design of the programme, in recruitment, and in tutoring.

Doubt remains as to whether improved competency in a controlled test setting translates to improved performance in clinical practice.28 High competency ratings are not necessarily associated with high performance, but low competency is usually associated with low performance.16,29,30

The rapport and satisfaction rating by the standardised patients was the only outcome measure apparently unresponsive to the intervention. Actors' ratings and character portrayal were standardised, and gender bias was controlled by using only actresses. Even with these precautions three actresses scored differently from the rest, one had fewer physician encounters, and the subjective nature of the rating scale probably contributed to large individual variation. A trend towards improvement in the intervention group was noted, but our study lacked sufficient power to find a difference. In other settings the validity and reliability of competency assessments with standardised patients have been shown to increase with the number of consultations examined.31,32 Pragmatically, it was not feasible to measure multiple consultations in our study.

Errors in interrater measurement were minimised by using the same raters for all three periods of testing. The independent observer and patient were blind to study status but may have recognised the intervention group at the seven month follow up because of the learnt consultation styles. Other measures of competency were included to accommodate this unavoidable source of error.

Our study shows the potential of doctors to respond to the changing health needs of youth after brief training based on a needs analysis and best evidence based educational practice. Further study should address the extent to which these changes in doctors' competence translate to health gain for their young patients.


We thank the participating doctors, Helen Cahill (Youth Research Centre, Melbourne University), Dr David Rosen (University of Michigan), and Sarah Croucher (Centre for Adolescent Health).


Funding: The Royal Australian College of General Practitioners Trainee Scholarship and Research Fund and the National Health and Medical Research Council.

Competing interests: None declared.


References

1. Donovan C, Mellanby AR, Jacobson LD, Taylor B, Tripp JH. Teenagers' views on the general practice consultation and provision of contraception. The adolescent working group. Br J Gen Pract. 1997;47:715–718.
2. Oppong-Odiseng ACK, Heycock EC. Adolescent health services—through their eyes. Arch Dis Child. 1997;77:115–119.
3. Ginsburg KR, Slap GB. Unique needs of the teen in the health care setting. Curr Opin Pediatr. 1996;8:333–337.
4. Veit FCM, Sanci LA, Young DYL, Bowes G. Adolescent health care: perspectives of Victorian general practitioners. Med J Aust. 1995;163:16–18.
5. McPherson A, Macfarlane A, Allen J. What do young people want from their GP? [Letter] Br J Gen Pract. 1996;46:627.
6. Bearinger LH, Gephart J. Interdisciplinary education in adolescent health. J Paediatr Child Health. 1993;29:10–5S.
7. Bennett DL. Adolescent health in Australia: an overview of needs and approaches to care. Sydney: Australian Medical Association; 1984.
8. Veit FCM, Sanci LA, Coffey CMM, Young DYL, Bowes G. Barriers to effective primary health care for adolescents. Med J Aust. 1996;165:131–133.
9. Blum R. Physicians' assessment of deficiencies and desire for training in adolescent care. J Med Educ. 1987;62:401–407.
10. Blum RW, Bearinger LH. Knowledge and attitudes of health professionals toward adolescent health care. J Adolesc Health Care. 1990;11:289–294.
11. Resnick MD, Bearinger L, Blum R. Physician attitudes and approaches to the problems of youth. Pediatr Ann. 1986;15:799–807.
12. Davis DA, Thomson MA, Oxman AD, Haynes RB. Changing physician performance. A systematic review of the effect of continuing medical education strategies. JAMA. 1995;274:700–705.
13. Davis DA, Thomson MA, Oxman AD, Haynes RB. Evidence for the effectiveness of CME. JAMA. 1992;268:1111–1117.
14. Oxman AD, Thomson MA, Davis DA, Haynes RB. No magic bullets: a systematic review of 102 trials of interventions to improve professional practice. Can Med Assoc J. 1995;153:1423–1431.
15. Owen JM. Program evaluation forms and approaches. St Leonards, NSW: Allen and Unwin; 1993.
16. Davis D, Fox R. The physician as learner. Linking research to practice. Chicago, IL: American Medical Association; 1994.
17. Greene JC, Caracelli VJ. Advances in mixed-method evaluation: the challenges and benefits of integrating diverse paradigms. New directions for evaluation, No 74. San Francisco: Jossey-Bass; 1997.
18. Masters GN, McCurry D. Competency-based assessment in the professions. Canberra: Australian Government Publishing Service; 1990.
19. Norman GR, Neufeld VR, Walsh A, Woodward CA, McConvey GA. Measuring physicians' performances by using simulated patients. J Med Educ. 1985;60:925–934.
20. Woodward CA, McConvey GA, Neufeld V, Norman GR, Walsh A. Measurement of physician performance by standardized patients. Refining techniques for undetected entry in physicians' offices. Med Care. 1985;23:1019–1027.
21. Rosen D. The adolescent interview project. In: Johnson J, editor. Adolescent medicine residency training resources. Elk Grove Village, IL: American Academy of Pediatrics; 1995. pp. 1–15.
22. The Royal Australian College of General Practitioners. College examination handbook for candidates 1996. South Melbourne: Royal Australian College of General Practitioners; 1996.
23. Hays RB, van der Vleuten C, Fabb WE, Spike NA. Longitudinal reliability of the Royal Australian College of General Practitioners certification examination. Med Educ. 1995;29:317–321.
24. Bridges-Webb C, Britt H, Miles DA, Neary S, Charles J, Traynor V. Morbidity and treatment in general practice in Australia 1990-1991. Med J Aust. 1992;157:1–57S.
25. The general practices profile study. A national survey of Australian general practices. Clifton Hill, Victoria: Campbell Research and Consulting; 1997.
26. Knowles M. The adult learner. A neglected species. Houston, TX: Gulf; 1990.
27. Ward J. Continuing medical education. Part 2. Needs assessment in continuing medical education. Med J Aust. 1988;148:77–80.
28. Norman GR. Defining competence: a methodological review. In: Neufeld VR, Norman GR, editors. Assessing clinical competence. New York, NY: Springer; 1985. pp. 15–35.
29. Rethans JJ, Strumans F, Drop R, van der Vleuten C, Hobus P. Does competence of general practitioners predict their performance? Comparison between examination setting and actual practice. BMJ. 1991;303:1377–1380.
30. Pieters HM, Touw-Otten FWWM, De Melker RA. Simulated patients in assessing consultation skills of trainees in general practice vocational training: a validity study. Med Educ. 1994;28:226–233.
31. Colliver JA, Swartz MH. Assessing clinical performance with standardized patients. JAMA. 1997;278:790–791.
32. Colliver JA. Validation of standardized-patient assessment: a meaning for clinical competence. Acad Med. 1995;70:1062–1064.

Commentary: Applying the BMJ's guidelines on educational interventions

Jean Ker, lecturer in medical education

In the western world, healthcare systems are facing enormous changes driven by political and economic forces and by rising consumer expectations of competent and consistent quality health care. In response to these changes, medical education has become an increasingly important aspect of every doctor's professional life. Publishers have responded by including papers on medical educational issues with increasing frequency. This move has, however, required the development of guidelines for evaluating papers on educational interventions.

This critique applies guidelines developed by the BMJ's education group, which were published in the BMJ on 8 May 1999.

Guideline 1: General overview

The commitment of the BMJ to publish more educational research makes the paper by Sanci et al an eminently suitable one for practising doctors interested in medical education.

Adolescent health care is challenging not only for general practitioners but for healthcare professionals involved in service delivery at all levels. This paper shows how successfully continuing medical education can be incorporated into changes in service delivery.

The principal steps of the educational intervention process are clearly outlined and can be generalised to other clinical settings, making it of interest to a wide readership. It contributes to the growing literature on evaluation of educational interventions in the general practice setting by attempting to show sustained changes in practice performance after a brief programme for continuing medical education.

The paper also follows the general style and guidelines for publication in the BMJ.

Guideline 2: Theoretical considerations

One of the purposes of the guidelines on evaluating educational interventions is to facilitate, through papers, readers' understanding of the teaching and learning process so that they can apply any relevant aspects to their own practice.

In relation to this, the goals of this educational intervention are well described in the context of Australian general practice. The educational rationale was, however, rather brief in its explanation. An expanded discussion on the strategies used could have covered advantages and disadvantages. Readers may be able to utilise some of the learning opportunities given, but their links to the goals were not explicit.

Guideline 3: Study presentation and design

A panel of stakeholders, including patients, was used to identify the content and design of the multifaceted intervention, which ensures the relevance of the intervention in terms of healthcare practice, and this was described in detail. The study design to ensure that standardised patients and observers were blind to the intervention status of the doctors is commendable.

In answering the questions posed in the guidelines some concerns with the design are raised.

The study is described as a randomised controlled study. A better and less misleading description would have been simply a randomised study, as it is often difficult to eliminate contaminants in an educational intervention. In fact the imbalances described in type of practice, language spoken, age, and college examinations taken call into question the positive outcomes reported in the study.

The lack of a pretest to identify whether the two groups were comparable in terms of knowledge also brings into question the final interpretation of the intervention. Purposive sampling based on a pretest and the variables described above would have been more appropriate and would have lent more meaning to the outcome.

The statistical analysis is clearly shared with the reader and well described. The use of a multifaceted evaluation system using recognised validated instruments reflects the guidelines for evaluating papers on educational interventions.

Guideline 4: Discussion

The discussion was structured in accordance with the guidelines, with a clear statement of the principal findings. The sustainability of the intervention could, however, have been highlighted as it was a significant finding. The strengths and weaknesses of the study in relation to selection bias were well debated and justified.

The discussion in relation to other studies was, however, only briefly addressed, referring to only one systematic review of strategies for continuing medical education. This could have been expanded to support some of the findings, particularly in relation to the rapport and satisfaction of the standardised patients as a measurement of outcome.

The discussion did not explore the implications for clinicians, other than to indicate a need for assessing the health gain for patients from such interventions, and it did not discuss the difficulties of cost benefit analysis.

The guidelines on evaluating educational interventions as applied to this paper enabled the reviewer to systematically address all relevant aspects of the intervention. What is not clear is how much weighting should be placed on each guideline in relation to deciding whether the article should be published or not.


website extra: The sample size calculation and a chart showing the flow of participants through the trial appear on the BMJ's website www.bmj.com
