Health Educ Res. 2008 Dec; 23(6): 976–986.
Published online 2008 Jun 16. doi: 10.1093/her/cyn029
PMCID: PMC2583909

Process evaluation results from a school- and community-linked intervention: the Trial of Activity for Adolescent Girls (TAAG)


Process evaluation is a component of intervention research that evaluates whether interventions are delivered and received as intended. Here, we describe the process evaluation results for the Trial of Activity for Adolescent Girls (TAAG) intervention. The intervention consisted of four synergistic components designed to provide supportive school- and community-linked environments to prevent the decline in physical activity in adolescent girls. Process evaluation results indicate that the intervention components were delivered by intervention staff to teachers with high fidelity (84–97%) to the protocol and with lower fidelity (range: 18–93%) from teachers to students. Physical activity programs for girls, a unique feature of the TAAG intervention, increased from a mean of 10 programs per school to means of 16 and 15 in years 1 and 2, respectively, in intervention schools, with no change in control schools. These findings suggest that a multicomponent school- and community-based physical activity intervention can be delivered with fidelity and can result in a middle school environment that supports physical activity for girls.


Health education and behavior change intervention programs are often complex. They may include multiple components that target individuals and physical and social environments, and they may be conducted in multiple locations with target populations that have unique characteristics and needs. These complex characteristics necessitate a thorough process evaluation that assesses factors indicating whether an intervention was delivered and received as intended [1, 2]. Process evaluation offers the potential to monitor and assure the quality of intervention implementation and provides information on the depth and breadth of intervention implementation and adherence, secular trends and potential contamination of the control group. Typically, process evaluation includes evaluating the intervention by measuring dose (the amount of intervention delivered), reach (the number of intended recipients who received the intervention) and fidelity (the quality of the intervention delivered).

Process evaluation is particularly important in explaining the complexities of school-based interventions by documenting dose, reach and fidelity. In addition, school-based intervention studies often use a train-the-trainer model; that is, school personnel are trained by research study staff to deliver interventions in school settings [3–8]. The train-the-trainer model adds challenges to the delivery of the intervention. School personnel may be required to deliver the intervention as part of their job but may lack commitment to the intervention goals. In addition, these trained school personnel have limited control over students' receptivity to interventions, and their ability to deliver an intervention may be compromised by school district requirements or other factors within and pertaining to the school setting. Teachers' belief in the intervention, their enthusiasm and motivation in delivering it, and their ability to model the behavior of interest and present a behavior change curriculum may all contribute to students' receptivity to the intervention [9].

We describe the process evaluation methods and results for the intervention components of the Trial of Activity for Adolescent Girls (TAAG). We specifically examined dose, fidelity and reach of the first 2 years of intervention.

Overview of TAAG

The National Heart, Lung and Blood Institute (NHLBI) sponsored TAAG, a group-randomized trial of 36 middle schools, to develop and test a school- and community-based intervention to prevent the decline in physical activity in middle school girls [10], building upon insights gained in previous school-based interventions [2–9, 11–14]. TAAG was conducted at six university-based field sites representing diverse geographic locations and populations: the Universities of Arizona, Maryland, Minnesota and South Carolina, San Diego State University and Tulane University [17]. The trial was coordinated by the Collaborative Studies Coordinating Center of the University of North Carolina at Chapel Hill in partnership with NHLBI. Outcome results of TAAG are reported elsewhere [18].

TAAG intervention framework and components

The social–ecological model [15] was the conceptual framework that guided the TAAG intervention, which consisted of four major components designed to provide supportive environments to reduce the decline in girls’ physical activity [18]. (i) TAAG physical education (PE): PE teachers attended workshops and received instructional materials and regular on-site support to conduct lessons that encouraged active participation of girls during PE classes and to promote out-of-class physical activity. (ii) TAAG health education: health education, PE, science or homeroom teachers attended workshops and received materials to teach a series of six lessons that promoted development of behavioral skills associated with physical activity. Each health education lesson included an activity challenge (i.e. homework) in which students monitored a behavior and set goals to increase their activity. (iii) TAAG physical activity programs: collaborations were created between schools, community agencies and TAAG university staff to increase girl-focused physical activity programs outside of PE classes. (iv) TAAG promotions: social marketing efforts that included posters, flyers and special activities were launched to encourage overall physical activity and promote TAAG-specific programs. Program champions (i.e. school and/or community staff who took ownership of the program) were recruited and trained during the second intervention year, and they directed the intervention to enhance its sustainability in the third year.

Intervention goals were identified to indicate optimal intervention implementation. Goals varied by component but essentially were set at 100% fidelity for delivery of the intervention by TAAG staff to teachers and 80% fidelity for delivery by teachers to students. Fidelity was defined as the consistency between established protocols and implementation. Reach goals (the level of participation by the target group) were for 100% of girls in the appropriate grades to receive TAAG PE and health education, for 60% to participate in the health education activity challenges and for attendance at TAAG physical activity programs to increase systematically by at least 5% each semester.

Process evaluation for TAAG

Process evaluation research for TAAG was theoretically based and designed to take a broad approach, consistent with the purposes outlined by Baranowski and Stables [19] and Steckler and Linnan [20]. In addition to evaluating dose, reach and fidelity, we also assessed environmental factors and used process evaluation for quality control purposes [19, 20]. Specifically, the objectives were to (i) evaluate the implementation, or delivery, of the intervention (i.e. dose and fidelity); (ii) evaluate the extent to which the intervention reached the intended targets and the degree to which the targets were exposed to the intervention components (i.e. reach and exposure); (iii) document environmental factors that may influence intervention effectiveness (i.e. context, contamination and secular trends) and (iv) provide periodic quality control information to intervention planners so that the intervention components could be refined to optimize their implementation and effectiveness (e.g. enhance dose, fidelity, reach and exposure). Intervention acceptability predicts continued use of intervention strategies [21]; thus, student enjoyment and teacher acceptability also were assessed.


Quantitative and qualitative methods, including structured observations, questionnaires, semi-structured interviews and completion logs, were used to collect process evaluation data. Instruments were developed iteratively with members of intervention planning groups, similar to the method used in developing TAAG formative assessment instruments [22]. Instruments were tested during the intervention pilot; reliability and validity were determined, and instruments were revised prior to the main trial intervention. Measures were constructed to determine the degree to which each intervention component's objective was met.

TAAG PE and health education

Intervention activities targeted for process evaluation were (i) staff development workshops delivered by TAAG staff and (ii) lesson content delivered by the teachers. Staff development workshops for both PE and health education were evaluated using attendance logs (dose and reach) and workshop observations that assessed whether the workshop material was delivered as intended (fidelity). Adaptation of PE classes to meet TAAG objectives and implementation of health education lessons were assessed through structured observations by TAAG staff throughout the academic year and teacher surveys at the end of the school year (dose, fidelity and acceptability). Inter-rater reliability for individual items of the lesson observation instruments ranged from kappa = 0.40 to 1.00. Teacher interviews assessed completion of the health education activity challenges (reach).
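The inter-rater reliability statistic reported above is Cohen's kappa, which corrects raw agreement between two observers for the agreement expected by chance. The following is an illustrative sketch only; the ratings are hypothetical, not TAAG data, and the function implements the standard two-rater kappa formula.

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters coding the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is chance agreement derived from each rater's marginal
    category frequencies.
    """
    n = len(rater1)
    # Observed proportion of items on which the two raters agree.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement from the raters' marginal distributions.
    categories = set(rater1) | set(rater2)
    p_e = sum((rater1.count(c) / n) * (rater2.count(c) / n)
              for c in categories)
    if p_e == 1:  # both raters used a single category identically
        return 1.0
    return (p_o - p_e) / (1 - p_e)

# Hypothetical binary fidelity codes (1 = lesson component observed taught).
obs_a = [1, 1, 0, 1, 0, 1, 1, 0]
obs_b = [1, 0, 0, 1, 0, 1, 1, 1]
print(round(cohens_kappa(obs_a, obs_b), 3))  # moderate agreement
```

A kappa of 0.40, the low end of the range reported for the TAAG observation items, is conventionally read as moderate agreement; 1.00 is perfect agreement.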

TAAG programs for physical activity

The most innovative component of the TAAG intervention was to create links between community agencies, other community members and schools to provide activity programming for girls outside of PE class. To determine the existence and nature of these relationships, interviews were conducted with principals at all schools in the spring of each year. Principals were asked if their school partnered with other groups to provide physical activity programs, and if so, the types of programs that had resulted from the partnership.

The number, type and participation of girls in school-based physical activity programs were documented from two sources. One source included both TAAG and non-TAAG programs in intervention schools, and all programs available to girls in control schools. A survey, adapted from an instrument developed for the Middle School Physical Activity and Nutrition Trial [11], was conducted each spring with sponsors of physical activity programs that were either held at the school site or held off school grounds, but sponsored by the school.

A second data source collected information specifically on TAAG programs in intervention schools. TAAG process evaluators completed forms that documented TAAG programs and included information on program type, duration in weeks, number of sessions per week and session duration (dose). Number of attendees was tallied by the program instructor and given to the process evaluator (reach). No names of attendees were collected. A random sample of TAAG programs was chosen each semester (n = 2 per school in Semester 1, increasing by one program per school each semester), and participants were given an anonymous survey to assess program acceptability (enjoyment), during a session approximately midway through the program.

TAAG promotions

Exposure to promotional materials was assessed through the student questionnaire administered as part of the TAAG measurement protocol. In the spring semester of the second year of the TAAG intervention, 120 eighth grade girls randomly selected from each school were invited to participate in the TAAG main outcome measures (i.e. physical activity and body composition assessment and psychosocial questionnaires, which included questions on exposure to TAAG promotional messages) [16]. Student participation in special events and physical activity promotions was assessed through participation records (reach).


The process evaluation data comprised observations, questionnaires, semi-structured interviews and completion logs describing the characteristics of students, teachers, classes and school or community environments. All analyses took into account the expected positive intraclass correlation among responses for students, teachers and classes in the same school and among school- or community-level responses within the same site [23]. SAS Proc Mixed [24] and SAS Proc Glimmix were used to model continuous and dichotomous response measures, respectively, with random effects for school and site to account for the correlated nature of the data. Race was included as a fixed effect in analyses of girl-level data to control for differences in the response measure by race/ethnicity. For all tests, statistical significance was set at the 0.05 level. All statistical analyses were conducted using SAS software version 9.1.3 (SAS Institute, Cary, NC, USA).
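The mixed-model specification described above can be sketched in SAS syntax. This is an illustrative reconstruction, not the actual TAAG analysis code: the dataset and variable names (taag, fidelity_pct, taught, race, school, site) are assumptions, and schools are modeled as nested within field sites to match the random effects described in the text.

```sas
/* Sketch only: dataset and variable names are assumed, not from TAAG. */

/* Continuous response (e.g. a percent-fidelity score): PROC MIXED. */
proc mixed data=taag;
  class site school race;
  model fidelity_pct = race / solution;     /* race as a fixed effect */
  random intercept / subject=site;          /* site-level variation   */
  random intercept / subject=school(site);  /* schools nested in sites */
run;

/* Dichotomous response (e.g. lesson taught yes/no): PROC GLIMMIX. */
proc glimmix data=taag;
  class site school race;
  model taught(event='1') = race / dist=binary link=logit solution;
  random intercept / subject=site;
  random intercept / subject=school(site);
run;
```

The nested random intercepts are what absorb the positive intraclass correlation among responses from the same school and the same site; omitting them would understate standard errors for school- and site-clustered data.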


TAAG PE and health education

The first level of intervention implementation was for TAAG university staff to deliver staff development workshops to teachers at the intervention schools. As displayed in Table I, dose, reach and fidelity were high (range 84–97%) in both years. Workshop trainings were attended by >85% of teachers, with the remainder attending make-up sessions. Nearly 90% of the full-day PE and health education workshop content in years 1 and 2 was fully covered by TAAG university staff.

Table I.
Implementation of staff development workshops, years 1 and 2a

Table II displays the dose, acceptability and fidelity of teacher implementation of TAAG PE. From year 1 to year 2, acceptability of TAAG PE concepts significantly increased in one aspect: the amount of change teachers made based on TAAG. Fidelity significantly increased by the second intervention year in four of the seven PE objectives. Intervention goals of at least 80% fidelity were reached for two of the seven objectives measured in year 1 and for three objectives in year 2. Compared with control schools in year 2, intervention schools were more likely to use strategies to minimize management time (P = 0.03).

Table II.
Implementation of TAAG PE by teachers, years 1 and 2, spring semester only

Over 90% of the TAAG health education lessons were taught in both years at all of the schools (Table III). Observations indicated that the lesson components were partially or completely taught during 76 and 64% of observations during years 1 and 2, respectively. Sixty-two percent of the activity challenges were completed by the girls each year, meeting the intervention goal of 60%.

Table III.
Implementation of TAAG health education by teachers, years 1 and 2

TAAG physical activity programs

Based on interviews with principals, at baseline 44% of intervention schools and 44% of control schools reported community collaborations for physical activity programs (data not shown). This increased to 83% of intervention schools in intervention years 1 and 2, with no increase in control schools (Table IV). Based on surveys of physical activity program leaders at intervention and control schools, at baseline there were on average 10.3 ± 5.4 and 10.2 ± 3.6 physical activity programs in intervention and control schools, respectively (data not shown). The number of physical activity programs was significantly greater in the intervention schools compared with control schools at the end of the first intervention year and approached significance in the second year (Table IV).

Table IV.
Implementation of programs for physical activity intervention component, including school–community collaborations and TAAG programs, Semesters 1–4

The average number of TAAG programs exceeded intervention goals for each semester. Ninety-four percent (17 of 18 schools) met the target number of programs in Semesters 1 and 3, while all schools met the target in Semester 2. In Semester 4, 72% met the target number of programs (13 of 18 schools) (Table IV). Average attendance at each program ranged from 11.5 to 18.1 girls. Across all years, sixth grade girls were most likely to attend. Total attendance declined between Semesters 1 and 2 and increased between Semesters 2 and 3. Based on ∼1300 surveys in year 1 and 2000 surveys in year 2, girls rated the physical activity programs as highly enjoyable.

TAAG promotions

In the first intervention year, the major promotional event was a passport challenge targeting seventh grade girls, in which girls received validation stamps in their ‘passports’ for participating in specific kinds of physical activities. Approximately 22% of seventh grade girls participated, which did not meet the intervention goal of 35%. A pedometer challenge was promoted for eighth grade girls in the second intervention year. About 71% of eighth grade girls participated in this event, which met the intervention target of 70%. Girls from intervention schools were significantly more likely than girls from control schools to recognize TAAG promotional messages used in posters and flyers (P < 0.0001) (Table V).

Table V.
Percent of girls reporting exposure to TAAG promotional messages in intervention and control schools at the end of the 2-year intervention


These results are an overview of the comprehensive process evaluation conducted to document the TAAG intervention implementation. School personnel were trained by TAAG interventionists with a high level of fidelity to the protocol and reach approached 100%, with almost all teachers attending intervention trainings. All students were exposed to TAAG PE, which was implemented with moderate to high fidelity. More than three-quarters of the targeted population were taught all TAAG health education lessons. A major thrust of the TAAG intervention was to increase the number of physical activity programs offered for girls, and intervention schools provided more programs than did the control schools. For most schools during most semesters, opportunities for girls to attend physical activity programs before school, during lunch or after school exceeded intervention goals. Eighth grade participation in the pedometer challenge met the goal. TAAG promotional messages were identified by more girls from the intervention schools than from the control schools.

Collaborations with outside agencies doubled in the intervention schools but did not change in the control schools, a clear indication of the success of the TAAG physical activity program intervention component. Unlike previous school-based trials [3–6, 11, 13, 14], TAAG was the first to link schools with communities to provide more opportunities for physical activity. Girls who attended programs overwhelmingly enjoyed them. The process evaluation results clearly indicate that the TAAG approach of providing physical activity opportunities is feasible and acceptable to the girls.

Although some aspects of the intervention were implemented with high fidelity, particularly intervention trainings in which TAAG staff was responsible for implementation, other parts were less completely implemented. TAAG intervention staff was highly motivated to fully implement the intervention to teachers and community workers—it was one of their primary employment responsibilities. On the other hand, teachers and others had competing priorities, such as completing district-required curricula, which may have hindered them from fully implementing the intervention. They also may have had a limited interest in research activities [12]. Implementing some intervention components required them to change their standard teaching practices. For example, providing choice in PE could be perceived as decreasing the amount of control the teachers had over the students in class. These factors can reduce teachers’ motivation to implement an ‘extra’ program, such as TAAG, to its fullest. Thus, the results indicating that fidelity was lower when teachers implemented the intervention were not unexpected.

Even though the teacher-delivered approach is less effective for optimizing fidelity across all intervention components, it is an effective model for maximizing the acceptability and sustainability of standardized interventions. Approximately two-thirds to three-fourths of TAAG health education lesson components were completed by teachers, a percentage similar to that found by Marcoux et al. [25] for Sport, Play and Active Recreation for Kids (SPARK). Also similar to our results, teachers rated the SPARK program favorably [25]. This example underscores the usefulness of taking a comprehensive approach to process evaluation. If only intervention fidelity had been assessed, the intervention might have been deemed ‘not acceptable’ because the lessons were not partially or fully implemented at the predetermined goal (80%). However, we included an assessment of acceptability and learned that the teachers liked the lessons, which is an indicator of continued use [21]. Although fidelity was lower than we would have liked, high acceptability ratings may indicate teacher motivation to sustain intervention programs.

Dose was consistently high across intervention components. The TAAG intervention staff ensured that all school personnel attended trainings and worked closely with the teachers to ensure that TAAG PE was implemented and health education lessons were taught. They also played a major role in ensuring that physical activity programs were implemented at each school. Reach, however, was more variable. While virtually all students were exposed to PE and reach was high for the health education lessons, it was lower for the promotional events and after-school programs. These results suggest that reaching students during the regular school day is more effective than before or after school, when there are competing time priorities.

As measured by process evaluation data, 18 of the 56 specified TAAG intervention goals were completely met over the 2 years. Another 17 goals were within 10 percentage points of being met. Thus, 63% of goals were either met or mostly met. Setting intervention target goals was a difficult process: there was little precedent in the literature to help us determine what levels of dose, reach and fidelity were needed to achieve trial goals. In the end, we chose a combination of what we thought would maximize intervention effectiveness and what seemed reasonable to achieve. For example, we thought it necessary for all PE teachers who taught girls to attend the workshops (i.e. maximize intervention effectiveness). In contrast, we set the goal of 60% of girls completing activity challenges (i.e. reasonable to achieve). Some targets that were not met were those that required the girls to do something outside of their regular school day. It is a continuing challenge to identify programmatic physical activity opportunities that appeal to a diverse group of girls. Future work is needed to determine the optimal dose, reach and fidelity of school-based interventions.
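The goal-attainment percentage above follows directly from the counts reported in the text; a quick check shows where the rounded 63% figure comes from:

```python
# Counts reported in the text; nothing else is assumed.
goals_total = 56       # intervention goals specified over the 2 years
goals_met = 18         # completely met
goals_nearly_met = 17  # within 10 percentage points of the target

met_or_nearly = goals_met + goals_nearly_met
pct = 100 * met_or_nearly / goals_total
# 35 of 56 goals = 62.5%, which the text reports rounded as 63%.
print(met_or_nearly, pct)
```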

Process evaluation issues

Process evaluation is an emerging but important component of intervention research. In order to move the field forward, it is important for researchers to learn from the decisions that others make when designing process evaluation protocols. Foremost, the TAAG investigators struggled with determining the best methods of assessing process evaluation data. This included the issues of using observations versus self-report, information sources and the ability to measure similar ‘intervention-like’ activities in the control schools. We address these issues below.

While evaluating whether to use observations versus self-report to assess dose, fidelity and reach, the TAAG investigators examined the published process evaluation literature. Resnicow et al. [26] compared the use of trained observers and teacher self-report to examine how best to measure implementation of school health curricula. In short, they found that observational data were more valid and reliable than self-reported data. In the TAAG PE and health education components, trained data collectors used structured observations during visits. An advantage is that observers are specifically trained to be able to detect the extent to which the intervention is delivered with fidelity. Although self-reports from teachers may be adequate to assess what lessons were taught, it is not likely that they would be able to accurately report the extent to which components were taught in accordance with protocol guidelines. Another advantage of observations is that they require researchers to be in the schools during the time that the intervention is being implemented. This assists in understanding contextual factors that may influence program implementation that otherwise could go unnoticed and unreported.

However, there can be problems even with using observational data. For example, one component of the health education lessons, the follow-up to the activity challenge, may have been systematically missed because activity challenges were often returned on a day when observers were not present. Combining observations with teacher self-reported data or increasing the number of observation visits may have alleviated these inadequacies. However, availability of trial resources and school burden must be considered when designing process evaluation methods [13]. In retrospect, resources might have been diverted from less productive process evaluation data collection methods and used for additional observations.

Another ongoing issue is determining the source of process evaluation data. Because TAAG was an environmental intervention with school as the unit of analysis, individual girls were recruited only for measurement activities. Study staff did not have permission to monitor individual participation in out-of-class physical activity programs or participation in activity challenges or promotional challenges. This inability to track participation at the student level and link individual exposure to study outcomes resulted in a major limitation. Developing a strategy to link individual participation with outcomes without restricting program participation would be a useful tool for future school-based programs.

Another limitation is the minimal process evaluation conducted in control schools. Annual interviews with principals, teachers and program leaders from both intervention and control schools were conducted, but it was not possible to fully characterize ‘TAAG-like’ programs that may have been occurring in control schools. Others have also struggled with this limitation [2]. Extensive questionnaires and observations would be needed to truly identify the extent to which similar programs occur in control schools, resources that are better used on other trial activities.

In conclusion, process evaluation results indicated that the TAAG intervention was implemented with high levels of reach and fidelity and altered the school environment. School-based interventions are complex, and the TAAG intervention represents an evolution from previous work by linking schools with community groups to effect change. Process evaluation results clearly indicate that changes were made in the intervention schools’ environments that supported physical activity for girls.


National Heart, Lung and Blood Institute; National Institutes of Health (U01HL66858, U01HL66857, U01HL66845, U01HL66856, U01HL66855, U01HL66853 and U01HL66852).

Conflict of interest statement

None declared.


We acknowledge the contributions of Pamela Carr, Derek Coombs, Rachel Cope, Christine Cox, Jewel Harden, Melanie Hingle, JoAnn Kuo, Dale Murrie, Jenny Nadeau and Lakesha Stevens.


1. Israel BA, Cummings KM, Dignan MB, et al. Evaluation of health education programs: current assessment and future directions. Health Educ Q. 1995;22:364–89.
2. Steckler A, Ethelbah B, Martin CJ, et al. Pathways process evaluation results: a school-based prevention trial to promote healthful diet and physical activity in American Indian third, fourth, and fifth grade students. Prev Med. 2003;37:S80–90.
3. Caballero B, Clay T, Davis SM, et al. Pathways: a school-based, randomized controlled trial for the prevention of obesity in American Indian schoolchildren. Am J Clin Nutr. 2003;78:1030–8.
4. Luepker RV, Perry CL, McKinlay SM, et al. Outcomes of a field trial to improve children's dietary patterns and physical activity. The Child and Adolescent Trial for Cardiovascular Health. CATCH collaborative group. J Am Med Assoc. 1996;275:768–76.
5. Lytle LA, Murray DM, Perry CL, et al. School-based approaches to affect adolescents’ diets: results from the TEENS study. Health Educ Behav. 2004;31:270–87.
6. Pate RR, Ward DS, Saunders RP, et al. Promotion of physical activity among high-school girls: a randomized controlled trial. Am J Public Health. 2005;95:1582–7.
7. Nicklas TA, O'Neil CE. Process of conducting a 5-a-day intervention with high school students: Gimme 5 (Louisiana). Health Educ Behav. 2000;27:201–12.
8. Story M, Mays RW, Bishop DB, et al. 5-a-day Power Plus: process evaluation of a multicomponent elementary school program to increase fruit and vegetable consumption. Health Educ Behav. 2000;27:187–200.
9. Helitzer DL, Davis SM, Gittelsohn J, et al. Process evaluation in a multisite, primary obesity-prevention trial in American Indian schoolchildren. Am J Clin Nutr. 1999;69(Suppl. 4):816S–24S.
10. Kimm SY, Glynn NW, Kriska AM, et al. Decline in physical activity in black girls and white girls during adolescence. N Engl J Med. 2002;347:709–15.
11. Sallis JF, McKenzie TL, Conway TL, et al. Environmental interventions for eating and physical activity: a randomized controlled trial in middle schools. Am J Prev Med. 2003;24:209–17.
12. Lytle LA, Davidann BZ, Bachman K, et al. CATCH: challenges of conducting process evaluation in a multicenter trial. Health Educ Q. 1994;(Suppl. 2):S129–42.
13. Sallis JF, McKenzie TL, Kolody B, et al. Effects of health-related physical education on academic achievement: project SPARK. Res Q Exerc Sport. 1999;70:127–34.
14. Young DR, Phillips JA, Yu T, et al. Effects of a life skills intervention for increasing physical activity in adolescent girls. Arch Pediatr Adolesc Med. 2006;160:1255–61.
15. Stokols D. Establishing and maintaining healthy environments. Toward a social ecology of health promotion. Am Psychol. 1992;47:6–22.
16. Elder JP, Lytle L, Sallis JF, et al. A description of the social-ecological framework used in the Trial of Activity for Adolescent Girls (TAAG). Health Educ Res. 2006.
17. Stevens J, Murray DM, Catellier DJ, et al. Design of the Trial of Activity in Adolescent Girls (TAAG). Contemp Clin Trials. 2005;26:223–33.
18. Webber LS, Catellier DJ, Lytle LA, et al. Promoting physical activity in adolescent girls. Trial of activity for adolescent girls. Am J Prev Med. 2008;34:173–84.
19. Baranowski T, Stables G. Process evaluation of the 5-a-day projects. Health Educ Behav. 2000;27:157–66.
20. Steckler A, Linnan L. Process evaluation for public health interventions and research: an overview. In: Steckler A, Linnan L, editors. Process Evaluation for Public Health Interventions and Research. San Francisco, CA: John Wiley & Sons; 2002. pp. 11–21.
21. Thaker S, Steckler A, Sanchez V, et al. Program characteristics and organizational factors affecting the implementation of a school-based indicated prevention program. Health Educ Res. 2007;22:155–65.
22. Young DR, Johnson CC, Steckler A, et al. Data to action: using formative research to develop intervention programs to increase physical activity in adolescent girls. Health Educ Behav. 2006;33:97–111.
23. Murray DM, Stevens J, Hannan PJ, et al. School-level intraclass correlation for physical activity in sixth grade girls. Med Sci Sports Exerc. 2006;38:926–36.
24. Littell RC, Milliken GA, Stroup WW, et al. SAS for Mixed Models. 2nd edn. Cary, NC: SAS Institute Inc.; 2006.
25. Marcoux M-F, Sallis JF, McKenzie TL, et al. Process evaluation of a physical activity self-management program for children: SPARK. Psychol Health. 1999;14:659–77.
26. Resnicow K, Davis M, Smith M, et al. How best to measure implementation of school health curricula: a comparison of three measures. Health Educ Res. 1998;13:239–50.

Articles from Health Education Research are provided here courtesy of Oxford University Press