Teaching teens to "Get Net Smart for Good Health": comparing interventions for an Internet training program *



INTRODUCTION
With their casual acceptance of information on the Internet, high school students are a tough audience for librarians. A detailed study of 12 middle and high school students in 2003 revealed that 77% of the students' searches for answers to health questions led them no farther than a search engine's first links [1]. Another study based on two dozen focus groups (157 teens) reported that students acknowledged lacking the skill to evaluate Internet sites [2]. An anecdotal report from National Public Radio suggested that, even in college, students forgot to verify sources unless asked to do so, and that faculty rarely asked [3]. These habits can become entrenched by adulthood, and, as librarians, educators, and computer systems analysts agree, most Internet users are guilty of taking information on the Internet at face value, even if they know what steps to take to evaluate content [4-6].
Given this challenge, a team of outreach specialists at the Medical University of South Carolina (MUSC) Library in Charleston, South Carolina, experimented for three years with ways to teach high school students to critically evaluate health websites. The team's goal was to develop and compare interventions for teaching students how to evaluate health websites and transfer what they had learned to new situations. Using the results discussed here, the team developed an outreach program, "Get Net Smart for Good Health," that engages students in active learning through a hands-on computer workshop and a role-playing activity called "Cyber Court." A program description and portable document format (PDF) file of the "Get Net Smart for Good Health" curriculum are available online [7]. This paper describes program development and evaluation results for 2005 to 2007.

BACKGROUND
"Get Net Smart for Good Health" is part of a larger outreach effort aimed at residents of South Carolina, a state with one of the lowest high school completion and literacy rates in the nation [8]. In 2002, the MUSC Library created a website for laypeople called Hands on Health-South Carolina, with initial funding from the Duke Endowment [9]. Collaborators in the site's development included the South Carolina State Library, the University of South Carolina School of Medicine Library, and the South Carolina Hospital Association. Hands on Health-South Carolina is a gateway to reputable sites like MedlinePlus, with additional content written for readers with low literacy skills. The site is currently being translated into Spanish. The site is tailored to the needs and interests of South Carolina residents, with special features like "Go Local-SC," which directs readers to local services, many of them free or low cost. Further information about the National Library of Medicine's Go Local initiatives is available at the National Library of Medicine Internet site [10]. The MUSC team uses Hands on Health-South Carolina and MedlinePlus extensively in the "Get Net Smart" program, as well as in workshops for other target audiences.

METHODS
The project involved a combination of structured workshops and hands-on activities for 10th and 11th graders from local high schools, conducted from year 1 (2005) through year 3 (2007). Classes of students enrolled in a health sciences vocational curriculum were selected by their teachers to participate in the program based on their interest in pursuing careers in science and health care. Workshops included a pretest and posttest, structured instruction, a health-related website evaluation exercise, and a satisfaction survey. Workshops were modified in year 2 to include a role-playing component, and an evaluation of the role-playing exercise was added to the program in year 3. The project was approved by the MUSC Institutional Review Board.

Workshops
The workshops involved four different schools over three years, with no students repeating a workshop. The average size of participating high school classes was twelve students, and classes were composed of all tenth graders, all eleventh graders, or a mixture of tenth and eleventh graders. Workshops were held at MUSC, with the exception of those for the tenth grade classes from school A in year 2 and the tenth grade classes from school D in year 3, which were held at the schools because of shortages of funds for transportation costs.

Participants
During year 1, MUSC staff developed the first workshop with the help of a health sciences teacher at North Charleston High School (NCHS) (school A in Table 1) to ensure that the program met state curricular standards. NCHS is an inner-city school, serving a 90% low-income, minority student population. Nearly all the students were African American. One class at each of 2 suburban high schools (schools B and C) with 40% minority students also attended the workshop in year 1. In year 2, students taking the workshop attended 1 school serving 90% low-income, minority students (school A) and 1 school with a 40% minority constituency (school B). In year 3, student participants attended an urban magnet school serving primarily low-income minority students (school D).

Curriculum
At each school, the team conducted a hands-on Internet workshop using the "Get Net Smart" curriculum. Workshops were preceded by brief pretests related to website evaluation (Appendix A online). The curriculum included guidelines on how to evaluate consumer health websites, recommendations of reputable sites, a site evaluation exercise, and a role-playing activity. Students learned to consider six credibility factors: site sponsorship, accuracy of content, author qualifications, editorial policy, privacy protection, and contact availability (Appendix B online). After demonstrating how to evaluate consumer health sites, the presenters allowed students a little time to explore the sites on their own. Students also took a posttest identical to the pretest.
At the end of the workshop, the trainers asked all students to complete an exercise in which they ranked 2 different websites using each of the 6 credibility factors listed above with a score of 1 (very bad) to 5 (very good). Team members assigned each student 2 websites to review from a list of a dozen sites the team put together to reflect typically reputable and objectionable consumer health sites. While the sites differed in content, their adherence or lack of adherence to the credibility factors was comparable. The ideal average score for a reputable or high credibility site was 24-30, and the ideal average score for an objectionable or low credibility site was 6-12. Students also filled out a satisfaction survey assessing their perceptions of the workshop.
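The scoring logic of the rubric can be summarized in a short sketch. This is an illustration written for this article, not part of the study's instruments: the function names and the wording of the band labels are assumptions, while the six factors, the 1-5 rating range, and the ideal score bands (24-30 for reputable sites, 6-12 for objectionable ones) come from the text above.

```python
# Illustrative sketch of the "Get Net Smart" credibility rubric.
# Six factors, each rated 1 (very bad) to 5 (very good), so totals
# range from 6 to 30. Band boundaries mirror those reported in the text.

FACTORS = [
    "site sponsorship",
    "accuracy of content",
    "author qualifications",
    "editorial policy",
    "privacy protection",
    "contact availability",
]

def credibility_total(ratings: dict) -> int:
    """Sum one 1-5 rating per factor into a 6-30 total."""
    assert sorted(ratings) == sorted(FACTORS), "one rating per factor required"
    assert all(1 <= r <= 5 for r in ratings.values()), "ratings must be 1-5"
    return sum(ratings.values())

def band(total: int) -> str:
    """Map a total to the ideal ranges described in the workshop."""
    if 24 <= total <= 30:
        return "consistent with a reputable, high-credibility site"
    if 6 <= total <= 12:
        return "consistent with an objectionable, low-credibility site"
    return "mixed signals; warrants closer review"

# Example: a student rates every factor 5.
ratings = {factor: 5 for factor in FACTORS}
total = credibility_total(ratings)
print(total, "-", band(total))  # prints: 30 - consistent with a reputable, high-credibility site
```

A midrange total (13-23) falls outside both ideal bands, which matches the study's observation that inconsistent ratings signal a student who has not yet internalized the factors.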
During the year 1 evaluation process, a higher than expected change in test scores, combined with the poor comprehension students exhibited on the web exercise, indicated that the pretest/posttest instrument needed redesign. The team decided to replace the short-answer format with a multiple-choice format and introduced a revised instrument in year 2. While the revision included new items, the team calculated multiyear evaluation results using only the two items that were used every year. Following the revision process, the team held the workshop with students at schools A and B in year 2. The pretest/posttest (Appendix A online), website exercise (Appendix B online), and satisfaction survey were again administered. That year, the team experimented with an additional role-playing activity to add an element of active learning to complement the largely didactic workshop [11]. This activity, "Cyber Court," entailed small groups of three or four students preparing a hypothetical courtroom scene to dramatize whether a particular health site was reliable enough to be considered good evidence. Students had to say, in their own words, why a particular website was or was not reliable and why that mattered, given a set of circumstances. As teachers felt that eleventh grade students would be better equipped to take on the role-playing exercise, only eleventh graders participated.
In year 3, the MUSC team held the workshop with and without the role-playing activity at a technology magnet school (school D) with a 90% minority population. The team added a process evaluation of the role-playing activity to the other instruments to see if students found it beneficial. This evaluation consisted of a survey with both Likert-scale and short-answer components (Appendix C online). The team combined student self-reporting with pretest/posttest results and instructor observations of student behavior in the preliminary evaluation reported here.

RESULTS
Year 1
Year 1 results indicated that the 46 participating students were not very engaged. Satisfaction indicators showed low interest: class averages of students' responses to the workshop ranged from 2.75 to 3.7 on a Likert scale of 1 to 5, with a response of 1 indicating "not interested" and a response of 5 indicating "very interested." Students' answers on the website exercise indicated confusion about how to evaluate sites. Students wrote comments like "Get students more involved" and "Give students more time to explore on their own." Some students evaluated sites only according to what they found interesting. A few made astute observations, such as: "This was not a good site. It was flashy but no updated information date was posted" [12]. But in general, students made cursory evaluations and mistook quantity of information for quality.
Results from the pretest and posttest indicated improvement, but because the high rate of improvement failed to coincide with signs of comprehension on the website exercise, and, even with improvement, no more than 69% (11/16) of any 1 class answered questions correctly, the team decided to substantially revise both the workshop and the pretest/posttest (Tables 1 and 2 online). Team members asked students to rank sites according to the 6 credibility factors noted earlier using a score of 1 (very bad) to 5 (very good) for each one. According to year 1 results, the students tended to rate low-credibility sites highly. They generally ranked high-credibility sites with an appropriate number of points, but the range in points awarded to each site in each category was also high (10-14 points), indicating a lack of consistency in the class (Table 2 online). Based on these results, the MUSC team streamlined the workshop to make sure that key issues were reinforced and extraneous content dropped. A multiple-choice format replaced a short-answer format for the pretest/posttest.

Year 2
Interest among students rose from year 1 to year 2, with the average satisfaction score for the 44 participating students rising to 4.0 (scale: 1-5). Pretest/posttest data indicated that, in addition to enjoying themselves more, year 2 students were also better able to evaluate websites after participating in the more focused workshop. Year 2 students were drawn from 5 classes. In 3 of those classes, fewer than 50% of the students (6/21, including almost all the 11th graders) answered the 2 pretest questions correctly. On the posttest, all but 1 class showed a rise in accuracy, with the result that 73% of all the students (32/44) from that year answered all questions correctly. One class had relatively high pretest scores (64%, or 7/11) but only modest improvement (a rise of 27 percentage points, from 7 to 10 of 11 students answering correctly).
The website exercise in year 2 consisted of assessing one site rather than two, allowing students more time to evaluate, and students did a better job of assigning appropriate values to Internet-based health sites. No apparent differences in performance emerged between urban classes (with a high proportion of minority students) and suburban classes. The team observed that students seemed to perform best with access to a well-equipped computer lab and sufficient class time. A two-hour workshop with time for a break seemed optimum [13].

Year 3
In year 3, 52 students in 4 classes attended the workshop. Student interest in workshop presentations averaged 3.7 (scale: 1-5), with 11th graders demonstrating more interest (4.0) than 10th graders (3.4). Fewer than 50% of the students (15/52) answered all 3 questions correctly on the pretest. On the posttest, however, every class more than doubled its rate of correct answers, and 77% of all students (40/52) answered all questions correctly. While the workshops took place at a new school in year 3 (school D), the learning context was similar to that of NCHS (school A in years 1 and 2): the teachers and the health sciences curriculum were comparable, and the students were 10th or 11th graders and 91% (20/22) African American. However, because students attending school D, a magnet school, expressed career interest in technological vocational training, they may have been more highly motivated to learn Internet evaluation techniques.
In year 3, the team also introduced an evaluation for the "Cyber Court" role-playing exercise, in which students defended or disputed the case for using a particular website as evidence in a hypothetical court case. As noted, this exercise was implemented only in 11th grade classes (n=22 students). Most students (73%, 16/22) preferred the combination of workshop and role-playing activity to either one alone, saying that it increased their confidence in their ability to evaluate sites (Table 3 online). This preference was particularly common among students who reported greater understanding, more enjoyment, and higher confidence in their ability to evaluate sites. No students preferred the workshop without role playing, while 27% (6/22) preferred role playing without the workshop.

DISCUSSION AND CONCLUSION
This study had several limitations. Because the MUSC team relied on a small, nonrandomized convenience sample of students, the results cannot be considered representative of the state's high school students. The school populations may have had unidentified differences, despite similarities among schools of the same type; for example, only one of the two urban schools was a vocational magnet school. Some change in level of performance could also be attributed to revisions in the program and the pretest and posttest after year 1, and improvements in years 2 and 3 could also reflect trainers' improved delivery. Some findings in year 3 are based primarily on student self-reporting and trainer observation and hence require validation with more rigorous methods.
Despite these limitations, year 2 and 3 results suggested that ethnicity and type of school had no discernible bearing on students' performance and that an extra year of instruction and maturation did not improve students' abilities to critically evaluate websites. The variability of results suggests that individual students' skills influenced performance and that classroom dynamics (level of collegiality) and time of day (early morning versus immediately before lunch) may also have played a role.
In year 3, students with low pretest scores improved more dramatically than students in year 2 did. In both years, eleventh graders showed less improvement than tenth graders, perhaps indicating that sophomores are more receptive to training than juniors, who are one year closer to leaving high school. Taking time to question students about how they use the Internet outside the classroom, and teachers about classroom dynamics, could improve effectiveness.
Finally, year 1 results, which indicated low engagement with and understanding of the workshop content, emphasized the importance of planning for a test phase in the development of a new program.
This study also contributes to the limited literature on how adolescents and students of the Millennial generation (the first generation raised with access to the Internet) evaluate Internet health sites, by examining average students with variable computer skills [14]. Existing literature typically focuses on self-perceptions and close observation of students who have self-selected based on interest in using the Internet [1,2]. The combination of workshop and role-playing activity in "Get Net Smart" provides the opportunity to test and observe both individual and group uses of the Internet by students who are selected for their interest in health careers, not the Internet per se. Additional research with teens could target specific interests (such as those of teens living with HIV/AIDS) [15] and could incorporate digital advances, such as gaming in virtual environments, to which these students readily respond [16,17].
For librarians, there are several potential benefits to using a program like "Get Net Smart" with small groups of students. Librarians can become hidden by the impersonal nature of communication in the digital age. Outreach activities allow librarians to forge more personal connections with a generation learning to think differently about information and resources [17,18]. Students can gain a better appreciation of the range of services librarians can provide. Also, the potential exists for expanded community involvement when librarians enter the classroom: they can participate in or advise parent-teacher association meetings, school health fairs, after-school programs, adult education classes, and service-learning projects in which students teach Internet evaluation skills to family members and others in their community.
Teenagers will probably continue to take the Internet for granted, but there are positive signs that they can be encouraged to use it safely. This study found that role-playing reinforces students' ability to critically evaluate sites, particularly if it is preceded by intensive and focused skill-building activities for which adequate class time is allowed. Additionally, the study suggests that such training earlier in students' high school careers is more effective. Librarians can play a key role in guiding adolescents' interactions with web-based health information through programs such as "Get Net Smart."