Am J Pharm Educ. Apr 15, 2008; 72(2): 38.
PMCID: PMC2384213

Use of an Audience Response System (ARS) in a Dual-Campus Classroom Environment

Abstract

Objectives

To implement an audience response system that aggregated data across a dual-campus classroom during graded activities (attendance and quizzes) and non-graded activities (formative quizzes, case discussions, examination reviews, and team activities), and to explore its strengths, weaknesses, and impact on active learning.

Design

After extensive research, an appropriate audience response system was selected and implemented in a dual-classroom setting for a required third-year PharmD course. Each student was assigned a clicker, and training and policies regarding clicker use were reviewed. Activities involving clicker use were carefully planned to engage students in both classrooms simultaneously in real time. Focus groups were conducted with students to gather outcomes data.

Assessment

Students and faculty members felt that the immediate feedback the audience response system (ARS) provided was most beneficial during non-graded activities. Student anxiety increased when the ARS was used for graded activities because of fears regarding technology failure, user error, and academic integrity.

Summary

ARS is a viable tool for increasing active learning in a doctor of pharmacy (PharmD) program, especially when used for non-graded class activities. Faculty members should proceed cautiously when using ARS for graded classroom activities and should develop detailed, documented policies for ARS use.

INTRODUCTION

With the decreasing cost of audience response systems (ARSs), many colleges are considering implementing this technology to bring active learning into their large classrooms and increase direct engagement with lecture content.1 ARSs are designed to promote active learning by encouraging student reflection on lecture content and increasing participation in class activities. Active learning occurs when an instructor poses a multiple-choice question to the audience, typically created through a PowerPoint interface. Students select their answers using the ARS response keypads (referred to as "clickers"), and the percentage of students selecting each answer choice is immediately displayed on the PowerPoint slide.2 The instructor and the class see the responses as aggregated results; individual responses are not identified. The displayed results give students immediate feedback about content mastery during the lecture, allowing instructors to remediate or reinforce concepts during the class session without singling out individual students.3 The system can be used in an anonymous mode, or each clicker can be assigned to an individual student using a student list. Overall, teaching with ARSs has met with high approval ratings, enhanced learning, and greater learner alertness.4
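
Although a commercial product handles this in practice, the poll cycle just described is simple to sketch. The following minimal Python illustration (all names are hypothetical; this is not the Turning Technologies API) shows the aggregation step: responses are tallied by option and only percentages are displayed, so no individual student is identified.

```python
# Minimal sketch of the ARS aggregation step described above.
# Names are illustrative only, not the vendor's actual software.
from collections import Counter

def aggregate_poll(responses: dict[str, str], options: list[str]) -> dict[str, float]:
    """Map each answer option to the percentage of respondents choosing it.

    responses -- {clicker_id: option}; only counts are displayed, so the
    aggregated view never identifies an individual student.
    """
    counts = Counter(responses.values())
    total = len(responses) or 1  # avoid division by zero before anyone answers
    return {opt: 100.0 * counts.get(opt, 0) / total for opt in options}

# Example: a 4-option question with 5 responses received so far.
poll = {"k01": "B", "k02": "B", "k03": "A", "k04": "B", "k05": "D"}
print(aggregate_poll(poll, ["A", "B", "C", "D"]))
# {'A': 20.0, 'B': 60.0, 'C': 0.0, 'D': 20.0}
```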

The University of Oklahoma College of Pharmacy recently explored ARS as a way to increase active learning in a live dual-campus environment. The addition of a distant campus challenged faculty members to consider how best to engage a significantly larger number of students on 2 campuses (80 on-site and 60 distant-site students). Increasing active learning among on-site and distant-site students is an important challenge for our institution, as for other pharmacy schools, since Accreditation Council for Pharmacy Education (ACPE) accreditation standards highlight the need for multisite programs to keep learners actively engaged with the content, faculty members, and peers (see guideline 11.5 in the ACPE Accreditation Standards), and the number of pharmacy schools adding distant campuses is increasing.5 Since promoting active learning can be complicated by distant campuses, tools such as ARS may help achieve this outcome. To our knowledge, we are one of the first colleges of pharmacy to use such a system to aggregate student responses from multiple sites. This article explores the results of our program's use of an ARS in graded and non-graded classroom activities while simultaneously merging results from 2 campuses.

The objectives of this innovation were to (1) evaluate general ARS technical issues and the ability of an ARS to simultaneously merge student responses from 2 live campuses separated by distance; (2) identify the strengths and weaknesses of the ARS in 3 graded (summative) classroom activities (individual quizzing, team quizzing, and attendance); and (3) identify the impact on active learning and the strengths and weaknesses of the ARS in non-graded (formative) classroom activities (individual case discussion, individual problem solving, examination/concept review and gaming, and team-based learning exercises such as team problem solving and case discussion).

DESIGN

Selection of an ARS for a multi-classroom distance environment was based on 3 technical requirements. First, the technology had to merge all classroom response data immediately in the synchronous (live) learning environment. Second, the student response devices needed to seamlessly and consistently connect to the system receivers. Third, the ARS needed to be easy for faculty members and students to use. The system that met these 3 requirements was Turning Point 2006 (Turning Technologies, Youngstown, Ohio).

Aggregating student responses from 2 sites with similar class sizes was the first requirement because it would allow faculty members to more effectively engage, manage, and unify all students in both classrooms during real-time activities. Campus-specific results could also be reviewed by faculty members after class ended. Exploring equipment from ARS vendors revealed different mechanisms for merging multiple-classroom response data. We chose dedicated handheld response devices because their transmission and reception proved more reliable than a personal digital assistant (PDA) or a laptop. We ruled out a web site with special coding because of the anticipated difficulties with network connections for, and configuration of, the 140 web-enabled devices. We also ruled out a master/slave receiver/server that relied on a standard telephone line connection.
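
As a rough illustration of this first requirement, the sketch below merges per-campus response batches into the single class-wide tally both classrooms see, while retaining the per-campus breakdown for post-class review. The data shapes are our assumptions for illustration, not the vendor's internal format.

```python
# Hypothetical sketch of merging per-campus response batches in real time,
# while keeping a campus-specific breakdown for review after class.
from collections import Counter

def merge_campuses(per_campus: dict[str, dict[str, str]]):
    """per_campus maps a campus name to its {clicker_id: answer} batch."""
    combined = Counter()
    by_campus = {}
    for campus, responses in per_campus.items():
        tally = Counter(responses.values())
        by_campus[campus] = tally  # kept for post-class, campus-specific review
        combined += tally          # what both classrooms see live
    return combined, by_campus

combined, by_campus = merge_campuses({
    "on-site": {"k01": "A", "k02": "B", "k03": "B"},
    "distant": {"k61": "B", "k62": "C"},
})
print(combined)  # Counter({'B': 3, 'A': 1, 'C': 1})
```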

When selecting the data-receiving equipment (the second requirement), we chose a computer with a remote receiver because it offered a continuous connection that accepted data from response devices at both locations and required only that the faculty member log into the classroom computer to operate the Turning Technologies software. We chose the multiple-receiver system that used a remote Internet connection over hybrid systems (which received handheld response data locally and web-enabled response data remotely) because of concerns about wireless network connectivity.

The third technical requirement was a system that could seamlessly fit into an already technologically complex classroom and be easy for faculty members and students to use. The Turning Point software was chosen because it interfaced with PowerPoint, the presentation software used in the classroom.

After selecting the ARS from Turning Technologies, one course in the doctor of pharmacy program was chosen to pilot test the system. While the learning curve for using ARS technology is low, mastery depends on pedagogical and instructional design issues such as the type, number, and frequency of lecture questions; slide design; and the instructor's ability to use class responses to engage students in discussion (complicated by having 2 campuses) and guide content coverage.1 Because of the instructional skill and knowledge required to use an ARS successfully, a required oncology module was selected for the pilot based on the instructor's extensive teaching experience, documented teaching effectiveness, commitment to active learning, history of using varied activities in lecture, and the level of the learners in the course (third-professional year pharmacy students in the spring semester). The instructor (who was also the course coordinator) determined that the ARS would be used for both graded activities (individual quizzing, team quizzing, and attendance) and non-graded activities (case discussion/problem solving, examination review, and team-based learning exercises). To accommodate these activities, student lists were generated within the system so that answers could be linked to a specific student's clicker, and a team list was generated for team activities. The student and team lists facilitated grading and were used in lieu of the anonymous mode, which prevents linking a student to a clicker and its responses.
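
The roster bookkeeping amounts to a simple lookup structure, sketched below with illustrative field names only: a student list links each clicker to a student for grading, a team list groups clickers for team activities, and anonymous mode simply drops the linkage.

```python
# Sketch of the student/team list bookkeeping described above.
# All field names are hypothetical, not the system's actual schema.
from dataclasses import dataclass

@dataclass
class Registration:
    clicker_id: str
    student_name: str
    team: str

roster = [
    Registration("k01", "Student A", "Team 1"),
    Registration("k02", "Student B", "Team 1"),
    Registration("k61", "Student C", "Team 2"),
]

def attribute(clicker_id: str, roster: list[Registration], anonymous: bool = False):
    """Return who (and which team) a response belongs to, or None in anonymous mode."""
    if anonymous:
        return None  # anonymous mode: responses are tallied but never linked
    for reg in roster:
        if reg.clicker_id == clicker_id:
            return reg.student_name, reg.team
    return None  # unregistered clicker

print(attribute("k02", roster))        # ('Student B', 'Team 1')
print(attribute("k02", roster, True))  # None
```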

To launch the ARS in the oncology course, technology training for the students was delivered during a special class meeting with 4 main agenda items: clicker distribution, ARS policy review, discussion of how the ARS would be used in class, and ARS technology training. To supplement the training and facilitate implementation, policies were developed and documented in the course syllabus. The clicker distribution policy specified that clickers would be distributed at the technology training session and that students were accountable for bringing them to every class session, because distributing clickers at each session would have been time consuming. This policy also set the penalty for lost clickers (students paid $50 for a replacement) and forgotten clickers (students received a zero for that day's clicker activities), and alerted students to the possibility of random clicker checks during the semester to promote academic integrity. Testing policies were also developed that covered clicker protocol during testing and the handling of suspected miskeyed answers.

After the semester was completed, 2 separate focus groups were held, 1 for on-site students and 1 for distant-site students. Of the 121 students enrolled in the course (76 on-site and 45 distant-site), 10% from each campus (8 on-site and 5 distant-site) were randomly selected to participate in live focus groups to discuss their perceptions of the strengths and weaknesses of the ARS technology. The 2 focus group sessions were conducted by a faculty member not involved in the implementation process, to reduce bias.

ASSESSMENT

Results for the first objective revealed that the ARS implementation was successful. The instructor used the system easily, creating and displaying slides, and was able to add slides during the lecture without technical difficulty, although slides prepared in advance were used more frequently. Students quickly learned how to use the clickers, as evidenced by 100% clicker response rates to questions asked in class. Students were also able to immediately see the class's aggregated results, which facilitated classroom discussion and unified the 2 campuses. In interviews, the instructor reported enjoying the increased class participation, especially from students at the distant site, and considered this unification and increased ability to manage both live campuses a significant benefit of the system. Of the 8 students from the on-site classroom randomly selected to participate in a focus group, 4 participated; of the 5 students selected from the distant site, 3 participated. Because no campus-specific issues were raised and student comments were similar across sites, the focus group results were merged. One technology-related issue raised in the focus groups was clicker design: students were distracted by the number of unused buttons. The clicker had 12 buttons, but the instructor wrote multiple-choice questions with only 4 options, so 8 buttons went unused; this disrupted classroom decorum at times, as students would ask, "What is this button for?" and experiment with the extra buttons while waiting for the entire class to respond. Relatedly, students were concerned that they could not select more than 1 option. Since selecting multiple options is a system feature, the instructor could create questions in future course offerings that require students to select multiple responses.

Two issues raised during student focus groups pertaining to the use of the ARS in graded activities (objective 2) were test anxiety and academic integrity (see Table 2 for detailed faculty and student perceptions of the disadvantages of ARS). The first concern was test anxiety: students feared that the clickers would fail to register their answers or would register them incorrectly, negatively affecting their grade. Although a green indicator light glowed on the clicker when a response was received, a "responses received" display was visible to the instructor and all students, and the instructor left classroom polling open until all students responded (prompting students whenever a 100% response rate was not obtained), students remained concerned about potential technical failures. Although most students never experienced mis-recorded or unrecorded answers, 2 students did have malfunctioning clickers (confirmed by direct faculty observation and system testing). Approximately 7 students reported on separate occasions that they clicked one answer but a different answer was registered; the veracity of these complaints could not be verified, so the recorded answers were used. As a result, many students remained skeptical of the technology's accuracy because they could not verify their answers as they could on a Scantron form. Students were also unable to review previous questions and change answers as they could on a paper test: all students had to answer each question at the same time, and submitted answers were final. Relatedly, multiple test versions were not feasible. New ARS keypads are being designed with a small screen that will allow students to review their responses, which may alleviate this concern in the future. With the current keypad, however, the instructor had to build several non-graded technology "testing" situations for attendance and quizzes into the beginning of the class to ensure that all clickers were functioning properly and that students were comfortable with the process, which was time consuming. The instructor also considered a paper back-up system but decided that this defeated the purpose of electronic testing; focus group results likewise revealed student disinterest in a paper back-up system.

Table 2
Perceived Disadvantages of ARS Use

The second area of concern raised by student focus groups was academic integrity. Since only 1 version of the quiz was available and the class completed each question at the same time, students could potentially see which buttons other students were pressing. A more serious problem was that 1 student could possess more than 1 clicker, allowing that student to check in for attendance and/or complete quizzes for multiple students. The instructor addressed this issue by informing the class that a head count would be taken at each class meeting to reconcile against the number of clicker responses received, and that discrepancies would result in a class-wide penalty.
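
The reconciliation itself is a one-line check, sketched here under the assumption that the system can report the set of distinct clickers that responded: more responding clickers than students counted in the rooms flags a possible integrity problem.

```python
# Sketch of the head-count reconciliation policy described above:
# compare students physically counted against distinct responding clickers.
def reconcile(head_count: int, responding_clickers: set[str]) -> bool:
    """Return True if counts agree; False flags a possible integrity issue."""
    return len(responding_clickers) <= head_count

# 75 students counted, but 77 distinct clickers responded -> discrepancy.
print(reconcile(75, {f"k{i:03d}" for i in range(77)}))  # False
```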

The third objective of this pilot was to use the ARS in non-graded (formative) activities (individual case discussion, individual problem solving, examination/concept review and gaming, and team-based learning exercises such as team problem solving and case discussion) and explore the impact on active learning (see Table 3 for detailed faculty and student perceptions of the benefits of ARS). For class activities, ARS increased student participation, which was facilitated by the instructor delaying discussion until the entire class had responded, as monitored on the "responses received" display. While the amount of participation could be confirmed, it was difficult to determine how "actively" students engaged with the questions. However, reviewing students' answers in class did not reveal a random pattern of wrong answers, suggesting that students were not simply pressing buttons at random: the activities engaged students and motivated them to respond deliberately. Student focus groups reported that they were most satisfied with the ARS for non-graded class activities and that they took these activities seriously, as opportunities to increase their learning.
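
We judged the answer patterns informally, but the reasoning can be made concrete. As an illustration only (not an analysis the study performed), a chi-square goodness-of-fit test compares a question's observed answer distribution with the uniform distribution that pure random clicking would produce:

```python
# Illustrative only: under pure random clicking, a 4-option question should
# draw roughly equal counts per option; a heavily skewed distribution
# rejects that null hypothesis.
from scipy.stats import chisquare

observed = [9, 102, 12, 17]    # hypothetical counts for options A-D, 140 students
stat, p = chisquare(observed)  # default null: uniform across the 4 options
print(f"chi2 = {stat:.1f}, p = {p:.2g}")
# A tiny p-value means responses are far from uniform, ie, students were
# answering deliberately rather than pressing buttons at random.
```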

Table 3
Perceived Benefits of ARS Use

Among the learning activities (ie, post-lecture reviews, pop quizzes, and team-based learning exercises), focus group participants most enjoyed using ARS with team-based learning. Students also liked ARS for formative post-lecture quizzes, feeling it was a good way to "test" their attention to and comprehension of lecture content; consequently, students were actively reminded of where they needed to review course content. One participant stated that ARS honed her studying by giving her feedback on how she was doing: "I've just got a zero out of the three questions…I need to go back and study that section." Participants also advocated using ARS for pre- and post-lecture quizzes on new concepts to ensure that students focused their attention and comprehended the concepts. Focus group responses revealed that ARS also increased preparation for class and engagement with course content as a result of the peer comparison afforded by the display of class results. One student described answering a question incorrectly that the majority of her classmates answered correctly; although she was not publicly identified, the experience motivated her to study more to avoid the embarrassment of answering incorrectly in future ARS activities. The student focus groups reported favoring the opportunity for peer comparison.

Overall, there was a consensus among participants that students preferred ARS for formative quizzes over summative graded quizzes: the formative quizzes reduced student anxiety and helped reinforce learning. "It (ARS) offers immediate feedback; you get your answer and if you're wrong then there's no punishment for it…it's just when there are grades relying on it that it gets a little scary."

DISCUSSION

The use of ARS for classroom teaching in a dual-campus environment yielded 3 outcomes for institutions with 1 or multiple campuses considering implementation of this system. First, the ARS simultaneously merged and broadcast results from both campuses during graded and non-graded activities. Students reported learning the system easily, despite some concerns with the overall clicker design. The instructor indicated that system implementation and training were manageable and were facilitated by the documented ARS policies in the course syllabus and the special class meeting launch. The instructor also reported that learning the system was easy, although designing case question slides that remained legible was a concern; to reduce these design problems, the instructor rarely created question slides during class. Instead, the majority of questions were prepared in advance of the lecture and focused on (1) reviewing vital course content, (2) revealing gaps in student understanding, or (3) reviewing concepts that had confused students in previous years of the course. One limitation the instructor discovered was that students' answers saved in the ARS database could not be uploaded directly into the Blackboard grade book (although that feature may become available), requiring extra, time-consuming downloading and input steps by the instructor. Overall, the ARS technology worked well, met the technical requirements, and fit seamlessly into the course for students and instructor.
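
This manual transfer step is scriptable in principle. The sketch below assumes hypothetical CSV layouts for both systems; neither reflects an actual Turning Technologies export nor an actual Blackboard import format, both of which would need to be verified against the products themselves.

```python
# Hedged sketch of the manual grade-transfer step described above.
# Both CSV layouts are assumptions made for illustration.
import csv

def ars_to_blackboard(ars_csv: str, bb_csv: str) -> None:
    """Reshape an ARS score export into a grade-book upload file."""
    with open(ars_csv, newline="") as src, open(bb_csv, "w", newline="") as dst:
        reader = csv.DictReader(src)                 # assumed columns: clicker_id, student_id, score
        writer = csv.writer(dst)
        writer.writerow(["Username", "Quiz Score"])  # assumed grade-book header
        for row in reader:
            writer.writerow([row["student_id"], row["score"]])

# ars_to_blackboard("ars_quiz_export.csv", "blackboard_upload.csv")
```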

Second, ARS was used in 3 graded (summative) activities: individual quizzing, team quizzing, and attendance. Perhaps the greatest appeal of ARS is the ability to take attendance and offer quizzes. However, the speed of offering, collecting, and compiling attendance and quiz responses competes with the time needed to design a process that upholds academic integrity and to either distribute clickers at every class or verify possession of assigned clickers, which could negate the time benefits of this technology. Many faculty members want to use ARS primarily for attendance and quizzes, and the literature reports how ARS streamlines this process.1 However, failure to consider academic integrity when using ARS for graded activities is detrimental to a course. Students and faculty members at our college report that ARS is not an ideal tool for graded activities such as attendance and quizzes. While ARS appears to present an easier way to administer quizzes and take attendance, it requires a detailed syllabus with specific policies addressing the items above, as well as attention to examination proctoring and security. Therefore, faculty members with strong proctoring skills, refined testing policies and procedures documented in course syllabi, and access to multiple proctors to verify that each student possesses only 1 clicker may be better candidates to use this system for attendance and graded quizzes. The inability to offer multiple examination versions, a consequence of the lock-step pacing of test items, added to the academic integrity threats. Beyond academic integrity concerns, students also reported increased anxiety during graded activities due to fears of technology failure or user error. Faculty members should proceed cautiously when using ARS for graded classroom activities and should develop detailed, documented policies addressing these limitations (see Table 1 for areas to consider for successful ARS implementation and use). New clicker software may address some of these limitations and academic integrity issues.

Table 1
Areas to Consider for Successful Implementation/Use of Audience Response Systems (ARS)

Third, students and the instructor favored the system for non-graded, formative lecture activities such as informal quizzing, team problem solving, examination/concept review, gaming, and case discussion because it increased active learning by raising students' awareness of their learning (or lack thereof) through immediate ARS feedback and subsequent instructor content reinforcement and discussion. The instructor liked the increase in active learning and classroom participation and the ability to quickly ascertain the class's understanding of the topics presented. Students liked the increased discussion of the answers to questions posed to the class. Since a majority of the class responded to content questions, the instructor could better judge whether the majority needed additional and immediate topic review or whether only a few students might benefit from further discussion outside class time. Without ARS, an instructor may not recognize which concepts need additional and immediate review (missing valuable teachable moments) until after the lecture has ended or after an examination; although the instructor could clarify or readdress the concept at the next class period or by e-mail, the students felt this might be too late. ARS also allowed faculty members to better gauge when a topic discussion should advance to the next topic.

Related to increased class discussion, the instructor noted one additional benefit: ARS increased student-faculty contact on both campuses compared with previous non-ARS classes, in which the distance technology compromised a faculty member's ability to read distant-site students' nonverbal cues, such as cues signaling confusion. Even though ARS did not improve faculty members' ability to read nonverbal cues, it offered objective, real-time feedback about students' comprehension of course content. Students also felt that the system enhanced learning because they could compare their understanding to that of the entire class. In traditional settings, students who answer a question incorrectly may not know whether a majority of the class also answered incorrectly; ARS reveals a compilation of class thinking and offers instructors the opportunity to reinforce understanding and remediate misinformation. Students reported that this peer comparison prompted them to seek further remediation through self-study or instructor assistance. These results suggest that ARS can enhance learning, although future studies should better quantify the actual impact on learning.

Overall, ARS is an appealing tool for non-graded (formative) activities, but for graded (summative) activities academic integrity issues limit the system's appeal. For any activity, faculty members should consider meaningful implementation of ARS throughout the class rather than sporadic use, and should capture student feedback about the strengths and weaknesses of its use so the tool can be integrated as a long-term solution for increasing active learning in the classroom. Future studies should gather input about student perceptions of and satisfaction with ARS from a larger number of students, since the small focus groups are a limitation of the current study. Future studies should also explore more quantifiable outcome measures related to active learning, such as the number of times ARS results necessitated further teaching of a concept, since the limited number of quantifiable outcome variables is another limitation of the current study.

SUMMARY

Overall, the use of new ARS technology in a dual-campus classroom increased faculty members' ability to actively engage on-site and distant-site students with course content. The technology appeared to engage students and motivate them to participate actively in answering ARS questions; it made the majority of students' thinking visible, offered faculty members timely feedback about student learning, increased student-faculty contact on both campuses, and allowed faculty members to deliberately reinforce important content or review confusing concepts in a timely fashion. Observation of ARS activities revealed that students adjusted to the new technology. Although a separate focus group was conducted for each campus, the groups yielded similar results, revealing that timely instructor feedback, discussion, remediation, and peer comparison increased students' learning. Although issues arose with the use of ARS in graded activities, future investigation will explore the impact of new clicker software on graded and non-graded activities, faculty teaching, academic integrity, and student learning outcomes.

REFERENCES

1. Rodgers ML, Starrett DA. Calling all students, come in, students. National Teaching and Learning Forum. 2006;15(5):34.
2. Streeter JL, Rybicki FJ. Education techniques for lifelong learning: A novel standard-compliant audience response system for medical education. Radiographics. 2006;26:1243–9.
3. Menon AS, Moffett S, Enriquez M, Martinez MA, Dev P, Grappone T. Audience response made easy: Using personal digital assistants as a classroom polling tool. J Am Med Inform Assoc. 2004;11:217–20.
4. Robertson LJ. Twelve tips for using a computerized interactive audience response system. Med Teach. 2000;22:237–9.
5. Accreditation Council for Pharmacy Education. Accreditation Standards and Guidelines for the Professional Program in Pharmacy Leading to the Doctor of Pharmacy Degree; 2007. http://www.acpe-accredit.org/pdf/ACPE_Revised_PharmD_Standards_Adopted_Jan152006.DOC. Accessed August 8, 2007.
