Flipping the classroom to teach systematic reviews: the development of a continuing education course for librarians

Objective: The researchers used the flipped classroom model to develop and conduct a systematic review course for librarians. Setting: The research took place at an academic health sciences library. Method: A team of informationists developed and conducted a pilot course. Assessment informed changes to both course components; a second course addressed gaps in the pilot. Main Results: Both the pilot and subsequent course received positive reviews. Changes based on assessment data will inform future iterations. Conclusion: The flipped classroom model can be successful in developing and implementing a course that is well rated by students.

This study describes the current state of Canadian university health sciences librarians' knowledge about, training needs for, and barriers to participating in systematic reviews (SRs). A convenience sample of Canadian librarians was surveyed. Over half of the librarians who had participated in SRs acknowledged participating in a traditional librarian role (e.g., search strategy developer); less than half indicated participating in any one nontraditional librarian role (e.g., data extractor). Lack of time and insufficient training were the most frequently reported barriers to participating in SRs. The findings provide a benchmark for tracking changes in Canadian university health sciences librarians' participation in SRs.


INTRODUCTION
As expert searchers with backgrounds in information retrieval and organization, health sciences librarians can add value to systematic review teams. To prepare librarians for leadership roles on such teams, the authors developed an intensive dual-mode workshop that combines instruction in best practices with a capstone project emphasizing institution-specific application of the acquired skills.
Expert searching has long been acknowledged as a core function of health sciences librarians [1,2], and the perceived value of librarian-conducted expert search activities has been documented [1,3]. Librarian involvement in complex searches, including systematic reviews and meta-analyses, is a natural extension of experience with mediated expert searches. In addition to designing and constructing thorough and replicable searches, librarians can add value to systematic review teams by forming answerable research questions, identifying information resources, collecting and managing search results, and writing descriptive methodologies [4]. In their recent survey of emerging roles for health sciences librarians, Crum and Cooper reported that 46% of respondents already supported systematic reviews and an additional 18% planned to become involved in the near future (n=258) [5]. One of the broad barriers to assuming the emerging roles that Crum and Cooper identified was lack of knowledge or skills [5].
To address this need, a project team from the University of Michigan's (UM's) Taubman Health Sciences Library (THL) developed a pilot course to train librarians to participate on systematic review teams. The pilot course was funded by the Greater Midwest Region (GMR) of the National Network of Libraries of Medicine (NN/LM) in spring 2013. Funding supported the team's time for course development and costs of course materials. Participants who completed all online and in-person components of the pilot course earned twenty Medical Library Association (MLA) continuing education (CE) credits.

COURSE DEVELOPMENT
The pilot consisted of both an online learning environment and an intensive in-person workshop, and was developed utilizing the "flipped" model of instruction. In this model, didactic materials (generally, lectures or readings) are available to students in advance of the course meeting, which frees up classroom time for discussions or interactive exercises that support active learning [6]. Flipped classrooms are gaining increasing recognition in academic settings, including the health sciences, as a means of developing critical thinking skills and fostering assimilation and application of course material [7]. In this course, didactic materials were presented online prior to the in-person sessions, which afforded participants ample opportunity to process and digest information before entering the workshop environment, where the focus shifted to application. The overall goals of the course were to increase participants' knowledge of best practices in conducting systematic reviews and to facilitate participants' creation of a personalized action plan to establish their libraries as recognized centers of expertise for systematic reviews at their home institutions.
Three informationists with experience conducting and coauthoring systematic reviews collaborated to develop and produce course content. The process of developing and conducting the course is summarized in Table 1. Modelling the course's organization on a workshop that the University of Pittsburgh's Health Sciences Library System conducts semi-annually, the team scripted brief (about twenty minutes each) modules and created learning activities for both the online and in-person sessions. Three external peer reviewers provided feedback on the course outline, draft slides, and draft scripts for the online modules. After incorporating reviewer feedback, the team recorded the videos using Camtasia and developed a web platform to host course content.
Google Sites was selected for the pilot to leverage both users' familiarity with Google products and the ease of set up and use. Like many content management systems, Sites provides templates and ready-made structure for content that obviated the need to spend large amounts of time coding. Additionally, Sites offered easy integration with other products, including Google Drive and YouTube, which were used for course delivery and participation. Course modules were hosted on YouTube, and online assignments utilized Drive products including the spreadsheet, word processor, and presentation applications.
While Sites enabled the creation of a functional learning platform, many planned course attributes were not completely realized due to product limitations. First, implementing online office hours was challenging. While Google Hangouts would have enabled simultaneous video-chatting with multiple users, participants would have received email invitations for every scheduled chat, and these frequent alerts would have been frustrating for users. Embedded chat widgets were unsuitable as there was no way to archive chats and the widget deleted content when the chat reached a certain length. Ultimately, a side-by-side Google Form and Spreadsheet were used, which allowed both real-time question-and-answer interactions and a full archive of all information exchanged.
In addition to the above limitations, several unexpected issues with Google Sites posed difficulties. User permissions were difficult to navigate as participants with personal Google accounts had different access than those with institutional email accounts. Additionally, firewalls and browser incompatibilities led to imperfect web interfaces that decreased the overall effectiveness of the site for some users.

Pilot course
The cohort for the pilot workshop included fourteen health sciences librarians from six universities, colleges, or academic medical centers in Michigan and Illinois, and two Detroit-area teaching hospitals. Didactic course materials were presented online in an asynchronous, on-demand format and included nine prerecorded lectures and related online activities. Participants were given access to the full course platform, including all lectures, slides, additional readings, and activities, two weeks prior to the in-person workshop.
The two-day, in-person workshop reinforced core concepts from the online modules via brief lectures, individual and small-group activities, group discussion, and question-and-answer sessions. Sample activities included role-playing exercises to handle difficult team dynamics, hands-on practice assessing research databases and use of filters, evaluation of sample methods sections of published reviews, and a discussion of data-management and reporting standards.
A unique component of the course was the capstone project, a personalized action plan tailored to the unique needs, missions, organizational goals, and resources of the librarians' home institutions. To create the action plan, participants considered individual strengths or areas for growth pertaining to the systematic review process, mechanisms for establishing expertise at their home institutions, identification of institutional champions to help facilitate systematic review efforts, development of a promotional plan, and establishment of personal goals for both three- and six-month time frames. The personal action plan was a critical element of the course, enabling participants to amalgamate the full learning experience into a customized and tangible outcome.

Evaluation, assessment, and response of the pilot course
Deploying a multi-pronged approach to assess learning outcomes and student satisfaction allowed the team to gather a wide range of data to influence future iterations of the course.

Core assessment instruments

The methodology for data coding and analysis was influenced by grounded theory, which "generates new theory from data, as opposed to testing existing theory" [8]. Using the grounded theory approach to coding free-text assessment data, including responses to the MLA course evaluation, the online survey, and the focus group notes, facilitated identification of main themes. Three informationists examined data and categorized content by labelling emerging themes. One informationist analyzed the independently coded data and grouped results into main themes: technology, instruction, hybrid delivery, content, and activities.
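The tally step of this kind of open coding, in which independently coded free-text comments are grouped under emergent theme labels and counted, can be sketched in a few lines of Python. The comments and theme assignments below are hypothetical illustrations, not actual course data:

```python
from collections import Counter

# Hypothetical coded responses: each free-text comment has been
# labelled with one or more emergent themes by a coder.
coded_responses = [
    {"comment": "The videos buffered constantly", "themes": ["technology"]},
    {"comment": "Loved the mix of online and in-person", "themes": ["hybrid delivery"]},
    {"comment": "Role-play felt rushed", "themes": ["activities", "instruction"]},
    {"comment": "More examples of finished searches, please", "themes": ["content"]},
]

# Tally how often each theme was applied across all coded comments.
theme_counts = Counter(
    theme for response in coded_responses for theme in response["themes"]
)

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```

In practice the themes themselves emerge from repeated passes over the data rather than being fixed in advance; the counting shown here only summarizes coding that human analysts have already done.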
Both quantitative and qualitative assessments were overwhelmingly positive: almost all participants graded the course "A." A recurrent theme across the multiple assessments was that the hybrid structure of the course was enjoyable and facilitated and strengthened learning. When asked in the online survey about the course structure, 100% of participants "agreed" or "strongly agreed" that the online and in-person course components facilitated learning. Additionally, the project team administered a six-month follow-up survey (Appendix C, online only) to understand how the course impacted participants professionally. The greatest impact at the six-month point was that respondents (n=11) chose to continue learning about systematic reviews. Five of the respondents indicated that the action plan helped them, and four respondents described how the course helped them with their search skills and confidence.
While feedback from the various instruments indicated high levels of student satisfaction, some participant comments in the data, as well as instructor self-assessment, noted a few key areas for course improvement. Identified areas included changing the course management platform, adjusting the timing and variety of activities, and presenting more examples of project deliverables, such as completed searches or citation reports.
Following a course debriefing session and analysis of all assessment data, the project team began to prepare for future iterations. Materials were modified to include more content related to data management, reporting, and documentation. Additionally, there were significant changes to both the online course and assessment plan.
The first significant change was retooling a group project that had served as a final activity for the online section of the pilot course. In this activity, participants worked in small groups to develop brief summaries of an online learning module, which were presented in the first session of the in-person workshop. The intention was that the group work would create community among the participants, facilitating an effective transition from the online to in-person environment, as well as serving as a "refresher" of online content prior to beginning the in-person workshop. Unfortunately, participation issues created complications for both participants and the instructors, and this activity was the lowest-ranked component of the pilot course. As a result, the instructors changed the online group project to an independent exercise and replaced the in-person group report-out session with an introduction activity.
The most significant change was to the course platform. The instructors felt that the pilot website, while functional, had considerable room for improvement in the areas of real-time communication, user permissions, and browser compatibility. The decision was made to migrate to a new platform, and an instructional designer was added to the project team. He compared five platforms, focusing on user permissions, ease of navigation, ability to embed objects, and browser compatibility. PBworks (http://www.pbworks.com), a wiki tool that allows simple editing and linking between pages, was ultimately selected and deployed using UM's campus license.
The assessment plan was also modified: while the MLA evaluation was still used to certify CE credits, the new course assessment focused on perceived knowledge gained. After consulting with assessment experts at UM's Center for Research on Learning and Teaching, the instructors created a single post-course survey that asked participants to rate their knowledge in nine areas both before taking the course and afterward. This survey captures two time frames in one instrument and reduces the chance of response-shift bias.
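A retrospective pre/post instrument of this kind yields paired before/after ratings, which are typically summarized as a mean perceived gain per knowledge area. A minimal sketch of that summary step, with hypothetical knowledge areas and ratings on a 1-5 scale (not the actual survey data):

```python
# Hypothetical retrospective pre/post survey responses: for each
# knowledge area, each participant rates their knowledge "before"
# and "after" the course (1 = novice, 5 = expert).
responses = {
    "comprehensive searching": [(2, 4), (1, 4), (3, 5)],
    "writing methods sections": [(1, 3), (2, 4), (2, 3)],
    "reporting standards": [(2, 3), (1, 3), (2, 4)],
}

def mean_gain(pairs):
    """Average (after - before) rating change across participants."""
    return sum(after - before for before, after in pairs) / len(pairs)

for area, pairs in responses.items():
    print(f"{area}: mean perceived gain {mean_gain(pairs):+.2f}")
```

Because both ratings come from a single sitting after the course, participants apply one consistent internal scale to both time points, which is what reduces response-shift bias relative to separate pre- and post-course surveys.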

Post-pilot course
The first post-pilot offering of the course was held in April 2014. The cohort included librarians from a variety of library types and geographic locations, and with varied levels of experience with systematic reviews. Experience or involvement in the systematic review process ranged from early career novices to highly experienced participants with multiple coauthorships. The varied levels of experience challenged the course instructors to adequately address the diversity of comfort levels with systematic reviews, literature search skills, and course expectations. Unlike the pilot cohort, most participants did not know each other beforehand, which increased the value of the in-person workshop as participants were able to network with new colleagues.
This course was also a success: thirteen of fourteen participants graded the course an "A" on the standard MLA evaluation, and the protocol example and action plan were identified as useful takeaways. Other highlights of the workshop included discussions of best practices in literature searching versus the challenges presented by resource access, time constraints, and team dynamics, and of how to balance sensitivity, specificity, and reality when dealing with large result sets. Changes made in response to pilot feedback, which increased the focus on search construction and resource features, were less successful because of the cohort's varying levels of search expertise: some participants wanted more hands-on practice, while others were already highly proficient in search construction and execution.
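The sensitivity/precision trade-off raised in these discussions can be made concrete with a small calculation: against a gold-standard set of known relevant records, a search's sensitivity (recall) is the fraction of those records it retrieves, and its precision is the fraction of retrieved records that are relevant. A minimal sketch with hypothetical record identifiers, purely for illustration:

```python
# Hypothetical record IDs (e.g., database accession numbers).
retrieved = {"r1", "r2", "r3", "r4", "r5", "r6", "r7", "r8"}
gold_standard = {"r2", "r5", "r9", "r10"}  # known relevant records

hits = retrieved & gold_standard  # relevant records the search found

sensitivity = len(hits) / len(gold_standard)  # 2 of 4 relevant found
precision = len(hits) / len(retrieved)        # 2 of 8 retrieved relevant

print(f"sensitivity (recall): {sensitivity:.2f}")
print(f"precision: {precision:.2f}")
```

Broadening a strategy (more synonyms, fewer limits) raises sensitivity at the cost of precision, which is exactly the tension between a comprehensive search and a result set that a review team can realistically screen.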
In spite of the challenges of meeting the needs of a cohort with diverse levels of experience, results from both the redesigned post-course survey and the MLA evaluation form indicated growth in participant learning and high levels of student satisfaction. Additionally, the online survey indicated learning, especially in areas related to writing methods sections, reporting systematic reviews, and understanding the importance of comprehensive searching.

NEXT STEPS AND CONCLUSION
Given the positive assessments of both courses and ongoing demand for formal training in best practices for systematic reviews, additional sessions have been scheduled for winter 2014 and spring 2015. Prior to these courses, the team will address technical issues, including enhancing online office hours (ideally with a platform-native, embedded chat tool) and providing instructors the opportunity to comment on specific text in online assignments. Additional content will also be developed. Finally, a tutorial to help participants submit online learning activities will be added to the YouTube playlist for the online course. Moving forward, the team will continue to adapt course content to both the systematic review landscape and the diversity of future cohorts.
This course prepares librarians to understand the role of systematic reviews in evidence-based health care and provides training in conducting an exhaustive and reproducible literature search, documenting the search process, and delivering organized and complete results. Additionally, the development of a personalized strategic plan prepares librarians to promote their skills in systematic reviews in their home institutions. The flipped classroom model was successful and popular with students, and the authors believe this course provides librarians with valuable skills for participating on systematic review teams.

INTRODUCTION
Librarians working in university settings typically operate under the philosophy and practice of teaching researchers or practitioners to conduct their own comprehensive literature searches, acting as a resource for guidance on database selection, search strategy development, and so on. The exception is systematic reviews (SRs), in which librarians' expertise in literature searching is essential for a comprehensive and replicable search of the existing literature, as indicated by the Canadian Institutes of Health Research and the Institute of Medicine [1,2].
With any significant change to or philosophical shift in professional practice resulting from changing demand, issues and challenges need to be identified and addressed by university health sciences library administrators, librarians, library schools, and library associations. Crum and Cooper's recent findings on