Characteristics and Content of Medical Library Tutorials: A Review



INTRODUCTION/BACKGROUND
Online tutorials can be a useful facet of a library's instructional strategies. According to the Instructional Technologies Committee of the Association of College & Research Libraries (ACRL), web tutorials should include interactive exercises such as simulations or quizzes [1]. These activities encourage active learning and allow students to respond to what is taught, while self-assessing their own learning. Web tutorials should also provide a way to contact a librarian for questions or to give feedback about the tutorial's design or usefulness [1].
While previous studies have looked for examples of active learning in tutorials, they did not focus on academic medical libraries [2,3]. Dewald analyzed 20 tutorials (19 for post-secondary education; 1 for kindergarten-8th grade students) selected by the Research Committee of the Library Instruction Round Table of the American Library Association [3]. Hrycaj examined 65 tutorials created by member libraries of the Association of Research Libraries (ARL) [2]. Both studies emphasized the importance of including active learning in tutorials. Examples of active learning described in these articles include quizzes at the end of tutorial modules, questions integrated into the tutorial modules, exercises used in tutorial modules, quizzes requiring the use of separate browser windows, or options for sending quiz results to an instructor [2]. Dewald's 1999 study found that 37% of the tutorials included active learning features, and Hrycaj's 2005 study found that 60% of the tutorials contained some element of active learning.
The purpose of the current project was to identify and analyze freely available online tutorials created by medical libraries. The project team was interested in identifying the topics of tutorials created by medical libraries, determining common design features used in tutorials, and assessing elements of active learning in the identified library-created tutorials. The team also generated a list of third-party tutorials to which libraries link.

METHODS
Using the list of the Association of American Medical Colleges' member schools, the team identified websites for 124 academic medical libraries in the United States, which served as the review subjects [4]. (Supplemental Tables 1, 2, 3, 4, and 6 and an appendix are available with the online version of this journal.) The project team divided this list so that each team member reviewed 31 medical library websites. Each team member then searched the library sites using terms such as "tutorials," "online tutorials," and "web tutorials." Team members also browsed the library websites to locate any reference to tutorials.
Prior to examining the library sites, the team reviewed the literature to create a checklist of common tutorial features. Using the tutorial design tips obtained from this literature review and the project team's own subjective list of effective tutorial design elements, the team generated a list of ten tutorial questions (Appendix online) to use as they accessed each medical library website. Team members identified tutorials created by the library under examination and tutorials created by third parties (a vendor or another library) to which libraries linked. If the medical library designed its own tutorials, the team member collected data about the tutorials via subsequent questions. The team repeated this process for each tutorial created by a medical library. If a library identified the resource as being a "tutorial," the team counted the item as such, even when it appeared to be a simple handout or electronic presentation.
The team also evaluated elements of active learning. The team members counted the tutorial as being interactive (question 4) if the user was required to perform searches, complete exercises, or click on appropriate boxes for additional information. Tutorials that asked the patron to open up the database or software product in a new window and follow along with the steps in the tutorial were counted as interactive. Tutorials that simply required the patron to click a forward button to navigate the tutorial were not considered interactive. The team also collected data on whether a tutorial included a quiz or a test (question 5).
During the data collection phase, the team consulted each other in an attempt to remain consistent in data collecting. The team collected data from the 124 websites between the months of January and February 2007 and compiled the data in an Excel spreadsheet for analysis.
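The kind of tally the team compiled in Excel can also be reproduced programmatically. The sketch below is purely illustrative: the records, field names, and values are hypothetical stand-ins, not the team's actual spreadsheet data, and are included only to show how the feature percentages reported in the Results could be computed from raw yes/no observations.

```python
# Hypothetical tutorial records; the field names and values are
# illustrative only, not the project team's actual data.
tutorials = [
    {"library": "A", "interactive": True,  "quiz": False, "printable": True},
    {"library": "A", "interactive": False, "quiz": True,  "printable": False},
    {"library": "B", "interactive": False, "quiz": False, "printable": True},
]

def feature_rate(records, feature):
    """Percentage of records in which the given boolean feature is present."""
    count = sum(1 for r in records if r[feature])
    return round(100 * count / len(records))

for feature in ("interactive", "quiz", "printable"):
    print(f"{feature}: {feature_rate(tutorials, feature)}%")
```

Applied to the full set of 274 observed tutorials, the same calculation yields the percentages reported below (for example, 19/274 interactive tutorials rounds to 7%).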

RESULTS

Tutorials created by third parties
Seventy-eight out of 124 library websites (63%) included links to tutorials that were created outside the library, such as by a vendor or another library (Table 1 online). Some libraries had designated sections on their websites for tutorials. In other cases, the links to tutorials were included on a Subject List page or beside links used to access databases.
Sixty-five of the 124 libraries linked to the National Library of Medicine's (NLM's) PubMed tutorial, the most commonly linked-to tutorial.

Software used to create tutorials
Some tutorials were created using more than one type of software, such as hypertext markup language (HTML) editors and electronic presentation programs.

Active learning
In most of the tutorials, the patron is passive and simply reads content or watches a demonstration of how to search a database. Seven percent of the tutorials (19/274) were considered interactive (Table 5).

Feedback
Out of the 274 tutorials observed, 66 (24%) included a survey or feedback option (Table 5). If a tutorial included a librarian's contact information or an ''Ask-a-Librarian'' link, the team counted it as providing a feedback option.

Target audiences
Two hundred and fifty-four tutorials were designed for anyone using the library and its resources (Table 6 online). The total number of tutorials addressing targeted groups was 281, as some tutorials mentioned multiple target groups. The team found tutorials designed specifically for chemistry (6 tutorials), distance education (4 tutorials), nursing (3 tutorials), first-year medical (3 tutorials), dentistry (1 tutorial), and third-year medical students (1 tutorial). One tutorial was geared toward faculty, and 1 tutorial was created for researchers.

Printable parts
Only 26% of the tutorials (72/274) had printable parts, such as accompanying handouts (Table 5). If the tutorial itself was formatted to be printed, such as in Word or PDF documents, the team counted the tutorial as having printable parts. The tutorials were not counted as printable if users could only print one screen of the tutorial at a time.

CONCLUSION
Some libraries created tutorials for resources or content specific to their institutions, while relying on vendor or third-party tutorials for educating users about how to search databases. The project team believes that many libraries may be choosing to link to vendor-produced tutorials instead of creating their own due to frequent interface changes. Moreover, although the majority of observed libraries had created tutorials, most of the tutorials had simplistic designs that did not require responses from the user. Most of the libraries used HTML editors to create tutorials. Screen recording software, which is easy to use, can help librarians create sophisticated tutorials with interactive elements more quickly than using HTML alone. Further, the authors believe that quizzes and/or printable parts, like handouts, are particularly helpful for tutorials that do not include a search simulation. After users watch a demonstration of searching a resource, they can print the handout to refer to while they attempt their own search. Such simple additions as printable handouts and/or follow-up quizzes would likely increase the levels of active learning for tutorial users.
While the team consulted with each other to ensure consistency during data collection, some level of collector error may have occurred. One problem the team encountered was determining whether the medical library or the larger main campus library created the tutorial. The project team attempted to collect data only on tutorials created by medical librarians. The team did not allot time to contact libraries to confirm whether the observed tutorials were created by medical librarians or by other academic librarians.
This study was designed to look only at freely available, Web-based tutorials. It is important to recognize that course-integrated tutorials and password-restricted tutorials might have more sophisticated designs. Therefore, the team could have extended the timeline of the project to request access to password-restricted tutorials, which likely would have increased the number of tutorials using active learning techniques. More libraries may be creating tutorials for specific patron groups, but access to these tutorials may also be restricted. This study might have been more systematic if the team established a definition of a tutorial beforehand; instead, the team decided to evaluate what individual libraries called tutorials.
Further research is needed to determine whether interactive library tutorials are more effective than passive ones. The literature review could be expanded to other disciplines, such as education, which may also be researching effective design elements of tutorials. More research is also needed on the feedback and usage data that medical libraries are obtaining from their tutorials.
Overall, while the project team's survey of online academic medical libraries' websites revealed a large number of self-produced web tutorials, few of those tutorials incorporated active learning elements such as interactive interfaces or printable handouts. Medical libraries might want to explore incorporating such elements into tutorials to encourage learner engagement.

INTRODUCTION
Information professionals are called on to determine how best to measure the impact of an author's articles, and citation counts are often regarded as one method for obtaining a quantitative expression of the utilization and contribution of a particular published paper. As Meho states, citation analysis assumes that influential works or scientists are cited more often than others [1]. Egghe and Rousseau claim that citation counts are based on four important assumptions: an article's citation implies use of that document by the citing author; the citation reflects the merit (quality, significance, impact) of the article; the references are from the best possible works on the topic; and the cited articles are related in content to the one in which they are used [2].
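The basic operation underlying the first assumption above, that influential works are cited more often than others, amounts to counting and ranking citations received. A minimal sketch, using hypothetical article identifiers and citation pairs invented for illustration:

```python
from collections import Counter

# Hypothetical (citing_article, cited_article) pairs; the identifiers
# are illustrative, not drawn from any real citation database.
citations = [
    ("p4", "p1"), ("p5", "p1"), ("p6", "p1"),
    ("p5", "p2"), ("p6", "p3"),
]

# Simple citation analysis: tally how often each article is cited,
# then rank articles by citation count (most-cited first).
counts = Counter(cited for _, cited in citations)
ranked = counts.most_common()
print(ranked)  # "p1", cited by three other articles, ranks first
```

Under the assumptions Egghe and Rousseau list, such a ranking is read as a proxy for merit; the controversies discussed next concern exactly the cases where that reading breaks down.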
Traditionally, the peer-review process has been used to assess article quality. Currently, there is a global trend toward the development, refinement, and increased use of quantitative metrics, particularly those resulting in "quantifiable, post-publication quality assessment" [1, 3, 4]. However, determining impact by citation analysis can be controversial; in some cases, works are cited to point out errors and inaccuracies in the research. Additionally, long articles are often cited more frequently, and some reference lists contain erroneous citations, which can skew results. Finally, journal visibility and prestige affect dissemination, and