J Med Libr Assoc. Apr 2004; 92(2): 218–225.
PMCID: PMC385303

What words and where? Applying usability testing techniques to name a new live reference service

Vicky Duncan, BA, MLS, Information Services Librarian1 and Darlene M. Fichter, BA, MLS, Data Library Coordinator1

Abstract

Objective: A user-focused design approach was taken to develop a new “live reference” service at the Health Sciences Library, University of Saskatchewan. The name of the service, the placement of the links, and the design of a graphical button were seen as key elements in creating service awareness and its subsequent use.

Methods: To ensure library users recognized and understood the label for the new service, selected library users were given an opportunity to choose a phrase that would best describe the service. The top two preferred phrases were then placed on the library Web pages as text and graphic images for further testing. Some pages had links in multiple locations to determine which placement worked best. Task-based usability testing was carried out with participants who were unaware of the new service. Participants were observed as they completed seven Website tasks arranged in increasing levels of difficulty to see whether they would notice the live reference service and seek assistance.

Results: The high level of recognition and use of the service indicates that the label name and link placement were effective with library Website users.

Conclusions: A user-centered design methodology helped ensure that the new live reference service was visible and used, demonstrating the effectiveness of this approach for adding new services to an existing Website.

INTRODUCTION

While planning the introduction of a new “live reference” service on the Health Sciences Library's Website at the University of Saskatchewan, library staff made a conscious decision to take a user-centered approach [1]. User-centered design is defined as “the practice of creating engaging, efficient user experiences” [2]. In particular, user input for labeling the new service was felt to be crucial to its success, because many patrons would not be acquainted with such a service [3]. This study undertook to test which words, phrases, and/or logos would best lead users to the live reference service. The study also sought to determine on what pages, and in which position, the links to the service should be placed, to ensure visibility and recognition.

The Health Sciences Library, located in Saskatoon, Saskatchewan, serves the colleges of medicine, dentistry, nursing, pharmacy, and nutrition and the school of physical therapy. Each college is moving to a distributed, regional education model where students will spend part of their time working outside the city. The new live reference service was intended to provide these students with immediate access to reference service.

LITERATURE REVIEW

Although many journal articles describe the introduction of live reference services in libraries, very few projects actually involved the user in the design stage of the service. A thorough search of the literature revealed only one usability study undertaken before introducing a live reference service, a study that included choosing the name of the new service. This study was done in 2000 by the National Cancer Institute, on their health information Website [4]. The study reached some interesting conclusions:

  1. Users, not the developers, should choose the graphics. What the site developers thought was eye-catching and original made little impression on the users of the site, and the graphic, although artistic, was not clicked.
  2. Important items should be placed at the top center of the page, since that is where people look first.
  3. Make it absolutely clear what is “clickable,” and provide a button to click. A graphic without a descriptive label is clicked less often than a text link.
  4. Graphics that look like banner ads tend to be ignored [5].

Many library usability studies emphasize the need to avoid library jargon and acronyms as links on the library home page, because these present major obstacles to users retrieving information from the Website [6, 7]. A link must intuitively alert users to the new service and be plainly visible on the library home page [8]. In the electronic commerce literature, Donatello points out that buttons labeled “Click Here” improve click-through by 44% [9], while Nielsen and Honan advocate selecting text links with “strong scent,” that is, words meaningful to a particular audience in a particular context [10–12]. Using the lessons learned by the National Cancer Institute and the e-commerce research and keeping in mind the importance of plain language, the project team embarked on a usability study that would assist in naming the new service.

METHODOLOGY

This study is unique in its user-centered approach to researching users' preferences and behaviors when introducing a live reference service on a library Website. Two techniques were employed to gather data to name the live reference service and to ensure that the logo was visible and usable. The first approach, preference testing, gave a sample of library users a chance to consider and select text and graphic labels for the site. The second method involved hands-on usability testing of the Health Sciences Library Website.

Method 1: preference testing

Preference testing is quick and easy to carry out: alternative labels and graphics are presented to participants, and each participant is asked to indicate the best option. Participants were also given the opportunity to suggest other wording and to elaborate on the reasons for their selections. Two rounds of preference testing were conducted.

The preference testing was carried out in the library over a two-week period. Users were approached in the library and asked if they would volunteer. Care was taken to ensure that there were participants from all three major user groups—faculty/staff, medical students, and nursing students—and that no one group dominated. In addition, the recruiters tried to ensure that the sample included a group that was heterogeneous—different ages, genders, and cultural backgrounds. In total, twenty volunteers were recruited to participate in two rounds of paper testing.

First paper test

As a starting point, the general layout of the National Cancer Institute's final LiveHelp logo (Figure 1) was used as a template for the usability testing. The template was composed of two parts: (1) an invitation to chat in real time with a librarian and (2) a button label inviting the user to click to initiate the chat session. Library staff involved in the usability study brainstormed possible wording choices for the two portions of the logo. The lists developed are presented in Table 1. The first phrases in columns one and two were meant as samples only, but a number of users selected the “needing help finding information” phrase as their first choice.

Figure 1
The National Cancer Institute's LiveHelp logo. Printed with permission of the National Cancer Institute.
Table 1 Library staff suggestions for labels

Nine library users were chosen randomly and asked to select their favorite invitation to chat (column 1) and their favorite button label (column 2). It was hoped that a clear winner would emerge, but that did not happen. Table 2 lists the results. The votes did not add up equally to the number of library users tested, because some users liked two wording choices equally.

Table 2 Results of votes for choices of wording
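The tallying behind Table 2 can be sketched in a few lines. Because a participant could like two wording choices equally, each ballot may carry one or two phrases, which is why the column totals can exceed the number of voters. The ballots below are invented placeholders, not the study's actual responses.

```python
from collections import Counter

# Hypothetical ballots: each participant marks one phrase, or two if
# they liked both equally (as some users in the study did).
ballots = [
    ["Ask a librarian"],
    ["Ask a librarian", "Chat with a librarian"],
    ["Chat with a librarian"],
    ["Need help finding information?"],
    ["Ask a librarian"],
]

# Flatten the ballots and count votes per phrase.
votes = Counter(phrase for ballot in ballots for phrase in ballot)

for phrase, count in votes.most_common():
    print(f"{count}  {phrase}")

# Total votes (6) exceed the number of participants (5) because of ties.
```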

Second paper test

With no clear winner in sight, shorter forms of the long phrases that were popular in the first test were selected for further testing. Twelve combinations of popular phrases were tested with eleven participants. Because twelve labels might have been overwhelming for participants to assess, the options were divided into two groups. The labels and votes are presented in Table 3. When conducting the second paper test, users were asked why they made particular choices. One user commented that he did not like the word “chat,” because he had negative associations with “chat rooms.”

Table 3 Wording choices for second paper test, part I

When testing the second group of selections with seven different users (Table 4), three users commented positively regarding the word “librarian”: “I'm confident of getting an answer”; “it's more positive and personal”; and “I like the fact that I'm chatting with a librarian.” One user explained his dislike of the term “Answers Online.” He surmised that the link would lead to a frequently asked questions (FAQ) list, rather than an online reference service. After finishing the second paper test, it was apparent that the word “answers” was misleading for many people, so it was rejected. The word “chat” had the most clarity in that all users knew what the label meant, but some did not like its association with “chat rooms.” Which phrases were left? The three short phrases that seemed to have the most appeal were: “Ask a librarian,” “Click Here,” and “Ask me NOW!” The combination that seemed to make the most sense was “Ask a librarian” as the invitation to chat, and “Click Here” as the invitation to click. Although neither expression was particularly original nor “catchy,” the second paper test provided more evidence that these labels conveyed what the service was about and how to access it.

Table 4 Wording choices for second paper test, part II

Method 2: task-based testing

The next step in the study was to carry out usability testing of the Health Sciences Website with the links and labels in place to identify usability problems. Five participants were recruited for task-based sessions. Although small, this sample size is considered adequate in the usability field: Nielsen, a well-known usability expert and researcher, has convincingly shown that about 80% of site-level usability problems can be identified with as few as five participants [13]. Given that a number of library users had already provided input into the labeling and design of the live reference links, a group of five users was considered sufficient to identify sitewide usability problems with the new service.
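Nielsen's five-user claim rests on a simple geometric model, which the sketch below reproduces. The parameter p ≈ 0.31, the chance that a single test user exposes any given problem, is taken from Nielsen and Landauer's published estimate rather than from this article, so treat it as an assumption.

```python
def problems_found_fraction(n_users: int, p: float = 0.31) -> float:
    """Expected fraction of usability problems uncovered by n_users.

    Model: 1 - (1 - p) ** n_users, where p is the probability that a
    single participant exposes any given problem (p = 0.31 is Nielsen
    and Landauer's estimate, assumed here).
    """
    return 1 - (1 - p) ** n_users

# With five participants the model predicts roughly 84% of problems
# found, consistent with the ~80% figure cited in the text.
for n in (1, 3, 5, 10):
    print(f"{n:2d} users -> {problems_found_fraction(n):.0%}")
```

The curve flattens quickly, which is the usual argument for running several small tests rather than one large one.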

Initially, participants for the study were recruited by sending an email request to the nursing faculty and to all first-year medical students. One nursing faculty member and one medical student volunteered, but response was slow. The other participants, a physiology professor, a nursing department staff member, and a nursing student, were directly invited to participate.

The sample for this test was designed to have representatives from each of the three major user groups—faculty/staff, medical students, and nursing students. Again, a diverse socio-demographic group was targeted that reflected the library's user population.

Test sessions

Each usability session lasted an hour. The session opened with a brief overview of the testing process and an explanation of how the results would be used. All participants were given a copy of the nondisclosure agreement describing how their responses would be kept confidential. They were asked to read and sign the agreement before the testing began. The test facilitator asked a few questions to gather background information such as job title or year in the program and computer, Internet, and library experience. Next, participants were asked to complete a set of tasks using the library Website. After completing the tasks, participants were asked to complete an exit survey. The session was concluded with a short debriefing and the opportunity for participants to select a thank you gift from an assortment of treats.

Designing the Website prototype

To evaluate the results of the preference testing, a mock-up graphic based on the National Cancer Institute's logo, with the phrases “Ask a Librarian” and “Click Here” replacing “LiveHelp” and “cancer.gov,” was designed. Ideally, the graphic needed to be in a consistent location on all pages throughout the Website to ensure maximum usability. On most pages, the only place that a link could be placed was the top center part of the Web page. Recognizing the disadvantage of this location because of “banner blindness” [14, 15], team members tried to make the logo as visible as possible. Benway writes that banner blindness occurs when “people searching for specific information on the Web tend to ignore large, colorful items that are clearly distinguished from other items on the page” [16]. The link was added to the library catalog page, the WebSPIRS interface, and Electronic Journals pages in the top center location. WebSPIRS is a database front end used by the Health Sciences Library to search MEDLINE and CINAHL. Unfortunately, this top center space was not available on the Health Sciences Library home page. Instead, the Web team decided to use this opportunity to try a different position for the graphic to see if it was more effective than the top area. A prominent graphic advertising the service was placed just below the horizontal menu bar on the home page. In addition, the text link to the existing email reference service Ask a Librarian was deliberately left on the Health Sciences Library home page for comparison purposes (Figure 2).

Figure 2
Health Sciences Library, University of Saskatchewan, home page

In addition to the links on the home page, the project team strongly believed that access to live reference needed to be available on many pages of the Website, particularly those where users might experience difficulty. A number of “problem points” or areas prone to question generation were identified by reference staff. These were the Electronic Journals page, the Library Catalogue page, and the Search page of the WebSPIRS MEDLINE and CINAHL databases. A small-sized graphic was added to this area with the phrase Ask a Librarian and a Click Here button (Figure 3). To provide additional access, a gray button reading Ask a Librarian was placed underneath the Search button in the WebSPIRS databases. In the Library Catalogue, another button labeled LiveHelp was added directly adjacent to the Search button. The National Cancer Institute study had commented that the LiveHelp button directly beside the Search button was clicked on more often than any of their banner logos, and the team was curious to see if users would seek assistance at these additional search points [17].

Figure 3
Ask a Librarian and a Click Here button

“Killing two birds with one stone,” so to speak, the project team chose to conduct some usability testing of the Health Sciences Library Website as a lead-in to a situation where users would need to seek assistance. Based on Nielsen's research of discount usability engineering testing [18], the team decided that a small number of users would be able to point out the major stumbling blocks in the Website's design and layout.

Tasks

Seven questions were drafted by three library staff members familiar with both the Website and typical reference questions posed at the Health Sciences Library (Table 5). Questions one to four and question six were arranged in order of perceived difficulty, from the simplest (“How late is the Health Sciences Library open on Thursday evenings in the summer?”) to the most difficult (“How would you go about finding electronic journals in nursing?”). The fifth and seventh questions were intended to be difficult to answer using the Website, thus leading the test participants to seek some form of help on the site. At these points in the testing, we hoped that the participants would look for a link for assistance on the Web page to request help, specifically, the Ask a Librarian—Click Here link. The point of the usability test was to determine whether the users would discover the Ask a Librarian link in the course of trying to answer questions.

Table 5 Questions asked during Website usability testing

Before initiating the testing with our five volunteers, a library staff member not involved with the project was asked to be a “test subject.” From her feedback, questions were clarified and reordered. The five participants were tested over a three-day period.

Each participant was counseled that the purpose of the exercise was to find out how well the Website was working and perhaps alert the team to problems in layout, design, and terminology. The exercise was not intended to test the participants' knowledge of the Website or Web searching. Participants were asked to read each question aloud before they started searching for the answer and to “think aloud” as much as possible, so that their reasons for selecting links and browsing particular sections of the Web page were expressed verbally. Observers recorded the paths taken to find the information, as well as any verbal comments or nonverbal communication, but provided no assistance. Each question was timed, with a five-minute limit per question used as a guideline; when a participant reached the limit, he or she was asked to move on to the next question.
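The per-question timing protocol described above (one clock per question, a five-minute guideline) can be sketched as follows. The TaskTimer class and the task label are illustrative stand-ins, not the instrument the observers actually used.

```python
import time

TIME_LIMIT = 5 * 60  # five-minute guideline per question, in seconds

class TaskTimer:
    """Records elapsed time per task and flags tasks that reached the
    guideline limit, at which point the participant moves on."""

    def __init__(self, limit: float = TIME_LIMIT):
        self.limit = limit
        self.log = {}  # task label -> (elapsed seconds, hit_limit)

    def record(self, task: str, start: float, end: float):
        elapsed = end - start
        hit_limit = elapsed >= self.limit
        self.log[task] = (elapsed, hit_limit)
        return elapsed, hit_limit

timer = TaskTimer()
t0 = time.monotonic()
# ... participant works on the first question ...
elapsed, hit_limit = timer.record("Q1 (library hours)", t0, t0 + 73)
print(f"{elapsed:.0f}s elapsed, limit reached: {hit_limit}")
```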

RESULTS

Lessons learned about how users search the Website

The results of our study were remarkably consistent with the findings of the National Cancer Institute and shed light on how users search and navigate Websites.

  1. Users did not read through the site; they scanned for words that matched their particular need.
  2. Users initially looked through the first column headed “Finding Books,” then the middle column headed “Services,” then the left sidebar. They rarely looked through the right column headed “About HSL” (Figure 2).
  3. Users tended to ignore links above the main content area, especially if the links were graphic images. They expected these images to be banner ads and have, over time, learned to ignore them.
  4. Users were not familiar with library jargon such as “database” or “interlibrary loan.”
  5. Users were not familiar with abbreviations such as HSL for Health Sciences Library or E-journals for electronic journals.
  6. Users invariably stumbled when they reached the Databases and the Electronic Journals pages. They were not confident that they knew what a database was and which one to choose, and they were not sure how to access electronic journals from the Electronic Journals page.
  7. Searching WebSPIRS MEDLINE was challenging for most users, despite the fact that most participants had attended library instruction sessions.

Lessons learned about Ask a Librarian

Three out of the five users did actually click on the Ask a Librarian link. Usability studies often comment on users' reluctance to click on Help, yet, in this usability test, two users clicked on the banner logo, and another on the text link listed under Services on the home page. One participant noticed the link but commented, “I wouldn't click on that; I would persevere on my own.” This reluctance to seek help is common behavior among many Website visitors. Similar behavior was demonstrated in another usability study that evaluated the best way to offer Web-based help, in an online registration form [19].

In that study, researchers found that when participants needed contextual information to help complete the form, they would puzzle over it for several minutes and had to be directly prompted by the facilitator to click the Help link. The fact that the live reference link, in one form or another, was clicked by three users and noticed by a fourth was an indicator of success for the label choices Ask a Librarian and Click Here. Users were primarily focused on the main content area of the home page when looking for particular links. Although the text link to Ask a Librarian was clicked once and the banner logo twice, the main content area was where all users searched for appropriate links. As a result, one key recommendation of the study was that a prominent text link be placed on the home page of the Health Sciences Library Website.

Other observations

  1. Participants in the study recognized the Ask a Librarian graphic and its overall purpose. Some participants were not sure whether clicking on the link would lead to an email session or live chat, but none were bothered by the opening of the chat window, and all immediately recognized it as such.
  2. Once participants opened a chat window, they were able to easily start chatting and close the window when finished. The chat window design had not been altered from the out-of-the-box format provided by the vendor.
  3. One participant clicked on the Help button on our green shortcut bar. The Help page was an obvious location to add a link to the Ask a Librarian service.
  4. Unlike the National Cancer Institute's site, where participants noticed the small query buttons under the Search button, none of our participants noticed the buttons placed beside the Search button in WebSPIRS or the LiveHelp button next to the catalog Search button.
  5. Users did notice the Ask a Librarian link on the Electronic Journals page, where many experienced difficulties, but did not click on it.

Post-testing survey

When participants completed the seven questions, they were asked to complete a short survey about their experience with the Website. The survey supported observations from the usability study. When asked “How satisfied were you with the site?”, four participants said they were “somewhat satisfied” with the site, one was “somewhat dissatisfied,” and one declined to answer the question. Participants reiterated problems with abbreviations, problems with finding online journals, and confusion in locating relevant databases. Interestingly, participants attributed their difficulty in locating information to their own lack of Website searching experience rather than to problems with the Website design. The participant who declined to answer “How satisfied were you with the site?” commented, “More dissatisfaction with my limited knowledge/experience in searching.” Another felt that he “must spend more time on the Website.”

DISCUSSION

Certainly, the choice of the phrase “Ask a Librarian” will not come as a surprise to many. A recent scan of the libraries (December 2002) participating in a live reference project indicated that, of 128 academic libraries, 33 (39%) had chosen Ask a Librarian as the name for their service [20]. Although one writer recommends using a catchy name such as “Q and A Café” [21], the consensus seems to be to provide a link that will be easily understood, thus not creating an unnecessary “barrier” for visitors who are not aware of the “brand name” for the service [22, 23]. Pace suggests “Real Answers” would be a suitable name for a live reference service, because, like Google, it “put[s] the emphasis right where it should be, on the answer, not the question” [24]. Our study participants, however, reacted negatively to the word “answers,” associating it with static, FAQ-type answers. Perhaps over time, Google's use of “Answers” will reach widespread brand awareness and shape users' impressions of this term.

Our results indicate that, if we want to promote access to our new service, we should place a text link prominently at the top of the middle column of links on the Health Sciences Library home page, in the section called Services. Scanning the first three columns of links on the home page was the single most popular searching strategy for all participants. The question of best placement thus became obvious from observing the scanning behavior and searching paths of our users: they were looking for particular words or phrases that matched their information needs. The large white space with a prominent live reference graphic under the green menu bar was effectively a “dead zone.” One user commented after the testing that she never looked in that area, because “it's always just ads.”

The choice of name for the service, Ask a Librarian, although popular, established, and recognizable, presented a few challenges. We already had an email reference service by that name, a service that was used infrequently. We were now introducing a live reference service, but we did not want to discourage users from telephoning, emailing, or coming in person. A decision was made to consolidate all access choices into one page, entitled Ask a Librarian. After making this decision, we noticed that other library Websites, such as the Massachusetts Institute of Technology Libraries* and the University of California, Los Angeles Library,† have also created a consolidated access page. On our home page, the link would look exactly as it had before but was elevated to the top spot under the Services heading and accented by a small question mark graphic following the text. This small question mark graphic was designed to serve as a unifying element between the home page link and the small buttons placed on the Library Catalogue, Electronic Journal, and WebSPIRS pages, which contained the same question mark symbol.

The response to the new chat service has been slow but steady. We receive a live reference question via the Ask a Librarian page once every couple of days and sometimes receive email reference questions “via” live reference when the service is not staffed. The live reference questions, both chat and email, approximately equal the number of email reference questions that are referred to us from the Main Library's Ask a Librarian email reference service. Because the new service is still officially a “trial,” widespread promotion has not taken place. Health sciences librarians demonstrate the new service at all library orientations, and it is featured in the What's New link on the home page.

CONCLUSIONS

Using a user-centered design methodology has helped to ensure that the new live reference service is visible and used. Because of the testing, we are confident that the level of use is not impeded by the name or the graphic design. Many of this study's findings echoed the findings of the National Cancer Institute: graphics should be chosen by users; banner graphics are ignored by users; Web designers must make it clear where to click; and they must place the link to the service in a prominent location like the center of the screen. Text links worked better than graphic links. Even our graphical links were designed to look like text with a clickable button.

Some areas still require more work. First and foremost, many Web pages on which the live reference links can be placed are developed outside of the Health Sciences Library, by a systemwide Web committee. The Health Sciences Library team must therefore forge a strong working relationship with this committee, share the results of the study, and encourage the development of a text link to the new service when the site is updated. One of the areas under our control that needs further testing is the “composite” Ask a Librarian page. The idea for this page emerged after the testing, and it would be valuable to see how users interact with and make use of this page. During the final months of the pilot, we also may try varying the color and look of the small button that links to the service to see if we can improve its visibility and effectiveness.

While engaging in a user-centered design process certainly required an investment of time, the benefits outweighed the costs. Not only is the live reference pilot getting a fair test, but, by combining the task-based testing with an evaluation of the overall Website, it became clear how a variety of services work together to create the user experience. The knowledge and insight gained from the testing will help with the overall redesign of the Health Sciences Library Website. In addition, this methodology will be useful as new services and features are added over time to the site. This study, while focused specifically on live reference services, demonstrates the effectiveness of a user-centered design approach for adding new and innovative services to an existing Website.

Footnotes

* The Massachusetts Institute of Technology Libraries Help Website may be viewed at http://libraries.mit.edu/research/.

† The University of California, Los Angeles Library Help Website may be viewed at http://help.library.ucla.edu/index.cfm?linktype=text&category=main.

REFERENCES

  1. Gross M, McClure C, Lankes DR. Assessing quality in digital reference services: overview of key literature on digital reference. [Web document]. Tallahassee, FL: Information Use Management and Policy Institute, Florida State University, 2001. [cited 9 Sep 2002]. <http://dlis.dos.state.fl.us/bld/Research_Office/VRDphaseII.LitReview.doc>.
  2. Garrett JJ. The elements of user experience: user-centered design for the Web. New York, NY: American Institute of Graphic Arts, 2003.
  3. Foley M. Instant messaging reference in an academic library: a case study. Coll Res Libr. 2002 Jan;63(1):36–45.
  4. Office of Communications, Communications Technology Branch, National Cancer Institute. Usability.gov: lessons learned: instant messaging. [Web document]. Bethesda, MD: The Institute. [cited 26 Apr 2002]. <http://www.usability.gov/lessons/IM_learned.html>.
  5. Office of Communications, Communications Technology Branch, National Cancer Institute. Usability.gov: lessons learned: instant messaging. [Web document]. Bethesda, MD: The Institute. [cited 26 Apr 2002]. <http://www.usability.gov/lessons/IM_learned.html>.
  6. Moyo L, Robinson A. Library jargon as a factor in information design for Web usability: survey report [summary]. In: 16th Annual Computers in Libraries 2001: collected presentations, Washington Hilton & Towers, March 14–16, 2001. Medford, NJ: Information Today, 2001:157–65.
  7. Spivey MA. The vocabulary of library home pages: an influence on diverse and remote end-users. Inform Technol Libr. 2000 Sep;19(3):151–6.
  8. Andrews DC, Haworth KN. Online customer service chat: usability and sociability issues. [Web document]. Washington, DC: Georgetown University. [cited 24 Jul 2002]. <http://www.aarraydev.com/commerce/jim/0203–01.htm>.
  9. Donatello M. How do I click thee? let me count the ways (part 1). [Web document]. [cited 5 Feb 2003]. <http://www.naa.org/artpage.cfm?AID=1585&SID=1020>.
  10. Nielsen J. Eyetracking study of Web readers. Jakob Nielsen's Alertbox [serial online]. 2000 May 14. [cited 5 Feb 2003]. <http://www.useit.com/alertbox/20000514.html>.
  11. Nielsen J. Designing Web ads using click-through data. Jakob Nielsen's Alertbox [serial online]. 2001 Sep 2. [cited 5 Feb 2003]. <http://www.useit.com/alertbox/20010902.html>.
  12. Honan M. Will microads save online content? (part one). Online Journalism Review [serial online]. [cited 5 Feb 2003]. <http://www.ojr.org/ojr/business/1015015782.php>.
  13. Nielsen J. Cost of user testing a Website. Jakob Nielsen's Alertbox [serial online]. 1998 May 3. [cited 5 Jul 2003]. <http://www.useit.com/alertbox/980503.html>.
  14. Benway JP, Lane DM. Banner blindness: Web searchers often miss “obvious” links. ITG Newsletter [serial online]. 1(3). Houston, TX: Internet Technical Group, Rice University. [rev. 5 Dec 1998; cited 21 Nov 2002]. <http://www.internettg.org/newsletter/dec98/banner_blindness.html>.
  15. Nielsen J. The Web in 2001: paying customers. Jakob Nielsen's Alertbox [serial online]. 2000 Dec 24. [cited 21 Nov 2002]. <http://www.useit.com/alertbox/20001224.html>.
  16. Benway JP, Lane DM. Banner blindness: Web searchers often miss “obvious” links. ITG Newsletter [serial online]. 1(3). Houston, TX: Internet Technical Group, Rice University. [rev. 5 Dec 1998; cited 21 Nov 2002]. <http://www.internettg.org/newsletter/dec98/banner_blindness.html>.
  17. Office of Communications, Communications Technology Branch, National Cancer Institute. Usability.gov: lessons learned: instant messaging. [Web document]. Bethesda, MD: The Institute. [cited 26 Apr 2002]. <http://www.usability.gov/lessons/IM_learned.html>.
  18. Nielsen J. Cost of user testing a Website. Jakob Nielsen's Alertbox [serial online]. 1998 May 3. [cited 8 Dec 2002]. <http://www.useit.com/alertbox/980503.html>.
  19. Ellison M. A usability test of Web-based user assistance. [Web document]. Seattle, WA: ABC Company, 2003. [rev. 12 Mar 2003; cited 17 Mar 2003]. <http://www.winwriters.com/usability_test_analysis.htm>.
  20. Francoeur S. Index of chat reference services sorted by library type: academic libraries. [Web document]. [rev. Aug 2002; cited 27 Dec 2002]. <http://pages.prodigy.net/tabo1/chatlibrarytypes.htm>.
  21. Kawakami AK. Delivering digital reference. NetConnect supplement to Library Journal [serial online]. 2002 Apr 15:28–29. Cahners Business Information. [cited 12 Nov 2002]. <http://libraryjournal.reviewsnews.com/index.asp?layout=article&articleid=CA210717&publication=libraryjournal>.
  22. Card SK, Pirolli P, van der Wege M, Morrison JB, Reeder RW, Schraedley PK, Boshart J. Information scent as a driver of Web behavior graphs: results of a protocol analysis method for Web usability. [Web document]. Palo Alto, CA: Xerox Palo Alto Research Center. [cited 15 Jan 2003]. <http://216.239.53.100/search?q=cache:goaXf0TqJfoC:www.parc.xerox.com/istl/projects/uir/pubs/pdf/UIR-R-2000-13-Card-CHI2001-WWWProtocols.pdf+information+scent&hl=en&ie=UTF-8>.
  23. Gonzalez A. Hot on the scent of information. Wired News [serial online]. 2001 Jun 8. [cited 15 Jan 2003]. <http://www.wired.com/news/technology/0,1282,44321,00.html>.
  24. Pace AK. Virtual reference: what's in a name? Comput Libr [online]. 23(4):55–6. Available: Infotrac/Expanded Academic ASAP/A99380322 [4 Apr 2003].
