J Med Libr Assoc. 2005 Jan; 93(1): 81–87.
PMCID: PMC545126

The librarian's roles in the systematic review process: a case study*

Martha R. Harris, MLS, MA, AHIP, Medical Research Librarian


Question/Setting: Although the systematic review has become a research standard, little information addresses the actions of the librarian on a systematic review team.

Method: This article is an observational case study that chronicles a librarian's required involvement, skills, and responsibilities in each stage of a real-life systematic review.

Main Results: Examining the review process reveals that the librarian's multiple roles as an expert searcher, organizer, and analyzer form an integral part of the Cochrane Collaboration's criteria for conducting systematic reviews. Moreover, the responsibilities of the expert searcher directly reflect the key skills and knowledge depicted in the “Definition of Expert Searching” section of the Medical Library Association's policy statement, “Role of Expert Searching in Health Sciences Libraries.”

Conclusion: Although the librarian's multiple roles are important in all forms of medical research, they are crucial in a systematic review. As an expert searcher, the librarian must interact with the investigators to develop the terms required for a comprehensive search strategy in multiple appropriate sources. As an organizer and analyzer, the librarian must effectively manage the articles and document the search, retrieval, and archival processes.


Although the systematic review has become a research standard, precious little information in the literature has addressed the searching and bibliographic responsibilities of the librarian on a systematic review team. A search recently run in MEDLINE as well as in the Cochrane Library revealed only four citations dealing with any aspect of these issues. The citations included one that described a librarian's participation in meta-analysis projects [1] and three that described librarians' search strategy design [2, 3] and filter creation [4] to identify systematic reviews. While the article by McGowan [5] in this symposium demonstrates the expert searching skills required by systematic reviews, this article provides an observational case study to chronicle a librarian's involvement, skills, and responsibilities required in each stage of a real-life systematic review.

The medical research librarian in this study works with a group of clinical investigators in the Veterans Evidence-based Research Dissemination Implementation Center (VERDICT), herein referred to as the VERDICT Investigators. The VERDICT Investigators have conducted five systematic reviews for the Agency for Healthcare Research and Quality (AHRQ), two for the Cochrane Collaboration (Cochrane Library), and one for the American College of Physicians. These reviews have addressed such diverse topics as treating depression with St. John's wort [6], treating acute maxillary sinusitis with antibiotics [7], managing chronic hypertension during pregnancy [8], and defining and managing chronic fatigue syndrome [9]. The reviews have taken an average of a year to a year and a half to complete. Systematic reviews currently in progress include Perioperative Pulmonary Risk Management and Organizational Strategies for Guideline Implementation.

To understand the multiple roles of the librarian that are demonstrated during a systematic review, it is necessary to understand why such reviews are undertaken in the first place. At the heart of the systematic review process is the concept of evidence-based medicine (EBM), which Sackett defines as “the conscientious, explicit and judicious use of current evidence in making decisions about the care of individual patients” [10]. This evidence is often disseminated by the “narrative review,” in which one or more authors synthesize the results and conclusions of a small number of publications on a given topic [11]. The drawback to narrative reviews is that they are not based on fully systematic and explicit methods for searching the literature, defining inclusion criteria for the identification of relevant articles, critically appraising and abstracting data from selected articles, and interpreting and synthesizing the evidence using meta-analysis. Thus, the traditional narrative review may be biased due to incomplete literature searching and to the influence of the authors' opinions or personal experience. The systematic review is designed to remove bias by employing a scientific methodology to comprehensively identify, critically appraise, and synthesize all of the potentially relevant literature on a given topic [11, 12].


The VERDICT Investigators conduct each systematic review according to the procedures and criteria outlined in the Cochrane Collaboration's Reviewer's Handbook [13]. These steps include:

  1. formulate the problem
  2. locate and select the studies
  3. assess study quality
  4. collect the data
  5. analyze and present the results
  6. interpret the results
  7. improve and update the results

An examination of the steps involved in conducting the AHRQ's systematic review, Management of Chronic Hypertension During Pregnancy [8], will show that the librarian's roles as an expert searcher, organizer, and analyzer form an integral part of all seven of the Cochrane steps in conducting a systematic review. This examination will also demonstrate how the expert searcher portion of the librarian's roles is a direct reflection of the key skills and knowledge depicted in the “Definition of Expert Searching” section of the Medical Library Association's (MLA's) policy statement, “Role of Expert Searching in Health Sciences Libraries” [14].

1. Formulate the problem

It is important to realize from the beginning that the quality and scope of the search strategy are the foundation on which every facet of the systematic review is built. Therefore, the first responsibility of the librarian in the pregnancy review was to actively participate in multiple meetings with the VERDICT Investigators to determine which clinical questions the review would address and what specific topics the questions would include. This personal interaction to clarify and refine the need and retrieval requirements is cited in MLA's policy on expert searching [14].

The four primary areas covered by the review along with samples of their accompanying clinical questions are listed below [8]:

  1. Efficacy data and randomized controlled trials (RCTs): What is the appropriate antihypertensive management of women with chronic hypertension before pregnancy?
  2. Data about harms: Is pharmacological treatment of mild to moderate chronic hypertension during pregnancy harmful to mothers, fetuses, and infants?
  3. Data about blood pressure risks and optimum treatment levels: What is an appropriate blood pressure level at which to treat chronic hypertension during pregnancy?
  4. Data about benefits and harms of special fetal monitoring techniques: Is the use of special fetal monitoring techniques beneficial or harmful to mothers and fetuses?

These meetings also addressed creating a set of inclusion and exclusion criteria that would enable the screening process to identify potentially relevant articles. The general inclusion criteria that applied to all of the clinical question areas were:

  1. RCT
  2. Participants: women of childbearing age or pregnant with mild to moderate hypertension
  3. Control group: placebo or usual care
  4. Maternal and/or fetal morbidity outcome [8]

As a result of those early discussions, the librarian ran a preliminary search strategy in Ovid's MEDLINE to gauge the extent of the currently available literature. Each VERDICT investigator was given a copy of the first 100 citations, which included both Medical Subject Headings (MeSH) indexing terms and available abstracts, so that the investigators could determine what additional MeSH terms and text-words were needed for the strategy. Because search strategies for systematic reviews typically pass through multiple revisions before they are finalized, particularly when a review addresses a new or complex topic such as defining and managing chronic fatigue syndrome, copies of each version of the strategy were filed by date to answer any future questions concerning the use of particular terms.

The length of the search strategy depends entirely on the scope and complexity of the review. For the pregnancy review, 15 separate search strategies ranging from 33 to 77 lines in length were created to address the specific topics of the 4 review questions: treatment, fetal monitoring, drug harms, and blood pressure risk. The longest search strategy required thus far by a VERDICT systematic review was designed for Organizational Strategies for Guideline Implementation, a review currently in progress. This strategy is 284 lines in length and will be applied to 8 diseases of particular concern to the Veterans Administration. The longer the search strategy is, the greater the risk of typographical errors, so the librarian must work in close collaboration with investigators familiar enough with the nomenclature to identify possible problems. Table 1 contains the finalized pregnancy search strategy that was designed for MEDLINE to search the specific topic of “Fetal Monitor.”

Table 1 “Fetal Monitor” search strategy in MEDLINE
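The typographical risk in long strategies can be reduced with a simple automated check before the strategy is run. The sketch below is a hypothetical helper, not a tool used in the review; it flags unbalanced parentheses and combination statements that reference a line at or beyond their own position:

```python
def check_strategy(lines):
    """Return (line_number, problem) tuples for a strategy given as a
    list of search-statement strings, numbered from 1 as in Ovid."""
    problems = []
    for num, stmt in enumerate(lines, start=1):
        if stmt.count("(") != stmt.count(")"):
            problems.append((num, "unbalanced parentheses"))
        # Combination statements such as "1 or 2 or 3" may only
        # reference lines that appear earlier in the strategy.
        for token in stmt.split():
            if token.isdigit() and int(token) >= num:
                problems.append((num, f"forward reference to line {token}"))
    return problems

strategy = [
    "randomized controlled trial.pt.",
    "controlled clinical trial.pt.",
    "exp hypertension/ and (pregnan$.tw.",  # typo: unclosed parenthesis
    "1 or 2 or 3",
]
print(check_strategy(strategy))  # → [(3, 'unbalanced parentheses')]
```

A check like this catches only mechanical slips; vocabulary errors still require the collaborative review with investigators described above.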

The first twenty-five lines of the strategy consist of the standard search (hedge) that the Cochrane Collaboration employs to search for all clinical trials and specific research studies [15]. Lines one to seven form a highly sensitive strategy that includes MeSH terms and publication types associated with RCTs. Lines eight to sixteen include MeSH terms, publication types, and text-words that are not only associated with RCTs, but that could also be associated with other trials besides RCTs. Lines seventeen to twenty-five include MeSH terms and text-words that are not only associated at times with RCTs, but are more often associated with non-randomized clinical studies.

The Cochrane hedge is designed to capture “evidence about clinical effectiveness” of health care interventions [11]. This arrangement has been referred to in the literature as the “levels of evidence” or the “hierarchy of evidence” [16, 17]. The Cochrane hedge measures the strength of that evidence according to the study design (e.g., RCT, cohort, case study, expert opinion) and the sample size (e.g., a sufficiently large number of patients so that meaningful differences between treatment groups can be detected). Therefore, a systematic review documenting many well-designed RCTs would have the strongest scientific validity (level I), while a review incorporating case-series and expert opinions would have the weakest validity (level VI). The Cochrane Collaboration summarizes this hierarchy as [11]:

  • level I: systematic review of well-designed randomized controlled trials
  • level II: randomized controlled trials
  • level III: non-randomized clinical trials
  • level IV: well-designed nonexperimental studies
  • level V: opinions of respected authorities, based upon clinical evidence, descriptive studies, or reports of expert committees
  • level VI: someone's opinion

Note that the term “systematic review” is not mentioned either as a MeSH term or as a publication type in the “Fetal Monitor” search strategy. Although the term “Review” is used as a MEDLINE publication type, it usually refers to the traditional narrative review [18]. When the term “systematic review” is entered into the MEDLINE MeSH database, it maps to the publication type “Review, Academic (1991),” which the National Library of Medicine (NLM) defines as a “Work consisting of a more or less comprehensive review of the literature on a specific subject” [19]. Unfortunately, when one is designing an encompassing search strategy, “more or less comprehensive” is not good enough. A few other bibliographic databases, such as CINAHL, do at least offer a publication type for “systematic review,” and PubMed has recently developed a “search strategy used to create a systematic reviews subset” that is “intended to retrieve citations identified as systematic reviews, meta-analyses, reviews of clinical trials, evidence-based medicine, consensus development conferences, and guidelines” [20]. Even with these options, the most reliable method of identifying systematic reviews on a given topic remains the use of the Cochrane hedge in conjunction with appropriate subject indexing terms and text-words.
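The overall shape of such a strategy — a methodology hedge combined with topic terms, minus non-primary publication types — can be expressed as a small query builder. The terms below are illustrative stand-ins, not lines from the review's actual 33- to 77-line strategies:

```python
# Sketch only: a (much simplified) trials hedge ANDed with topic
# terms, with non-primary publication types excluded. All terms here
# are examples, not the published strategy.
def build_query(hedge_terms, topic_terms, exclude_types):
    hedge = " OR ".join(hedge_terms)
    topic = " OR ".join(topic_terms)
    exclude = " OR ".join(exclude_types)
    return f"(({hedge}) AND ({topic})) NOT ({exclude})"

query = build_query(
    hedge_terms=["randomized controlled trial[pt]",
                 "clinical trial[pt]",
                 "random*[tiab]"],
    topic_terms=["hypertension[mh]", "fetal monitoring[tiab]"],
    exclude_types=["letter[pt]", "editorial[pt]"],
)
print(query)
```

The real hedge is far larger, but the structure — sensitivity-maximizing methodology terms, topic restriction, and exclusion of letters and editorials — is the same.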

One final comment concerning the scientific integrity of search strategies is that both the Cochrane Collaboration and AHRQ require that all search strategies be a part of the appendixes of a published systematic review, so that others can replicate the search results. The realization that the search strategies will be published provides an extra impetus to ensure that the strategies are as encompassing and accurate as possible.

2. Locate and select the studies

Once the strategies were finalized, the librarian determined which bibliographic databases would be searched based upon the scope, date coverage, and subject content of the individual databases. Table 2 describes the databases that are always included in VERDICT systematic reviews.

Table 2 Key databases

MEDLINE is always searched first, because it is the oldest (1966) as well as the largest (over 11 million records) database, which ensures that it will yield the most citations. MEDLINE now includes OLDMEDLINE (1951–1965), which is searchable through multiple sources (e.g., Ovid, PubMed, NLM Gateway) and is valuable for identifying older citations. However, these citations fall short of current MEDLINE records in quality: their MeSH terms have never been updated, and they have no abstracts.

The expert searcher realizes that MEDLINE is merely a starting point and accepts the fact that the database cannot identify all of the relevant literature published on a given topic. The amount of duplication between MEDLINE and any other bibliographical database will depend on the scope and dates the database covers, the journals it indexes, and the indexing terms it employs. Therefore, the expert searcher can expect significant duplication (60%) between MEDLINE (clinical medicine) and CINAHL (nursing), not only because the scopes of the databases are similar, but also because they employ a substantial number of the same subject indexing terms. However, the duplication between MEDLINE and a specialized database, such as PsycINFO (psychology), will be far less because their scopes and their subject indexing terms are quite different. Table 3 describes the specialized bibliographical databases that the librarian searched for the pregnancy review based on their coverage of biomedical technology (e.g., fetal monitoring equipment).

Table 3 Biomedical technology databases
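The duplication estimates above amount to a simple set calculation over the citations each database returns. A sketch, with invented citation identifiers standing in for accession numbers:

```python
def overlap(ids_a, ids_b):
    """Fraction of the smaller result set that also appears in the other."""
    shared = set(ids_a) & set(ids_b)
    return len(shared) / min(len(set(ids_a)), len(set(ids_b)))

# Invented identifiers for illustration only.
medline = {"c1", "c2", "c3", "c4", "c5"}
cinahl = {"c2", "c3", "c4", "x8", "x9"}
print(f"{overlap(medline, cinahl):.0%}")  # → 60%
```

In practice the matching is harder than comparing identifiers, because each database assigns its own accession numbers; the de-duplication work described later in this article falls to the citation management software and the librarian.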

The bibliographic databases used in the pregnancy review all shared certain structural similarities. The most obvious of these was that all fifteen databases could be searched by database index terms (author, subject, publication type), by text-word, or by keyword. Keyword searching is generally too broad for systematic searching, because it searches every field in the database record; text-word searching, which covers only the title and abstract fields, is more precise. Text-word searching is valuable for finding concepts not yet listed in the subject index, for catching synonyms (e.g., “cancer” as a popular term for “neoplasms”) of an established subject term, and for accounting for possible spelling variations [21].

Each of the databases also employs a controlled vocabulary (thesaurus), the best known being NLM's MeSH, which was developed for MEDLINE to overcome the variations found in medical terminology. Subject term searching is used for searches that require very specific results. The menu options of either adding subheadings to the subject term or making the subject term the primary focus of the article can be used to further refine the search results. However, these options must be used with great care in a systematic review, because they can eliminate potentially relevant citations by substantially restricting the yield. Publication types are index terms that identify the format (e.g., RCT, clinical trial, journal article, letter, editorial) of the citation. These types are used in the strategy to identify clinical trials as well as to eliminate citations containing non-primary data (e.g., letters, editorials) or preliminary data (e.g., case studies). The expert searcher uses a combination of subject terms, publication types, and text-words to spread the widest possible net that will still yield valid results.

The major difficulty in searching multiple databases is that the searcher must have the expertise to know the different access procedures and searching syntax required by each database. These procedures vary depending on the particular database vendor. A classic example of a difference among vendors is the truncation symbol, which is indicated by a question mark (?) in DIALOG, an asterisk (*) in NLM, and a dollar sign ($) in Ovid. A particularly vexing variation among databases is the differences in capitalization and punctuation required to perform command syntax searching. Using the author syntax as an example: Ovid's MEDLINE requires “.au.” following the author's name, PubMed requires “[au]”, while the Cochrane Library requires “:au”. Each database also has its own unique subject term index (thesaurus), its own rules for searching (syntax), and its own style for data presentation (the search and record screens) [21]. This knowledge of database content, indexing, and record format and the appropriateness of one electronic interface over another is cited in the MLA policy on expert searching [14].
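The vendor variations described above lend themselves to a lookup table. The sketch below records only the syntax forms named in this article (the DIALOG author form, for instance, is omitted because it is not given here); it is illustrative, not a complete vendor reference:

```python
# Truncation symbols and author-search syntax as described in the text.
# Vendors not listed would raise a KeyError.
TRUNCATION = {"dialog": "?", "nlm": "*", "ovid": "$"}
AUTHOR = {
    "ovid": "{name}.au.",
    "pubmed": "{name}[au]",
    "cochrane": "{name}:au",
}

def author_query(vendor, name):
    """Format an author search in the syntax of the given vendor."""
    return AUTHOR[vendor].format(name=name)

print(author_query("ovid", "harris mr"))    # → harris mr.au.
print(author_query("pubmed", "harris mr"))  # → harris mr[au]
```

The point of the table is the point of the paragraph: the same logical search must be re-expressed for every interface, which is precisely the expertise the MLA policy statement describes.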

Because a systematic review seeks information from all available sources, not just from journal articles, the librarian also searched ten Web-based sources that dealt with risks to pregnancy. Table 4 provides a sample of these Web sources.

Table 4 Web sources

For each systematic review, the VERDICT Investigators also carefully examine the bibliographies of key articles to glean (“pearl”) older published literature (the oldest pregnancy article was published in 1947), gray literature [22], and unpublished information. The ability to identify, search, and retrieve resources beyond electronically available bibliographical databases is cited in the MLA policy statement on expert searching [14].

All of the citations resulting from the database searches, Web searches, bibliographies, and additional sources were downloaded whenever possible or hand-entered into ProCite, a citation management program produced by Thomson ISI ResearchSoft. The librarian preferred to use ProCite as the central repository for the review, because each ProCite database can handle up to 100,000 records. Moreover, the ProCite search engine and global edit functions can be used in all forty-five record fields, which was invaluable for recording screening results and other necessary review statistics. Using any citation management software in such a project, however, requires great organizational skills as well as constant attention to detail to maintain the quality and integrity of the database.

3. Assess study quality

In the pregnancy review, the fifteen bibliographic database searches, plus the citations located through the Web-based sources, gray literature, and pearled article bibliographies, resulted in a total of 6,228 records, which the librarian recorded in ProCite. Of these, 650 were flagged as “duplicate,” a designation indicating that the citation had been retrieved multiple times across the database searches. These records were removed from the search results, leaving 5,578 citations to be screened.
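The de-duplication step can be sketched as follows. Real citation managers match on several fields; here a normalized title-and-year key stands in for that logic, and the records are invented examples:

```python
def deduplicate(records):
    """Split records into unique entries and a duplicate count,
    keyed on normalized title plus publication year."""
    seen, unique, dupes = set(), [], 0
    for rec in records:
        key = (rec["title"].lower().strip(), rec["year"])
        if key in seen:
            dupes += 1
        else:
            seen.add(key)
            unique.append(rec)
    return unique, dupes

# Invented records: the same article retrieved from two databases.
records = [
    {"title": "Chronic hypertension in pregnancy", "year": 1998, "db": "MEDLINE"},
    {"title": "Chronic Hypertension in Pregnancy", "year": 1998, "db": "CINAHL"},
    {"title": "Fetal monitoring outcomes", "year": 1995, "db": "MEDLINE"},
]
unique, dupes = deduplicate(records)
print(len(unique), dupes)  # → 2 1
```

Keying on title and year alone would mis-merge distinct articles with identical titles, which is why the review recorded each citation's source database and accession number alongside the ProCite record.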

ProCite was used to create a separate screening sheet for each citation. Each screening sheet contained the ProCite record number, the database searched with its unique identifier or accession number, journal source information, available abstract, subject terms, and publication type. The screening sheet also contained separate sections where the VERDICT Investigators could record their initial screening decision (“not suitable” or “yes/pull”) and could indicate if the citation fell within their assigned subject areas (e.g., “Treatment,” “Monitor,” “Risk”).

The screening sheets were then distributed to the VERDICT Investigators, who returned the completed sheets so the librarian could record the screening results in ProCite. Following the initial screening, 4,201 records were judged as “no” (did not meet initial eligibility criteria), while 1,343 were judged as “yes/pull” (met initial eligibility criteria). The ability to remove irrelevant material from the search results and to effectively document the search and evaluation process is cited in the MLA policy statement on expert searching [14].

4. Collect the data

Once the initial screening decisions had been recorded, the 1,343 “yes/pull” citations were either pulled by the administration staff or requested through interlibrary loan by the librarian. Of the 205 interlibrary loan requests, most of them for overseas publications (e.g., China, France, England, Germany), 14 citations were ultimately declared “unobtainable” through either US or foreign sources.

The librarian ran periodic ProCite subject bibliographies, based on the subject areas the VERDICT Investigators had indicated on their screening sheets, to keep the investigators informed as to the current status (pull, interlibrary loan, rec'd) of their articles. The librarian then printed a full data cover sheet containing the ProCite record number (used later in the preparation of the manuscript), journal source information, abstract, subject terms, publication type, investigator subject, and article status on the front of each received article. An original copy of each received article was filed by the administration staff in a locked cabinet according to its ProCite record number, while two additional copies of each received article were distributed to assigned VERDICT Investigators for preliminary data abstraction.

5. Analyze and present the results

The quality and amount of evidence obtained through the data abstraction process determined the final status of the articles in the pregnancy review. The VERDICT Investigators excluded 1,128 abstracted articles because they failed to meet the eligibility criteria, while including only 215 studies for full data abstraction. The librarian once again recorded these final decisions for each reviewed study in ProCite.

6. Interpret the results

The librarian's role in the interpretation process of the review data was to tally the initial screening decisions and final abstraction results from ProCite to compile a flow diagram of the selection process. This diagram—reflecting the total number of records, duplicates, a breakdown of the screenings, and final abstraction results—is shown in Figure 1.

Figure 1
Flow diagram of selection process
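The tallying behind such a flow diagram reduces to counting the decision recorded against each citation. A minimal sketch with toy values (not the review's actual counts):

```python
from collections import Counter

# One recorded decision per citation; statuses follow the labels used
# in the review ("no", "yes/pull", "duplicate"). Values are invented.
decisions = ["yes/pull", "no", "no", "duplicate", "yes/pull", "no"]
tally = Counter(decisions)
print(tally["no"], tally["yes/pull"], tally["duplicate"])  # → 3 2 1
```

Because ProCite stored the decision in a searchable record field, the same counts could be regenerated at any point in the review, which is what made the flow diagram straightforward to compile.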

The librarian also wrote the “Sources and Search Methods” in the pregnancy review's “Methodology” section, which presented an overview of the number of databases and nonbibliographic sources searched, the total number of the citations, the languages of the citations, the screening results, and a brief description of the four search topics. The librarian also compiled a detailed chart that described the scope and coverage of each searched bibliographic database and Web source.

The librarian then used ProCite to scan for references (indicated by ProCite record numbers) in the pregnancy manuscript to generate the final bibliography. Because a systematic review includes all available sources, rather than just journal articles, the librarian had to know the specific bibliographic formats required to record technical reports, books, book chapters, gray literature, Web documents, and unpublished information, in addition to journal articles.

7. Improve and update the results

Because the VERDICT Investigators wanted to be sure that they had not missed any newly published information, the librarian ran a search in PubMed just prior to the final write-up of the review. The search had to be a simplified version, however, because PubMed could not handle a lengthy strategy. The search was further limited to text-words only, because new citations coming into PubMed directly from publishers had not yet received any MeSH indexing terms.

Because the Cochrane Collaboration requires that Cochrane reviews be updated on a yearly basis, the librarian periodically runs update searches to revise the investigators' published Cochrane reviews (e.g., the acute sinusitis review [7]). The Cochrane Library identifies such revised reviews in its database for systematic reviews by the statement “Date of most recent substantive amendment,” which appears just below the original publication information in the full-text record.

The completion of any quality systematic review requires a colossal team effort, with the results of the search strategy forming the core of the research findings. In the pregnancy review, the frequent and regular communication shared by the librarian and the VERDICT Investigators developed and refined comprehensive search strategies that successfully addressed all aspects of the investigators' clinical questions and that, to our knowledge, identified all potentially relevant journal citations. This same communication resulted in additional citations from non-database sources including Web documents, gray literature, unpublished information, and citations gleaned from pearling bibliographies.


Although the multiple roles of the librarian are important in all forms of medical research, they are crucial in a systematic review. In the role of an expert searcher, the librarian must possess an ability to interact with clinical investigators to identify the clinical questions and concepts required for the search. The librarian must have a solid knowledge of the process of developing a comprehensive search strategy containing recognized hedges to identify levels of evidence; knowledge of the subject content, date coverage, indexing conventions, and online record format of multiple databases; and knowledge of the appropriateness of individual databases to particular clinical questions. Because systematic reviews go beyond just journal articles, the librarian must have the ability to identify and search resources beyond electronically available published literature [14]. In the role of an organizer and analyzer, the librarian must possess highly developed skills to effectively manage the articles and to accurately document the search, retrieval, review, and archival processes [14].

Librarians involved with systematic reviews and other forms of evidence synthesis find particular satisfaction in knowing that their expertise directly contributes to the development of new treatment interventions and to the creation of new clinical practice guidelines based on the best evidence [11, 12]. They find additional satisfaction in knowing that as individual librarians develop a deeper understanding of relevant clinical issues and research methodologies, their investigators, in turn, develop an increased appreciation for their searching and organizational expertise.


* Presented in part at MLA '04, the 104th Annual Meeting of the Medical Library Association; Washington, DC; May 2004.


References

  1. Mead TL, Richards DT. Librarian participation in meta-analysis projects. Bull Med Libr Assoc. 1995 Oct;83(4):461–4.
  2. Dickersin K, Scherer R, Lefebvre C. Identifying relevant studies for systematic reviews. BMJ. 1994 Nov 12;309(6964):1286–91.
  3. Boynton J, Glanville J, McDaid D, Lefebvre C. Identifying systematic reviews in MEDLINE: developing an objective approach to search strategy design. J Inf Sci. 1998;24(3):137–57.
  4. White VJ, Glanville JM, Lefebvre C, Sheldon TA. A statistical approach to designing search filters to find systematic reviews: objectivity enhances accuracy. J Inf Sci. 2001;27:357–70.
  5. McGowan J. Systematic reviews need systematic searchers. J Med Libr Assoc. 2005 Jan;93(1):44–80.
  6. Linde K, Mulrow CD. St John's wort for depression (Cochrane Review). [Web document]. In: The Cochrane Library. 2004(3). Chichester, UK: John Wiley & Sons, 1998 Jul 9. [rev. 28 May 2003; cited 27 Aug 2004]. <http://www.update-software.com/clibng/cliblogon.htm>.
  7. Williams JW, Aguilar C, Makela M, Cornell J, Holleman DR, Chiquette E, Simel DL. Antibiotics for acute maxillary sinusitis (Cochrane Review). [Web document]. In: The Cochrane Library. 2004(3). Chichester, UK: John Wiley & Sons, 2001. [rev. 26 May 2002; cited 27 Aug 2004]. <http://www.update-software.com/clibng/cliblogon.htm>.
  8. Mulrow CD, Chiquette E, Ferrer RL, Sibai BM, Stevens KR, Harris M, Montgomery KA, Stamm K. Management of chronic hypertension during pregnancy. Rockville, MD: Agency for Healthcare Research and Quality, 2000 Aug. (Evidence report/technology assessment, no. 14.)
  9. Mulrow CD, Ramirez G, Cornell JE, Allsup K. Defining and managing chronic fatigue syndrome. Rockville, MD: Agency for Healthcare Research and Quality, 2001 Oct. (Evidence report/technology assessment, no. 42.)
  10. Sackett DL, Rosenberg WMC, Gray JAM, Haynes RB, Richardson WS. Evidence based medicine: what it is and what it isn't. BMJ. 1996 Jan;312(7023):71–2.
  11. Light K. The Cochrane Library: self-training guide and notes. [Web document]. York, UK: Centre for Reviews and Dissemination, University of York, 2003 Nov. [cited 27 Aug 2004]. <http://www.york.ac.uk/inst/crd/clibsec1.pdf>.
  12. Cook DJ, Mulrow CD, Haynes RB. Systematic reviews: synthesis of best evidence for clinical decisions. Ann Intern Med. 1997 Mar;126(5):376–80.
  13. Alderson P, Green S, Higgins JPT, eds. Cochrane reviewers' handbook. 4.2.2 vers. [Web document]. [rev. Dec 2003; cited 27 Aug 2004]. <http://www.cochrane.dk/cochrane/handbook/hbook.htm>.
  14. Medical Library Association. Medical Library Association policy statement: role of expert searching in health sciences libraries. [Web document]. Chicago, IL: The Association, 2003. [rev. 3 Sep 2003; cited 27 Aug 2004]. <http://www.mlanet.org/pdf/expert_search/policy_expert_search.pdf>.
  15. Mulrow CD, Oxman A. How to conduct a Cochrane systematic review. 3.0.2 ed. San Antonio, TX: Cochrane Collaboration, 1997.
  16. Evans D. Hierarchy of evidence: a framework for ranking evidence evaluating healthcare interventions. J Clin Nurs. 2003 Jan;12(1):77–84.
  17. Brighton B, Bhandari M, Tornetta P, Felson DT. Hierarchy of evidence: from case reports to randomized controlled trials. Clin Orthop. 2003 Aug;(413):19–24.
  18. Shojania KG, Bero LA. Taking advantage of the explosion of systematic reviews: an efficient MEDLINE search strategy. Eff Clin Pract. 2001 Jul–Aug;4(4):157–62.
  19. National Library of Medicine. PubMed: MeSH database. [Web document]. Bethesda, MD: The Library. [cited 27 Aug 2004]. <http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=mesh&list_uids=68016442&dopt=Full>.
  20. National Library of Medicine. Search strategy used to create the systematic reviews subset on PubMed. [Web document]. Bethesda, MD: The Library. [1 Dec 2003; cited 27 Aug 2004]. <http://www.nlm.nih.gov/bsd/pubmed_subsets/sysreviews_strategy.html>.
  21. Harris MR. Searching for evidence in perioperative nursing. Semin Perioper Nurs. 2000 Jul;9(3):105–14.
  22. Mathews BS. Gray literature: resources for locating unpublished research. C&RL News. 2004 Mar;65(3). [cited 27 Aug 2004]. <http://www.ala.org/ala/acrl/acrlpubs/crlnews/backissues2004/march04/graylit.htm>.
