Journal of the Medical Library Association: JMLA. 2004 Jan;92(1):83–90.

Inventory of research methods for librarianship and informatics

Jonathan D Eldredge
PMCID: PMC314107; PMID: 14762467

Abstract

This article defines and describes the rich variety of research designs found in librarianship and informatics practice. Familiarity with the range of methods and the ability to make distinctions between those specific methods can enable authors to label their research reports correctly. The author has compiled an inventory of methods from a variety of disciplines, but with attention to the relevant applications of a methodology to the field of librarianship. Each entry in the inventory includes a definition and description for the particular research method. Some entries include references to resource material and examples.

Librarians use a variety of research methods to make decisions and to improve performance. Research can be broadly defined as the “careful, systematic, patient study and investigation in some field of knowledge, undertaken to discover or establish facts or principles” [1]. This article defines and describes the rich variety of research designs found in librarianship and informatics practice. Familiarity with the range of methods and the ability to make distinctions between those specific methods can enable authors to label their research reports correctly. The author has served as a judge for the Medical Library Association (MLA) Research Section Award competition nearly every year since its inception in 1996, has served two terms as chair of the MLA Research Section, has reviewed abstracts for poster and paper submissions to MLA annual meetings, and has conducted an extensive handsearch review of the health sciences library literature [2]. These experiences have revealed that: (1) many authors of research reports do not label their communications as “research,” even though their reports match the definition of research above; and (2) authors frequently mislabel the actual methods used in their research reports. These non-labeling and mislabeling practices cause potential confusion for colleagues searching for the evidence upon which they need to base important decisions.

The author has compiled the inventory in this article from a variety of disciplines but with attention to the relevant applications of a methodology to our own field. Every entry adheres to the aforementioned broad definition of research. Most entries in this article offer resources and noteworthy examples to facilitate the research reporting process. All methodologies share the common purpose of answering pragmatic questions about how we can make decisions to improve our practice, a fundamental goal of evidence-based librarianship and informatics practice.

Traditionally, health sciences library and informatics research has relied heavily upon case study, program evaluation, and survey research methodologies to answer important questions [3–19]. The situation appears to be changing dramatically. During the past decade, our profession has branched out into using experimental, observational, and qualitative methodologies. This article reflects the wide range of methodologies that are available now to health sciences librarians and informaticists.

The extent to which an effort adheres to the goals of open inquiry, validity, reliability, and reduction of biases often delineates whether we label an activity as research. Some methods listed in this inventory admittedly are more effective than others at reducing bias while still complying with the goals of validity and reliability [20, 21]. Unfortunately, some research reports have relied upon erroneously conducted forms of the case study method or have broadly interpreted their investigation as a “program evaluation,” reporting results as transparent forms of self-congratulation rather than offering genuine reflection or insight into what might accurately be learned from an experience. In this connection, Losee and Worley note that “There is a tendency among information professionals to write and publish in the ‘How I done it good’ genre” [22]. Often, such reports simply need to incorporate valid and reliable measures to overcome these deficiencies. In addition, these genres of research reports need to include descriptions of both successes and failures to establish credibility. The authors of overly boastful case studies or program evaluations need to identify and isolate any promotional claims, reserving such elements for other venues intended to justify their programs' existence to funding agencies. Authors of research reports should always offer balanced accounts in the presentation of their results.

Each entry in this article includes a definition and description for each inventory item. In addition, some entries include: “Resource,” a guide for readers on conducting studies using the specific method; and “Example,” an illustration of the method in practice.

ANALYSIS

Analysis refers very generally to “a detailed examination of anything complex . . . made in order to understand its nature or to determine its essential features” [23]. Sometimes “analysis” refers to a variation of this process as viewed through the perspective of a certain philosophy or ideology. In other instances, an analysis searches for meaningful patterns or trends.

AUDIT

In our field, this term often refers to a management, marketing, or quality audit. The management audit refers to reviewing multiple variables in the performance of either an organization or a department within a larger organization to identify both strengths and weaknesses according to strict criteria formulated in advance. “The audit design takes a written statement about what people should do, such as a protocol or plan, and compares it to what they actually do” [24]. Unfortunately, the audit has been misused in the past as a means to harass or to force certain personnel from their positions, which has led many to form a negative association with this term. The goal of the audit always should be to increase efficiencies and improve overall performance.

Example

  • Wakeley PJ, Poole C, Foster EC. The marketing audit: a new perspective on library services and products. Bull Med Libr Assoc 1988 Oct;76(4):323–7.

AUTOBIOGRAPHY

A biography written by the individual himself or herself is an autobiography. Some autobiographies follow themes from the author's life, while others focus only upon a segment of, or an episode from, the individual's life experience.

Example

  • Braude RM. A medical librarian's progress. Bull Med Libr Assoc 1998 Apr;86(2):157–65.

BIBLIOMINING

See “Data Mining.”

BIOGRAPHY

A narrative account of a noteworthy individual's life constitutes a biography. A biography might focus upon the entire span of the subject's life, follow a thematic thread, or focus on a segment or perhaps even a single episode in the life experience of the subject.

Examples

  • Fulton J. Holly Shipp Buchanan, president, Medical Library Association 1987/88. Bull Med Libr Assoc 1987 Jul;75(3):264–7.

  • Poland UH. Erika Love, president, Medical Library Association 1978/1979. Bull Med Libr Assoc 1978 Jul;66(3):357–9.

  • Robinson JG. Linda A. Watson, Medical Library Association president, 2002–2003. J Med Libr Assoc 2002 Jul;90(3):345–8. <http://www.pubmedcentral.nih.gov/tocrender.fcgi?action=archive&journal=93>.

CASE STUDY

The case study represents one of the most popular research methods, not only in our own field but also in the social, policy, and management sciences. In our field, the case study describes and analyzes the author's experiences with a process, group, innovation, technology, project, population, program, or organization. Yin defines a case study as an inquiry that investigates a contemporary phenomenon in its real-life context, when the boundaries between the phenomenon and its context are not well understood, and that draws upon multiple sources of evidence. The case study has been widely used to answer questions of how or why events occurred as reported. A well-conducted case study should explicitly state, prior to beginning the research: the questions posed in search of answers, any propositions, the unit or units of analysis, the logic for linking data to any propositions, and the criteria for interpreting the findings [25]. Many criticisms of case studies have centered on the unbalanced reporting styles of authors who depict an experience in an overly negative or positive light. Even the most laudatory case studies should include negative outcomes as “lessons learned” to lend greater balance to the reporting style.

Resource

  • Yin RK. Case study research: design and methods. Newbury Park, CA: Sage Publications, 1989.

Examples

  • Ellis LS. The establishment of an academic health sciences library in a developing country: a case study. Bull Med Libr Assoc 1991 Jul;79(3):295–301.

  • Tennant MR, Miyamoto MM. The role of medical libraries in undergraduate education: a case study in genetics. J Med Libr Assoc 2002 Apr;90(2):181–93. <http://www.pubmedcentral.nih.gov/tocrender.fcgi?action=archive&journal=93>.

CITATION ANALYSIS

See “Descriptive Survey.”

COHORT DESIGN

The cohort design has been used extensively in librarianship, particularly in the collection resources development and library or informatics instruction specialties. The cohort design in the form of a book, journal, or Website use study has frequently been used to assess past performance and to predict future use patterns [26]. Yet, authors rarely use the label “cohort” when describing their methods. A cohort study essentially tracks, over time, a defined population sharing a set of common characteristics as it encounters an intended or unintended exposure to a phenomenon, observing any subsequent change in the population putatively brought about by that exposure. A cohort of students can be assessed in their information literacy, exposed to library or informatics instruction, then assessed again afterward for any improved knowledge or skills. A book usage study, as another example, follows a population with access to a book collection (exposure) over time to determine changes in the population in the form of usage. When a cohort study begins to collect relevant observable data prior to the exposure of a population, researchers define it as a “prospective” cohort study. When the study begins to collect data following the exposure, researchers refer to it as a “retrospective” cohort study. When multiple measurements are taken at regular intervals within the cohort study, researchers refer to it as a “longitudinal” cohort study [27].
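As a minimal sketch of the prospective cohort logic above, the following Python fragment computes the mean change in information-literacy scores for a hypothetical student cohort assessed before and after instruction; all identifiers and numbers are invented for illustration.

    # Hypothetical prospective cohort: assess students before exposure (instruction),
    # reassess afterward, and report the mean observed change. Data are invented.
    pre_scores  = {"s01": 54, "s02": 61, "s03": 47, "s04": 70, "s05": 58}
    post_scores = {"s01": 66, "s02": 65, "s03": 60, "s04": 74, "s05": 71}

    changes = [post_scores[s] - pre_scores[s] for s in pre_scores]
    mean_change = sum(changes) / len(changes)
    print(f"Mean change after instruction: {mean_change:+.1f} points (n={len(changes)})")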

Examples

  • Blecic D. Monograph use at an academic health sciences library: the first three years of shelf life. Bull Med Libr Assoc 2000 Apr;88(2):145–51. <http://www.pubmedcentral.nih.gov/tocrender.fcgi?action=archive&journal=72>.

  • Marshall JG, Fitzgerald D, Busby L, Heaton G. A study of library use in problem-based and traditional medical curricula. Bull Med Libr Assoc 1993 Jul;81(3):299–305.

COMPARATIVE STUDY

A comparative study consists of any systematic effort to find similarities and differences between two or more observed phenomena. This broad label encompasses a number of more specific research methods. Generally speaking, a comparative study in the social sciences identifies the common elements in two or more phenomena in search of distinct variables that explain their differences [28]. The book Summing Up might be helpful to some researchers trying to compare multiple studies [29]. In our field, comparisons and contrasts normally are used to evaluate performance across different projects or resources.

Examples

  • Stone VL, Fishman DL, Frese DB. Searching online and Web-based resources for information on natural products used as drugs. Bull Med Libr Assoc 1998 Oct;86(4):523–7.

  • Hallett KS. Separate but equal? a system comparison study of MEDLINE's controlled vocabulary MeSH. Bull Med Libr Assoc 1998 Oct;86(4):491–5.

CONTENT ANALYSIS

Content analysis maps nonnumerical artifacts such as text into a matrix of statistically manipulated symbols [30, 31]. “By means of content analysis a large body of qualitative information may be reduced to a smaller and more manageable form of representation” [32]. For example, content analysis might be used to analyze political discourse to identify the number of times and in what contexts speakers use a term such as “freedom.” Political scientists might then speculate on the motives or shared meanings such terms as “freedom” can invoke. A clever use of content analysis (referenced below as an example) appeared on a poster at MLA '03, when two researchers mapped the associations between clothing styles and librarians on eBay as a method for tracking the image of librarians.
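A minimal Python sketch of the counting step, assuming two invented speech transcripts and the coded term “freedom”; a real content analysis would add a coding scheme and reliability checks.

    # Reduce invented text to a frequency count per speaker for one coded term,
    # and capture keyword-in-context snippets for later qualitative coding.
    import re

    speeches = {
        "speaker_a": "Freedom of inquiry and freedom of access define our mission.",
        "speaker_b": "Budget constraints limit access, but freedom to learn endures.",
    }

    counts = {}
    contexts = []
    for speaker, text in speeches.items():
        hits = list(re.finditer("freedom", text, flags=re.IGNORECASE))
        counts[speaker] = len(hits)
        for m in hits:
            contexts.append((speaker, text[max(0, m.start() - 20):m.end() + 20]))

    print(counts)        # e.g. {'speaker_a': 2, 'speaker_b': 1}
    for row in contexts:
        print(row)       # context windows around each occurrence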

Example

  • Gilbert C. MLA papers and posters win awards. Hypothesis 2003 Summer;17(2):1,5–6. <http://research.mlanet.org>.

DATA MINING

Data mining involves the “discovery of meaningful patterns from low-level data using automated methods such as statistical or artificial intelligence tools. The data mining and data warehousing process for libraries is known as bibliomining” [33].
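As a toy illustration of discovering patterns in low-level library data (offered as a sketch, not a method prescribed by the article), this Python fragment counts subject pairs that co-occur in invented, anonymized circulation transactions:

    # Bibliomining-style pass over invented circulation data: find subject pairs
    # borrowed together in at least half of the transactions.
    from collections import Counter
    from itertools import combinations

    transactions = [
        {"anatomy", "physiology"},
        {"anatomy", "physiology", "pharmacology"},
        {"statistics", "epidemiology"},
        {"epidemiology", "statistics", "pharmacology"},
    ]

    pair_counts = Counter()
    for items in transactions:
        for pair in combinations(sorted(items), 2):
            pair_counts[pair] += 1

    threshold = len(transactions) / 2
    for pair, n in pair_counts.most_common():
        if n >= threshold:
            print(pair, n)    # frequently co-borrowed subject pairs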

Resources

  • Nicholson S. The bibliomining process: seeking behavioral patterns for library management using data mining. Paper presented at: Improving Practice Through Research: Evidence Based Librarianship 2003 International Conference; Edmonton, AB, Canada; June 2003.

  • Nicholson S. Bibliomining. [Web document]. [rev 3 Sep 2003; cited 3 Oct 2003]. <http://www.bibliomining.com>.

Example

  • Kostoff RN, del Río JA, Humenik JA, Garcia EO, Ramírez AM. Citation mining: integrating text mining and bibliometrics for research user profiling. J Am Soc Info Sci Tech 2001 Nov;52(13):1148–56.

DELPHI METHOD

The Delphi method seeks to assist a group to make a desired, consensus-based decision. It relies upon the anonymity of the participants' responses to questions over a succession of iterations to reach a quantitative group decision and normally involves well-informed individuals or experts on a subject of interest [34]. This method enables the facilitator to help the group avoid “groupthink” and the dominance of certain members with an agenda [35]. Kirkwood et al. used the Delphi method to identify the applied research questions of greatest importance to nurses in Scotland. In descending order of priority, these questions concerned recruitment and retention of quality personnel, handling of personnel stress and morale issues, training and continuing education of personnel, and infection control [36].
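A toy Python simulation of the iterative logic, assuming an invented five-member panel rating one candidate priority on a 1–9 scale; the revision rule and the consensus threshold (interquartile range of 1 or less) are illustrative assumptions, not part of the method's formal definition.

    # Anonymous panelists rate a priority; after each round the group median is fed
    # back and each panelist revises toward it until the ratings converge.
    import statistics

    ratings = [2.0, 9.0, 5.0, 7.0, 3.0]    # invented round-1 ratings (1-9 scale)

    for round_no in range(1, 6):
        median = statistics.median(ratings)
        q1, _, q3 = statistics.quantiles(ratings, n=4)
        print(f"Round {round_no}: median={median:.1f}, IQR={q3 - q1:.2f}")
        if q3 - q1 <= 1:                    # assumed consensus threshold
            break
        ratings = [r + 0.5 * (median - r) for r in ratings]   # anonymous revision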

Example

  • Kirkwood M, Wales A, Wilson A. A delphi study to determine nursing research priorities in the North Glasgow University Hospitals NHS Trust and the corresponding evidence base. Health Inform Libr J 2003 Jun;20(Suppl 1):53–8.

DESCRIPTIVE SURVEY

Surveys can be employed as part of a larger observational or experimental methodology such as a cohort study or a randomized controlled trial. A descriptive survey, by contrast, typically seeks to ascertain respondents' perspectives or experiences on a specified subject in a predetermined structured manner. Citation analysis represents a variation of the descriptive survey method.
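Because citation analysis is a variation of this method, a minimal Python tally (over invented citation records) illustrates the descriptive step of characterizing where a faculty's cited literature appears:

    # Descriptive tally of invented citation records: which journals do a set of
    # faculty publications cite, and how often?
    from collections import Counter

    cited_journals = [
        "Bull Med Libr Assoc", "JAMA", "Bull Med Libr Assoc",
        "N Engl J Med", "JAMA", "Bull Med Libr Assoc",
    ]

    tally = Counter(cited_journals)
    total = sum(tally.values())
    for journal, n in tally.most_common():
        print(f"{journal}: {n} citations ({100 * n / total:.0f}%)")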

Resource

  • Fink A. The survey kit. (9 volumes). Thousand Oaks, CA: Sage Publications, 1995.

Examples

  • Association of Academic Health Sciences Libraries. Annual statistics of medical school libraries in the United States and Canada. 25th ed. Seattle, WA: Association of Academic Health Sciences Libraries, 2003.

  • Reed KL. Citation analysis of faculty publication: beyond Science Citation Index and Social Science Citation Index. Bull Med Libr Assoc 1995 Oct;83(4):503–8.

FOCUS GROUP

The focus group method generates “data or information, within the small group setting, which, when analyzed, can help in: planning; making decisions; evaluating programs, products, or services; developing models or theories; enriching the findings from other research methods; and constructing questionnaires for further data gathering.” Focus groups “gather data on the opinions, knowledge, perceptions and concerns of small groups of individuals about a particular topic.” This method also “encourages people to express their views in a way that other methods cannot” [37].

Resource

  • Glitz B. Focus groups for libraries and librarians. New York, NY: Medical Library Association and Forbes, 1998.

GAP ANALYSIS

Gap analysis involves surveys that seek to detect discrepancies, or gaps, between customer expectations of an organization and that organization's ability to deliver on those expectations. Gap analysis was pioneered in the private sector as the SERVQUAL™ instrument [38], which librarians at the University of Texas Southwestern Medical Center Library in Dallas later adapted [39]. Because the original SERVQUAL instrument had limited ability to identify meaningful gaps, it has since been adapted further into LibQUAL™, administered by the Association of Research Libraries [40].
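The underlying arithmetic is simple: for each service dimension, the gap is the mean perception rating minus the mean expectation rating, with negative gaps flagging unmet expectations. A minimal Python sketch with invented dimensions and ratings:

    # SERVQUAL-style gap scores over invented survey means (7-point scale).
    expectations = {"reliability": 6.5, "responsiveness": 6.2, "empathy": 5.8}
    perceptions  = {"reliability": 5.9, "responsiveness": 6.4, "empathy": 5.1}

    for dim in expectations:
        gap = perceptions[dim] - expectations[dim]
        print(f"{dim}: gap {gap:+.1f} ({'unmet' if gap < 0 else 'met'})")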

HISTORY

This method seeks to recreate “a real past as it had actually occurred although there are a number of schools of thought among historians that dispute this goal” [41]. Historical research attempts to reveal cause and effect relationships between events. Any evidence related to the hypotheses that the researcher has gathered and presented should be balanced and credible [42].

Example

  • Braude RM. The Research Section of the MLA: the first fifteen years 1982–1997. Hypothesis 1998 Summer;12(2):9–16. <http://research.mlanet.org>.

LONGITUDINAL STUDY

See “Cohort Design.”

META-ANALYSIS

Meta-analysis allows reviewers to combine identical or comparable data sets from two or more studies examining the same research question, creating a larger pool of data that strengthens an overall conclusion. Rosenthal discusses the theoretical bases and limitations of meta-analysis in his standard methods book on the subject [43]. There are no known published meta-analyses central to our field. The increasing number of published cohort studies and randomized controlled trials, which would serve as the foundation for any meta-analysis, suggests that one will appear within the next few years.
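To make the pooling step concrete, one common rule (the fixed-effect, inverse-variance approach treated in the resources below, offered here only as an illustration) weights each of the k study-level estimates by the inverse of its variance:

    \hat{\theta} = \frac{\sum_{i=1}^{k} w_i \hat{\theta}_i}{\sum_{i=1}^{k} w_i},
    \qquad w_i = \frac{1}{\widehat{\mathrm{Var}}(\hat{\theta}_i)}

where \hat{\theta}_i is the effect estimate from study i, so that larger, more precise studies pull the pooled estimate \hat{\theta} toward their results.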

Resources

  • Glass GV, McGaw B, Smith ML. Meta-analysis in social research. Newbury Park, CA: Sage Publications, 1981.

  • Petitti DB. Meta-analysis, decision analysis, and cost-effectiveness analysis: methods for quantitative synthesis in medicine. New York, NY: Oxford University Press, 2000.

NARRATIVE REVIEW

For many years, what we have called a “review article” actually has meant a narrative review. This type of review consists of an expert conducting a literature review on a broadly defined subject and then writing an introductory overview of the subject, usually followed by a description of current research or controversies at the boundaries of what is understood about the subject. The narrative review has been criticized in recent years for its subjectivity and its frequent lack of a complete scientific basis [44, 45]. Nevertheless, narrative reviews frequently offer readers concise introductions to broad subjects.

PARTICIPANT OBSERVATION

Participant observation “involves the active engagement of the researcher with the members of the community that he or she wishes to study, typically as an equal member of the group” [46]. The actual methods and extent of involvement in the community vary widely [47]. “The balance between participation and observation varies, depending on the researcher and the site. The goal of the research is to understand the situation from the perspective of the participants” [48].

Resource

  • Glesne C. Becoming qualitative researchers. 2nd ed. New York, NY: Longman, 1999:43–66.

PROGRAM EVALUATION

Program evaluation occurs on a daily basis in our profession with varying degrees of rigor. Weiss defines program evaluation as the “systematic assessment of the operation and/or the outcomes of a program or policy, compared to explicit or implicit standards, in order to help improve the program or policy” [49]. Program evaluation can be conceptualized along a continuum from its formative type to its summative type. Formative evaluation assesses a program as it evolves. Summative evaluation focuses on outcomes at the end of a program, or at another critical juncture, to determine its future direction. Some have argued that true program evaluation offers a “way of gathering comparative information so that results from the program being evaluated can be placed within a context for judgment of their size and worth . . . helping the evaluator to predict how things might have been had the program not occurred or if some other program had occurred instead” [50].

Resources

  • Burroughs CM, Wood FB. Measuring the difference: guide to planning and evaluating health information outreach. Seattle, WA: National Network of Libraries of Medicine Pacific Northwest Region; Bethesda, MD: National Library of Medicine, 2000.

  • Joint Committee on Standards for Educational Evaluation. The program evaluation standards: how to assess evaluation of educational programs. Thousand Oaks, CA: Sage Publications, 1994.

RANDOMIZED CONTROLLED TRIAL

The randomized controlled trial (RCT) begins with a carefully defined and assembled population whose members must comply with predetermined inclusion and exclusion criteria. The population is then divided randomly into a control group, which receives either the standard treatment or no treatment at all, and one or more intervention groups [51]. A population might consist of regular users of a library and informatics center at a university or hospital, excluding members of the general public. The control group might continue to have access to the library and informatics center Website, whereas the intervention group might be provided access to an experimental (possibly improved) version of the Website. The control and intervention groups would then be compared on how easily they found needed information with their respective versions of the Website.
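A minimal Python sketch of the allocation and comparison just described, with an invented population and simulated task-completion times; a real trial would add eligibility screening, blinding where feasible, and a formal significance test.

    # Randomize invented eligible users to control (current site) or intervention
    # (experimental site), then compare mean time to find a needed document.
    import random

    random.seed(42)
    eligible_users = [f"user{n:02d}" for n in range(1, 21)]  # meet inclusion criteria
    random.shuffle(eligible_users)
    control, intervention = eligible_users[:10], eligible_users[10:]

    times = {u: random.gauss(95, 10) for u in control}       # simulated seconds
    times.update({u: random.gauss(80, 10) for u in intervention})

    mean = lambda group: sum(times[u] for u in group) / len(group)
    print(f"control: {mean(control):.1f}s, intervention: {mean(intervention):.1f}s")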

Resource

  • Eldredge JD. The randomized controlled trial design: unrecognized opportunities for health sciences librarianship. Health Inform Libr J 2003 Jun;20(Suppl 1):34–44.

Examples

  • Bradley DR, Rana GK, Martin PW, Schumacher RE. Real-time, evidence-based medicine instruction: a randomized controlled trial in a neonatal intensive care unit. J Med Libr Assoc 2002 Apr;90(2):194–201. <http://www.pubmedcentral.nih.gov/tocrender.fcgi?action=archive&journal=93>.

  • Haynes RB, Ramsden MF, McKibbon KA, Walker CJ. Online access to MEDLINE in clinical settings: impact of user fees. Bull Med Libr Assoc 1991 Oct;79(4):377–81.

  • Marshall JG, Neufeld VR. A randomized trial of librarian educational participation in clinical settings. J Med Educ 1981 May;56(5):409–16.

  • Rosenberg WMC, Deeks J, Lusher A, Snowball R, Dooley G, Sackett D. Improving searching skills and evidence retrieval. J Royal College of Physicians of London 1998 Nov–Dec;32(6):557–63.

SUMMING UP

This entry actually refers to a cluster of methods described in the book Summing Up, which anticipated meta-analysis but addresses situations in which researchers cannot pool similar or identical data sets. Exploratory, qualitative-oriented research questions frequently require research based upon methodologies that meta-analysis cannot synthesize [52]. The book also offers numerous illustrative examples.

Resource

  • Light RJ, Pillemer DB. Summing up: the science of reviewing research. Cambridge, MA: Harvard University Press, 1984.

SYSTEMATIC REVIEWS

The systematic review often occupies the highest level of evidence due to its ability to minimize bias while integrating multiple research studies. “Systematic reviews are concise summaries of the best available evidence that address sharply defined clinical questions. . . . systematic reviews use explicit and rigorous methods to identify, critically appraise, and synthesize relevant studies. . . . [They are] scientific investigations in themselves, with pre-planned methods and an assembly of original studies as their ‘subjects’” [53]. Systematic reviews can synthesize quantitative or qualitative research studies. Once the rigorous, scientifically based literature review has been completed, secondary techniques for synthesizing the results can be found in Light and Pillemer's classic work Summing Up [54].

Resource

  • Mulrow C, Cook D, eds. Systematic reviews: synthesis of best evidence for healthcare decisions. Philadelphia, PA: American College of Physicians, 1998.

Examples

  • Brettle A. Information skills training: a systematic review of the literature. Health Inform Libr J 2003 Jun;20(Suppl 1):3–9.

  • Winning MA, Beverly CA. Clinical librarianship: a systematic review of the literature. Health Inform Libr J 2003 Jun;20(Suppl 1):10–21.

UNOBTRUSIVE OBSERVATION

This research method recognizes the possibility that people will behave differently when they know they are part of a research study (the Hawthorne Effect [55]) or are under the direct observation of a physically present researcher. “Unobtrusive research attempts to study human actions and preferences without the act of studying subjects causing them to change or misreport those actions or preferences” [56]. An entire research tradition involving unobtrusive observation of the accuracy and quality of reference services, most of it outside health sciences librarianship and informatics, has caused controversy in the past [57]. Some of these studies also have raised ethical concerns.

Resources

  • Allen B. Evaluation of reference services. In: Allen BL, ed. Reference and information services, an introduction. 3rd ed. Colorado Springs, CO: Libraries Unlimited, 2001:245–64.

  • Crews KD. The accuracy of reference service: variables for research and implementation. Libr Inf Sci Res 1988 Jul;10:331–55.

  • Losee RM, Worley KA. Research and evaluation for information professionals. San Diego, CA: Academic Press, 1993:147–50.

CONCLUSION

Health sciences librarians and informaticists are utilizing a wider array of research methods than in the recent past. This inventory reflects the expansion of research methodologies beyond the case study, program evaluation, and survey methods. It will be interesting to compile another inventory of research methods in a decade and to compare the changes with this 2004 article. Similarly, it will be useful to describe the distribution of research methods used by health sciences librarians and informaticists in ways resembling the past studies referenced earlier in this article. Regardless of the specific methodologies researchers might use, what no doubt will remain the same in the future is the need to match a research method appropriately to the question posed while remaining attentive to issues of validity, bias, and reliability.

Acknowledgments

The author thanks Scott Nicholson for his guidance on the data mining entry, Catherine Burroughs for her suggestions on the program evaluation entry, Michelynn McKnight for her advice on the participant observation method, and Carolyn Reid for her close reading of the final draft of the manuscript.

REFERENCES

  1. Goldman J, ed. Webster's new world dictionary. 3rd college ed. Springfield, MA: Webster's, 1992:1141.
  2. Eldredge JD. Evidence-based librarianship: an overview. Bull Med Libr Assoc 2000 Oct;88(4):289–302. <http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=35250>.
  3. Atkins SE. Subject trends in library and information science research, 1975–1984. Libr Trends 1988 Spring;36(4):633–58.
  4. Peritz BC. The methods of library science research: some results from a bibliometric survey. Libr Res 1980–81;2:251–68.
  5. Nour MM. A quantitative analysis of the research articles published in the core library journals of 1980. Libr Inf Sci Res 1985;7(3):261–73.
  6. Feehan PE, Gragg WL, Havener WM, Kester DD. Library and information science research: an analysis of the 1984 journal literature. Libr Inf Sci Res 1987;9(3):173–85.
  7. Enger KB, Quirk G, Stewart JA. Statistical methods used by authors of library and information science journal articles. Libr Inf Sci Res 1989;11(1):37–46.
  8. Jarvelin K, Vakkari P. Content analysis of research articles in library and information science. Libr Inf Sci Res 1990;12(4):395–421.
  9. Buttlar L. Analyzing the library periodical literature: content and authorship. Coll Res Libr 1991 Jan;52(1):38–53.
  10. Crawford GA. The research literature of academic librarianship: a comparison of College & Research Libraries and Journal of Academic Librarianship. Coll Res Libr 1999 May;60(3):224–30.
  11. Rochester M, Vakkari P. International LIS research: a comparison of national trends. IFLA J 1998;24(3):166–75.
  12. Cheng H. A bibliometric study of library and information research in China. Asian Libr 1996;5(2):30–45.
  13. Nkereuwem EE. Accrediting knowledge: the ranking of library and information science journals. Asian Libr 1997;6(1/2):71–6.
  14. Olorunisola R, Akinboro EO. Bibliographic analysis of articles: a study of African Journal of Library, Archives and Information Science, 1991–1997. Afri J Libr Arch & Inf Sci 1998;8(2):151–4.
  15. Dimitroff A. Research in health sciences library and information science: a quantitative analysis. Bull Med Libr Assoc 1992 Oct;80(4):340–6.
  16. Burdick AJ, Doms CA, Doty CC, Kinzie LA. Research activities among health sciences librarians: a survey. Bull Med Libr Assoc 1990 Oct;78(4):400–2.
  17. Haiqi Z. Analysing the research articles published in three periodicals of medical librarianship. Intl Inf Libr Rev 1995 Sep;27:237–48.
  18. Haiqi Z. A bibliometric study on articles of medical librarianship. Inf Process Manage 1995;31(4):499–510.
  19. Mularski CA, Bradigan PS. Academic health sciences librarians' publications patterns. Bull Med Libr Assoc 1991 Apr;79(2):168–77.
  20. Cook TD, Campbell DT. Quasi-experimentation: design and analysis issues for field settings. Boston, MA: Houghton Mifflin Company, 1979:37–94.
  21. Eldredge J. Evidence-based librarianship: levels of evidence. Hypothesis 2002 Fall;16(3):10–13. <http://research.mlanet.org>.
  22. Losee RM, Worley KA. Research and evaluation for information professionals. San Diego, CA: Academic Press, 1993:ix.
  23. Gove PB. Webster's third new international dictionary of the English language unabridged. Springfield, MA: Merriam-Webster, 1997:77.
  24. Øvretveit J, Gustafson D. Using research to inform quality programmes. BMJ 2003 Apr 5;326(7392):759–61.
  25. Yin RK. Case study research: design and methods. Newbury Park, CA: Sage Publications, 1989.
  26. Eldredge J. Evidence-based librarianship: levels of evidence. Hypothesis 2002 Fall;16(3):10–13. <http://research.mlanet.org>.
  27. Eldredge JD. Cohort studies in health sciences librarianship. J Med Libr Assoc 2002 Oct;90(4):380–92. <http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=128954>.
  28. Berg-Schlosser D. Comparative studies: method and design. In: Smelser NJ, Baltes PB, eds. International encyclopedia of the social & behavioral sciences. New York, NY: Elsevier, 2001:2427–33.
  29. Light RJ, Pillemer DB. Summing up: the science of reviewing research. Cambridge, MA: Harvard University Press, 1984.
  30. Roberts CW. Content analysis. In: Smelser NJ, Baltes PB, eds. International encyclopedia of the social & behavioral sciences. v.4. New York, NY: Elsevier, 2001:2697–2702.
  31. Powell RR. Basic research methods for librarians. 3rd ed. Greenwich, CT: Ablex Publishing, 1997:50.
  32. Smith CP. Content analysis and narrative analysis. In: Reis HT, Judd CM, eds. Handbook of research methods in social and personality psychology. New York, NY: Cambridge University Press, 2000:313–35.
  33. Nicholson S (assistant professor, School of Information Studies, Syracuse University). Personal email communication with the author, 15 June 2003.
  34. Kerr NL, Aronoff J, Messé LA. Methods of small group research. In: Reis HT, Judd CM, eds. Handbook of research methods in social and personality psychology. New York, NY: Cambridge University Press, 2000:160–88.
  35. Wortman PM. Consensus panels methodology. In: Smelser NJ, Baltes PB, eds. International encyclopedia of the social & behavioral sciences. v.4. New York, NY: Elsevier, 2001:2609–13.
  36. Kirkwood M, Wales A, Wilson A. A delphi study to determine nursing research priorities in the North Glasgow University Hospitals NHS Trust and the corresponding evidence base. Health Inform Libr J 2003 Jun;20(Suppl 1):53–8.
  37. Glitz B. Focus groups for libraries and librarians. New York, NY: Medical Library Association and Forbes, 1998:1.
  38. Parasuraman A, Zeithaml VA, Berry LL. SERVQUAL: a multiple-item scale for measuring consumer perceptions of service quality. J Retailing 1988 Spring;64(1):12–40.
  39. Eldredge J. First annual SCC Research Award. Hypothesis 1997 Fall;11(3):3. <http://research.mlanet.org>.
  40. Cook C, Heath FM, Thompson B. LibQUAL+: one instrument in the new measures toolbox. ARL 2000 Oct;212:4–7.
  41. Iggers GG. Historiography and historical thought: current trends. In: Smelser NJ, Baltes PB, eds. International encyclopedia of the social & behavioral sciences. v.10. New York, NY: Elsevier, 2001:6771–6.
  42. Losee RM, Worley KA. Research and evaluation for information professionals. San Diego, CA: Academic Press, 1993:155–8.
  43. Rosenthal R. Meta-analytic procedures for social research. rev ed. Newbury Park, CA: Sage Publications, 1991.
  44. Mulrow C, Cook D, eds. Systematic reviews: synthesis of best evidence for healthcare decisions. Philadelphia, PA: American College of Physicians, 1998.
  45. Glass GV, McGaw B, Smith ML. Meta-analysis in social research. Newbury Park, CA: Sage Publications, 1981:22–3.
  46. Fine GA. Participant observation. In: Smelser NJ, Baltes PB, eds. International encyclopedia of the social & behavioral sciences. v.16. New York, NY: Elsevier, 2001:11073–8.
  47. Streubert HJ, Carpenter DR. Qualitative research in nursing: advancing the humanistic imperative. 2nd ed. Philadelphia, PA: Lippincott, 1999:25–6,156–7.
  48. Conrad P. Health research, qualitative. In: Smelser NJ, Baltes PB, eds. International encyclopedia of the social & behavioral sciences. v.10. New York, NY: Elsevier, 2001:6608–12.
  49. Weiss CH. Evaluation: methods for studying programs and policies. 2nd ed. Upper Saddle River, NJ: Prentice Hall, 1998:18.
  50. Fitz-Gibbon CT, Morris LL. How to design a program evaluation. 2nd ed. Newbury Park, CA: Sage Publications, 1987:9.
  51. Eldredge JD. The randomized controlled trial design: unrecognized opportunities for health sciences librarianship. Health Inform Libr J 2003 Jun;20(Suppl 1):34–44.
  52. Eldredge J. Evidence-based librarianship: levels of evidence. Hypothesis 2002 Fall;16(3):10–13. <http://research.mlanet.org>.
  53. Mulrow C, Cook D, eds. Systematic reviews: synthesis of best evidence for healthcare decisions. Philadelphia, PA: American College of Physicians, 1998.
  54. Light RJ, Pillemer DB. Summing up: the science of reviewing research. Cambridge, MA: Harvard University Press, 1984.
  55. Roethlisberger FJ, Dickson WJ. Management and the worker: an account of a research program conducted by the Western Electric Company, Hawthorne Works, Chicago. Cambridge, MA: Harvard University Press, 1939:194–9,227.
  56. Losee RM, Worley KA. Research and evaluation for information professionals. San Diego, CA: Academic Press, 1993:147–50.
  57. Crews KD. The accuracy of reference service: variables for research and implementation. Libr Inf Sci Res 1988 Jul;10:331–55.
