Abstract
Purpose: Identify journal collection access and use factors.
Setting and Subjects: University of North Carolina at Chapel Hill's Health Sciences Library patrons.
Methodology: Survey forms and user interactions were monitored once a week for twelve weeks during the fall 1997 semester. The project was based on a 1989 University of New Mexico study and used Kantor's Branching Analysis to measure responses.
Result: Users reported successfully finding 80% of the journal articles they sought. Along with journal usage data, the library obtained demographic and behavioral information.
Discussion and Conclusions: Journals are the library's most used resource and, even as more electronic journals are offered, print journals continue to make up the majority of the collection. Several factors highlighted the need to study journal availability. User groups indicated that finding journals was problematic, and internal statistics showed people requesting interlibrary loans for owned items. The study looked at success rates, time, and ease of finding journals. A variety of reasons contributed to not finding journals. While overall user reports indicated relatively high success rate and satisfaction, there were problems to be addressed. As the library proceeds in redesigning both the physical space and electronic presence, the collected data have provided valuable direction.
INTRODUCTION
When users visit the library, can they locate needed journal articles and photocopy or read the materials within their allotted time? This question was addressed by a group of staff members at the University of North Carolina at Chapel Hill's (UNC-CH's) Health Sciences Library (HSL). The User Services Coordinating Group (USCG) studied the issues, planned and implemented a journal availability study, analyzed the resulting data, and made suggestions for changes that would improve the chances for all users to have a successful experience each time they visit the library.
HSL is the primary library for the UNC-CH schools of dentistry, medicine, nursing, pharmacy, and public health and the UNC Health Care System. It also serves the health information needs of the entire university, the health professionals in the state through the Area Health Education Centers (AHEC) Library and Information Services Network, and the public. At the time of this study, the library collections contained more than 290,000 volumes, including recent and historical materials; more than 8,900 audiovisual and microcomputer software programs; and 3,952 current serial titles.
A large journal collection must be well maintained for people to find what they want easily and consistently. HSL's goal is that every user be able to find everything needed on each visit to the library, in the time allocated by the user. Although this goal is not fully attainable with finite resources, HSL staff wanted to determine how close they were coming to it.
PROBLEMS LOCATING JOURNALS
In 1996, the Clinical Information Team, a group of HSL staff charged with determining the library needs of clinicians, indicated that there were problems with availability of library materials. The team made the following comments in a report to the Library Management Council, dated June 19, 1996:
One recurring theme from contacts our team has made with clinical groups (nurses, drug information center, dentists) is availability of library materials, particularly journals. The research team also heard about availability from the librarian at Lineberger Cancer Research Center. Typical comments include:
If we have to get materials from HSL, we have to budget at least an hour. Finding things and photocopying take time. Clinical Nurse Educators
We subscribe to journals because HSL copies are frequently in use and are hard to find. Drug Information Specialist
HSL's journal collection in my field has gone from excellent to poor. Dental Faculty
I hear complaints about the difficulty of finding journals on the shelf at HSL and the time it takes from busy schedules. Librarian, Lineberger Cancer Research Institute
HSL's journals are arranged alphabetically by title and are located on three floors. The current journals are shelved together on one floor, and bound journals are on the other two floors. Journals do not circulate for more than one hour and are off the shelf for approximately one month for the bindery process. The library uses a DRA online public access catalog (OPAC) that includes campuswide resources.
A review of journal trace statistics for January through April of 1997 indicated that forty-two of fifty-two traces were found in the library. The remaining ten traces were either owned by a library other than HSL, at the bindery, or out of the library for repairs. The traces that were found required an average of 1.1 attempts, and most were found on the first attempt. Each member of the group could give at least one example of being contacted by a patron having difficulty finding journals that were available in the library.
The March 1997 interlibrary loan statistics revealed that there were 606 requests. Of those requests, seventy-seven were rejected, because they were either owned by the library and available for patron use (47) or available at another campus library (30). While this amount was only 13% of the total, the people requesting those articles were delayed in finding information that was readily available.
In 1997, HSL's library management asked USCG to “determine if users can get what they need at HSL, in a reasonable period of time as determined by the user, and to work toward establishing performance standards for access and availability.”
LITERATURE REVIEW
The literature showed little information about journal studies and no reported studies in the 1990s. The following comment from a 1989 journal availability study at the University of New Mexico still rang true: “Of the few availability studies undertaken, most have been confined to books; comparatively few have been devoted to periodicals” [1].
A 1980 study at Ohio State University explored the “frustration rate” for periodical literature [2]. Fifty-five percent of the 155 articles searched for were successfully retrieved. The investigators found that 45% of articles searched for by a group of students were not found because of user error (15%) or library problems (30%). Library problems included torn out articles (9%), incomplete volumes with issues and volumes received but missing (8%), volumes at the bindery (5%), volumes off the shelf (5%), and issues never received or successfully claimed (3%).
The study quoted above looked at journal availability at the University of New Mexico (UNM). The researchers used Kantor's Branching Analysis technique [3] for their study and found that of 483 searches, 269 were successful, yielding an overall success rate of 56%. The error factors considered in this study were bibliographic error or a problem with citation (31 occurrences), catalog user error (48 occurrences), circulation error (10 occurrences), library error (40 occurrences), and user error in the actual search for the item (24 occurrences).
Roberts studied journal availability in the Learning Resources Library at East Tennessee State University's Quillen-Disher College of Medicine in 1989 [4]. She also used Kantor's Branching Analysis and found that out of 297 requests, there were 162 satisfied searches and 135 dissatisfied searches. Reasons for dissatisfied searches were issues not owned by the library (95 occurrences); in circulation (5 occurrences); library malfunction such as the issue being at the bindery, waiting to be reshelved, and so forth (14 occurrences); and user error (21 occurrences). Consequently, the overall success rate was 55%.
Because the literature was so sparse, a message was sent out on the MEDLIB-L email discussion list asking if anyone had done a journal access or availability study. The only response recommended looking at the New Mexico study.
After evaluating the literature and the problem, USCG identified several possible approaches. The most relevant and useful of these approaches was to replicate the 1989 University of New Mexico (UNM) study, which modified Kantor's Branching Analysis technique to accommodate journal availability as opposed to book availability.
METHODOLOGY
In 1982, members of the Association of Research Libraries (ARL) Committee on ARL Statistics tested performance measures that had been developed by Paul B. Kantor, Ph.D., in four of their own libraries. These measures covered availability and accessibility of library materials, analysis of patron activity, and delay analysis of specific activities and were intended to show how efficiently libraries were performing their primary functions and how well library patrons were satisfied. The committee then recommended that Dr. Kantor prepare a manual of these objective performance measures to encourage other libraries to use them as a tool that gave directly useful, economical, and intelligible results.
Kantor's technique produces performance measures with a 95% confidence level, using various fractions to identify specific availability factors. USCG further modified the survey forms developed in the UNM study to reflect HSL's own procedures and current resources and to obtain data detailed enough for the required analysis. The adaptations to the survey form included adding patron status categories, a question asking patrons to rate the ease of locating journals, three questions about use of the online catalog, and a comments field (Appendix).
Kantor identified the following five points to be learned from a study using actual patron searches:
the items sought that the library has not acquired
the level of user skills in catalog usage
an accurate measure of circulation interference in the most active part of the collection
the success of library performance in the typical user experience
the degree of user skills in locating volumes on the shelves [5].
The overall measure of availability (MAV) is expressed by the fraction:

MAV = number of items found / number of items sought

This overall measure is the product of the following component fractions:
MAV-ACQ (acquisition): probability that the sought item has been acquired by the library
MAV-CAT (catalog): probability that the user locates an item in the catalog with accurate call number and holdings information (user skill)
MAV-CIRC (circulation): probability that an item is not checked out or circulating in-house
MAV-LIB (library): probability that available items are in correct location (library performance)
MAV-USER (user skill): probability that the user can locate correctly shelved items (user skill).
The UNM model used “reference intervention”—help was offered if an item was not found—to assist study participants. Their library already offered intervention as a formal service. The authors of the UNM study spoke with Dr. Kantor at the preliminary stage of their study, and he indicated to them that such intervention would alter the reliability of his Branching Analysis technique, because it would change the dynamics of the model. The authors felt, however, that the Kantor model was “robust and flexible enough to accommodate this change in procedure” and that “such intervention [led] to greater precision in pinpointing causes of failure.” They also felt that including library users this way was good public relations [6].
A pilot study, using 250 surveys, was conducted during two days in August 1997. The goal of the pilot was to test how well the survey forms worked, to test the effectiveness of the method of distributing surveys, and to identify any problems or changes that needed to be made before beginning the main survey process. USCG decided that a journal location service (JLS) should be established during the survey to replicate “reference intervention,” although librarians at the reference desk routinely assisted with such inquiries.
The main survey involved more than 2,000 transactions, well above Kantor's suggested minimum of 400 transactions. One thousand and fifty-four surveys were distributed over a total period of twelve days spread throughout the fall semester in order to represent the wax and wane of the semester (Appendix). Patrons entering the library were first asked whether they would be looking for journals. If their answer was “yes,” they were invited to participate in the survey. Participation was entirely voluntary. Personnel then explained how to fill out the survey form and where to return it once items had been located. Upon returning the survey form, participants were asked if they had located what they needed and, if not, were offered help. If participants required assistance, they could use the JLS for immediate help, after having checked “Not found” on the survey form. Surveys were returned to a designated box at the distribution point near the library entrance/exit.
On weekdays, the survey was distributed during a total of seven hours per day. On Saturdays, surveys were distributed a total of six hours per day. On Sundays, surveys were distributed for four hours. On weekends, one Journal Availability Study staff person and the reference librarian distributed surveys. Based on the experiences from the pilot, USCG decided to set aside certain times of the day exclusively for assisting participants, during which no surveys would be distributed. Seven hundred and forty-seven usable surveys (70.9%) were returned containing 2,056 journal searches, an average of 2.75 searches per survey.
A flowchart for tracing the steps in the search process and a list of Search Failure Factors were adapted from the UNM model. Figure 1 shows the list of search factor codes.
The prefix “D,” created by Kantor, stands for “dissatisfaction.” Each failure (item that was not located) was given a code. The results were then summarized on a code sheet and analyzed using Kantor's Branching Analysis technique to give values for each of the availability fractions.
RESULTS
A total of 1,663 of the 2,056 journal citation searches concluded successfully (found), which translated into an overall performance measure of 80.9%. The remaining 393 searches were unsuccessful (not found). Of these 393 unsuccessful searches, fifty were due either to bibliographic error or bad citations (DBIB) or to undetermined reasons (NA). Because these items represent unavailability for reasons that cannot be assigned to a category, Kantor's model prescribes a correction factor that distributes them proportionately across the other categories of error. The correction factor, 1.15 for this study, was obtained by dividing the reported number of unsuccessful searches (393) by the number of unsuccessful searches that could be analyzed (343).
In Kantor's model, an availability analysis form (Figure 2) converts raw data by multiplying by the correction factor (1.15). The analysis form gives the measure of availability (MAV) fractions for the specific error categories mentioned above, plus the overall MAV, which is the product of all five fractions. The data from this form then go into a branching diagram. The branching diagram shows individual performance factors in each of the five error categories, as well as illustrates how those components contribute to the outcome of a participant's search for a known journal.
Of the 2,056 journal citations sought, HSL owned 1,955. The calculated performance factor for this area was 94.4% (100% indicating perfect performance). Of the 1,955 citations for journals owned by HSL, twenty-six failures were due to reasons connected to catalog usage, a performance measure of 98.5%. Of the remaining 1,929 citations, fifteen were unavailable due to circulation reasons, giving a performance measure of 99.1%, the highest performance measure in this study. Of the remaining 1,914 items, 150 were unavailable because of library procedures, giving a performance measure of 90.9%. This area had the highest number of failures and the lowest performance measure.
The rest of the search items, numbering 1,764, consisted of fifty-one failures due to patrons' lack of understanding of library arrangements, shelf arrangements, or other undetermined user problems, a performance measure of 96.6%. From the remaining 1,713, the number of occurrences not analyzed (50) due to bibliographic error (Figure 1) or other undeterminable reasons was subtracted, giving 1,663 successful citation searches out of the original 2,056 items sought.
The overall performance measure is the number of successful searches divided by the number of total searches (1,663/2,056) and is obtained by multiplying all the percentages in Column D (Figure 2). These results are represented in Kantor's Branching Diagram (Figure 3). The numbers in parentheses show the adjusted figures after multiplication by the correction factor. A summary of the performance measures is given in Figure 4.
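As an illustration of how the branching arithmetic works, the figures above can be recomputed from the raw failure counts given in this section. The Python sketch below is not part of the original study; the variable names and dictionary layout are mine, and the correction factor is computed exactly (393/343) rather than rounded to 1.15, so the results agree with the reported measures only to within rounding.

```python
# Sketch of Kantor's Branching Analysis as applied in this study.
# The counts come from the Results section of the article; the
# variable names and labels are illustrative only.

total_sought = 2056          # journal citations sought
unanalyzed = 50              # DBIB / NA failures, redistributed below

# Raw failure counts at each branch, in the order the branches are applied.
raw_failures = {
    "MAV-ACQ": 101,   # titles not owned by the library (2,056 - 1,955)
    "MAV-CAT": 26,    # catalog-usage failures
    "MAV-CIRC": 15,   # items checked out or circulating in-house
    "MAV-LIB": 150,   # library procedures (bindery, reshelving, missing)
    "MAV-USER": 51,   # user error locating correctly shelved items
}

reported = sum(raw_failures.values()) + unanalyzed   # 393 unsuccessful searches
correction = reported / (reported - unanalyzed)      # 393/343, about 1.15

# Walk the branching diagram: each fraction is computed against the
# searches that survived the previous branches, with the failure count
# inflated by the correction factor.
remaining = total_sought
mav = {}
for branch, failures in raw_failures.items():
    mav[branch] = (remaining - failures * correction) / remaining
    remaining -= failures   # branch survivor counts in the paper are raw

overall = 1.0
for fraction in mav.values():
    overall *= fraction     # product of the five fractions, about 0.81

for branch, fraction in mav.items():
    print(f"{branch}: {fraction:.3f}")
print(f"overall MAV: {overall:.3f}")
```

Run as written, the five fractions come out close to the study's reported measures (94.4%, 98.5%, 99.1%, 90.9%, and 96.6%, with an overall MAV near 80.9%), differing only in the final rounding.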
ANALYSIS
All the performance measures were above 90% and were therefore excellent (Figure 4). The lowest was MAV-LIB, the probability that available items were in the correct location, which reflects library procedures. While 90.9% was very good, further analysis of the codes for dissatisfaction with library procedures (DLIB) showed that library procedures could still be improved. Of the 150 DLIB failures, fifty-seven (38%) were due to the item being somewhere in the intermediate reshelving process, with the majority of these on the resorting shelves. A further fifty-one items (34%) were at the bindery. Twenty-two (15%) were missing without the library's knowledge; sixteen (11%) were known to be missing, including items ordered but not yet received; and only four items (2.7%) were misshelved.
The second-lowest ratio, at 94.4%, was MAV-ACQ, the probability that the library owned the sought item. Because HSL already had a large journal collection, numbering 3,952 titles at the time of the study, it is unreasonable to expect much improvement in this measure.
For MAV-USER, 96.6%, one third of failures were due to lack of understanding of the library or shelf arrangements on the part of the user, and the remaining failures were for undetermined reasons.
MAV-CAT, 98.5%, was the second highest ratio. All bound journals are listed in the catalog with detailed volume holding information that lists complete, incomplete, or missing items. However, determining if the library no longer receives a publication can be difficult, because the catalog does not specifically say “no longer received.”
MAV-CIRC was the highest ratio at 99.1%. The collection is quite large, and journals are only out of the library for binding or short durations. Patrons are adept at looking in photocopier rooms or on tables, if materials are not on the shelves. If journals happen to be at the bindery, an interlibrary loan (ILL) request can be initiated. Possible reasons for the previously mentioned ILL cancellations were that users did not check the catalog or misunderstood the record.
ADDITIONAL SURVEY FINDINGS PERTAINING TO UNIVERSITY OF NORTH CAROLINA AT CHAPEL HILL'S (UNC-CH) HEALTH SCIENCES LIBRARY (HSL)
Adaptations to the survey form allowed USCG to obtain a wealth of information about many aspects of HSL. Such information included patrons' ratings of how easy their journal searches were, the most frequently sought journals, the hardest titles to find, and the behavior of first-time users compared with returning users. Some items were surveyed to determine trends in library use, such as where users checked the catalog. Users were categorized by status (Figure 5) and by time taken per search, and the success rate could be calculated for each user group. USCG could construct a very detailed picture of library users, and many combinations and comparisons of results could be made. For example, the data showed that on a successful trip, locating and photocopying an article took less than five minutes.
Because the survey was distributed throughout the whole semester, USCG could look at journal usage over time—comparing the beginning (weeks 1–4), middle (weeks 5–8), and end (weeks 9–12) of the semester. Table 1 shows the distribution of journal searches throughout the semester, with the first three weeks showing a much higher usage than the rest of the semester.
The number of participants using the catalog crept downward over the semester, from 57% in the beginning to 49% at the end. The highest percentage of new users occurred in the middle of the semester, and more than half of these new users were undergraduates, which coincided with when their mid-term papers were due.
SURVEY COMMENTS
The written comments made by study participants also proved useful for assessing user attitudes and addressing concerns. While some participants wrote just a couple of words, others took the opportunity to air their grievances in one or more paragraphs. Comments could be roughly grouped into the following categories:
general comments about the library and helpfulness of staff
photocopiers
online catalog
reason(s) journal item could not be found
statements that the journal item was successfully found and where it was located
problems relating to journals at bindery
comments relating to specific journal
suggestions for improvements
OBSERVATIONS
General observations by the library staff reflected additional information not found in the statistics:
Staff offered additional suggestions for improvements.
Users do not differentiate between the local messages statement in the UNC Literature Exchange's (UNCLE's) OVID databases and using the online catalog. Some users, in fact, did not know there was an online catalog. UNCLE is a Web-based health information service, which provides access to citation and full-text databases and contains links to more than 1,000 other health-related Internet resources. UNCLE is jointly developed and managed by HSL and the Office of Information Systems in the School of Medicine.
The decrease in sought journals during the end of the semester could be related to several factors, including that users might do research earlier in the semester and write papers later, that the end of the semester was exam time, or that most papers were due earlier in the semester. Another reason might be that users who had completed one or more surveys were more likely to decline to participate later in the semester.
Some users completed the survey many times and in great detail.
Some users were unsure about identifying their status.
There was general discontent with the photocopiers.
Users often preferred to look for an article several times rather than check the catalog or ask for assistance, having assumed the library would have the item because journals did not circulate.
People who used the catalog had difficulty determining if the library did not currently receive items.
COSTS
What are the costs of conducting a journal availability study? Excluding the group's planning time, staff members volunteered to hand out surveys and assist users. In some cases, two staff members were involved at a given time. Approximately 160 staff hours were required to conduct the survey. A cross section of staff from the library director to student assistants participated after a ten-minute orientation. The one additional cost, outside the library's normal operating expenses, was a temporary graduate student, from the School of Information and Library Science, who worked ten hours a week. The student maintained an Access database, entered data, analyzed the data, and wrote the report.
RECOMMENDATIONS
The good news was that most users were able to find the journal articles they wanted within an acceptable amount of time. However, USCG made these recommendations to HSL:
look into the problems identified with photocopying
use the list of searched journals for selecting and retaining materials for the library
improve HSL signage
emphasize using the online catalog and locating hard-to-find journals during orientation sessions
consider having a note in the catalog that says “not currently received”
use the information gathered in this study in the upcoming library renovation
repeat this study at a later date to assess the effectiveness of the changes.
DISCUSSION
Why should librarians want to conduct a journal availability study in their libraries? First, it is an excellent public relations tool, because it lets users know librarians are interested in their needs. Especially in large libraries, users can feel that their individual needs are not important. The survey allows them to communicate what they like and dislike about resources and services either verbally or in writing.
The survey also served as an evaluative method for collections and services. It gave a clearer picture of which journals the sample population was using most often. This picture assisted with journal selection and deselection. The study provided valuable information that HSL has used to improve resources and services. Based at least partly on these recommendations, HSL has replaced old photocopiers, restructured photocopy services, revised signage, added more emphasis on locating journals during orientations, and developed an online module on finding health information. HSL formed a shelving process improvement task group. Complaints were looked at seriously and addressed. Because the study was conducted over a semester, it also provided data to help determine staffing needs for shelving.
Finally, the journal availability study serves as a benchmark for HSL and could be used by similar libraries looking at journal access. When the study is repeated, HSL will know how well this set of user issues has been addressed. There will be new wrinkles, such as electronic journals, and new issues, but success rates can be measured against this first survey.
CONCLUSION
The University of New Mexico model was easy to adapt and implement. Adding questions to the survey provided a great deal of useful information without significantly increasing the time required of the patron in filling out the form. Kantor's Branching Analysis technique provided a useful tool for evaluating journal availability and was fairly easy to apply to the study.
The benefits for HSL have been worth the cost in time and effort. The library identified why users were not able to find the specific journal articles they needed and then prioritized efforts to improve availability. The academic community appreciated the investigation of such a basic library function, and users were willing participants in the study. Through studies like this one, libraries earn respect from regular users and campus administration. Ready access to the journal collection is very important to library users, and they are impressed with attempts to improve access. In a time when libraries are being asked to measure outcomes, a journal availability study is a good one.
Appendix
Journal availability survey
REFERENCES
- Bachmann-Derthick J, Spurlock S. Journal availability at the University of New Mexico. Advances in Serials Management. 1989;3:173–212.
- Murfin ME. The myth of accessibility: frustration & failure in retrieving periodicals. J Academic Librarianship. 1980;6:16–9.
- Kantor PB. Measurement of availability using patron requests and branching analysis. In: Objective performance measures for academic and research libraries. Washington, DC: Association of Research Libraries, 1984.
- Roberts JE. Journal availability study [master's paper]. Chapel Hill, NC: UNC-CH, 1989.
- Kantor PB. Measurement of availability using patron requests and branching analysis. In: Objective performance measures for academic and research libraries. Washington, DC: Association of Research Libraries, 1984:43–4.
- Bachmann-Derthick J, Spurlock S. Journal availability at the University of New Mexico. Advances in Serials Management. 1989;3:210.