The age of mobile technologies is here. According to the International Data Corporation (IDC) Worldwide Quarterly Mobile Phone Tracker, in the year 2010, vendors shipped a total of 302.6 million smart phones worldwide, up 74.4% from the year before [1]. The Pew Research Center expects the number of mobile devices accessing the Internet to surpass the one billion mark by 2013 [2]. This trend is important to the medical library community. Although smart phone usage statistics by medical professionals are not available, the 5th Annual Future Physicians of America Survey (2010) from Epocrates provides a picture of usage by medical students: 34% of the users surveyed chose “mobile reference” as the first place to turn to when in need of information to solve a clinical question, and 42% of them planned to purchase a new smart phone within the next year [3].
Responding to the needs of mobile device users, the Health Sciences Library (HSL) at the University of Colorado Anschutz Medical Campus established a strategic goal to make its electronic resources and services easily available at all times via smart phones. The HSL's current website is hypertext markup language (HTML)–based and viewable on a smart phone; however, users have to zoom in to see the content. To provide a better user experience, the development team used a freely available PHP mobile device detection script that automatically redirects users to the HSL's optimized website for smart phones [4]. The site also supports the concept of “One Web,” as recommended by the World Wide Web Consortium (W3C). One Web means “making, as far as is reasonable, the same information and services available to users irrespective of the device they are using. However, it does not mean that exactly the same information is available in exactly the same representation across all devices” [5]. One Web can be accomplished simply by providing a full site link. However, in addition to the full site link to the library's complete resources, the HSL decided to provide quick links to top resources and services for immediate information needs <http://www.hslibrary.ucdenver.edu/m/>. The quick links include twelve popular electronic resources, ten commonly used services such as Ask-a-Librarian, and an events calendar, along with information about the library's hours and directions. All content was transformed to fit the screen space of a smart phone, so zooming is no longer required.
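The detection-and-redirect approach can be illustrated with a minimal sketch. This is not the PHP script from reference [4]; it is a simplified Python illustration of the same user-agent matching idea, with an assumed token list and function name:

```python
import re

# Illustrative subset of smart phone user-agent tokens; the actual script
# from reference [4] maintains a far more comprehensive list.
MOBILE_TOKENS = re.compile(
    r"iphone|ipod|android|blackberry|palm|windows phone|symbian",
    re.IGNORECASE,
)

def mobile_redirect_url(user_agent,
                        full_site="http://www.hslibrary.ucdenver.edu/",
                        mobile_site="http://www.hslibrary.ucdenver.edu/m/"):
    """Return the mobile site URL for smart phone user agents, else the full site."""
    if MOBILE_TOKENS.search(user_agent or ""):
        return mobile_site
    return full_site
```

In practice, a server would apply this check to the incoming request's User-Agent header and issue an HTTP redirect to the returned URL.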
The HSL conducted a usability study comparing the optimized and the HTML-based, non-optimized mobile websites to evaluate the usefulness of the optimized site. Usability studies have been recognized as a fundamental method to evaluate products and systems. Even though usability studies may not be the most efficient technique for site evaluation, they provide a reliable quantitative estimate of users' performance as well as a measure of subjective satisfaction [6].
METHODOLOGY
International Organization for Standardization (ISO) standard ISO 9241-11 (1998) defines usability as the “extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use” [7]. To study whether the mobile optimized site was more usable than the prior non-optimized site, the following hypotheses were specified:
H1: Compared to the non-optimized site, the optimized site improves the effectiveness of information retrieval.
H2: Compared to the non-optimized site, the optimized site improves the efficiency of information retrieval.
H3: Compared to the non-optimized site, the optimized site improves user satisfaction with information retrieval.
Participants
For the study, an institutional review board petition was submitted and approved. A call for participation was sent to the campus academic email discussion list with a link for signing up online. Announcements were also made in faculty meetings. In all, twelve individuals (ten students, one faculty member, and one staff member) volunteered for the study. They all met the basic criteria: (1) owned a smart phone and (2) were familiar with the desktop version of the library's website.
Design
Due to the small mobile screen and the smart phone's mobility, traditional techniques used in usability testing of desktop applications are questionable when applied to mobile applications. Building on the existing literature, Zhang and Adipat proposed a generic framework of testing methodologies for mobile applications, suggesting that either a laboratory experiment or field study should be considered depending on the objectives of the study and attributes of the product [8]. Because network connections and the physical environment, including location and people nearby, were not of concern in this study, a laboratory experiment was chosen. To collect realistic information, a smart phone rather than an emulator was selected as the platform for the study.
The usability attributes of effectiveness, efficiency, and satisfaction are generally measured by task completion rates, time to task completion, and task satisfaction scores, respectively. The standard method used to obtain a task satisfaction score is a Likert-scale questionnaire. However, this approach has some shortcomings. Because statements are usually positively phrased, respondents are more likely to agree than disagree with them, thus biasing the results. Also, the same questionnaire given at different times may yield different results [9]. One alternative to the questionnaire is the use of a guided interview tool, such as Microsoft's desirability toolkit <http://www.microsoft.com/usability/UEPostings/DesirabilityToolkit.doc>. The kit provides 118 “product reaction cards,” 60% positive and 40% negative, with words such as consistent, flexible, sophisticated, inconsistent, and overbearing. Usability test participants are asked to pick the cards that best describe the product or how using the product makes them “feel.” The chosen cards are then used as the basis for a guided interview. The method, piloted in Microsoft's lab studies, allows a great deal of feedback to be obtained in a short period of time. Barnum and Palmer also found that, compared with other tools, product reaction cards “unlock information regarding the user's sense of satisfaction in a more user-centered way” [10].
Ten tasks (Appendix A, online only) were designed to measure the effectiveness and efficiency of information retrieval in using the optimized and non-optimized mobile websites. Successful completion of the tasks required use of the interface but was independent of the participants' domain knowledge. The number of questions that were correctly answered by each participant was recorded as the measure of effectiveness, while the time spent completing the questions (within the ten-minute time limit) was recorded as the measure of efficiency. For the satisfaction attribute, twenty product reaction cards with equal numbers of positive and negative words were used for the study. These cards were selected by the HSL's Web Committee members, who weighed the relevance of the terms to the study. For example, “efficient” was selected, while “collaborative” was discarded. The committee felt that twenty cards was the right number for the time allowed for the guided interview.
Procedures
This study used a within-subject design, with each participant completing tasks on both the optimized and non-optimized sites. Four sessions were administered in sequence: consent form, tutorial, formal tasks, and a guided interview. The consent form informed the participants of the right to withdraw during the procedure and the right to review the results. Participants were then shown a three-minute video tutorial that introduced the general layout of the optimized website. Participants were not given any instruction on the non-optimized site because they were expected to be familiar with the HTML-based desktop computer version. After the tutorial, each participant was allocated twenty minutes to work on the tasks. To minimize order effects associated with the same participant testing multiple interfaces, the order of the task sets presented to the participants was counterbalanced. After completing a given set of five questions (with ten minutes allotted per set), the participants were asked to select, from the twenty product reaction cards, those that best described their experience with the interface. Based on the chosen cards, the HSL's Web Committee member serving as moderator asked the participants for further comments and suggestions. There was no time limit for the interview process; in general, it was completed within five minutes.
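The counterbalancing of task-set order can be sketched as follows. The alternating assignment scheme and function name are assumptions for illustration; the article does not specify the exact scheme used:

```python
def counterbalanced_orders(n_participants):
    """Assign interface orders alternately so that half of the participants
    start with each site (a simple alternating scheme; the scheme actually
    used in the study is assumed, not documented)."""
    orders = []
    for i in range(n_participants):
        if i % 2 == 0:
            orders.append(("optimized", "non-optimized"))
        else:
            orders.append(("non-optimized", "optimized"))
    return orders
```

With twelve participants, this yields six in each order, so any learning or fatigue effect is balanced across the two interfaces.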
Data collection
Observing a participant's interaction with a small-screened mobile device is challenging; therefore, a vocal protocol was employed, in which participants were asked to speak about their experience as it was recorded manually. The selected cards and guided interview results were also recorded manually.
RESULTS
Usability
Paired t-tests were used to analyze the study results for hypotheses #1 and #2 (Table 1, online only). Compared to the non-optimized website, the optimized website appeared to significantly improve the effectiveness and efficiency of information retrieval on smart phones. The number of correctly answered questions was 19% higher on average (SE = 0.22, t(11) = 3.45, one-tailed P = 0.00027, Cohen's d = 1.04), while the response time with the optimized website was reduced by 0.77 minutes (9.9%) on average (SE = 0.18, t(11) = 4.36, one-tailed P = 0.0005, Cohen's d = 1.32).
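The paired t statistic and Cohen's d reported above are standard computations on the per-participant differences between the two conditions. A minimal sketch, using made-up example data rather than the study's raw scores:

```python
import math
from statistics import mean, stdev

def paired_t_and_d(x, y):
    """Paired t statistic and Cohen's d for two equal-length lists of
    within-subject measurements (one pair of values per participant)."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    m = mean(diffs)
    sd = stdev(diffs)        # sample standard deviation of the differences
    se = sd / math.sqrt(n)   # standard error of the mean difference
    t = m / se               # paired t statistic with n - 1 degrees of freedom
    d = m / sd               # Cohen's d for paired (within-subject) designs
    return t, d
```

With the study's n = 12, the resulting t value is compared against the t distribution with 11 degrees of freedom to obtain the one-tailed P value.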
For hypothesis #3, quantitative values could not be obtained with product reaction cards. However, from the same set of cards provided for selection, participants selected two negative words (slow and unrefined) to describe the optimized website and eight negative words (confusing, difficult, hard to use, ineffective, not valuable, slow, time consuming, and unrefined) to describe the non-optimized website (Figures 1 and 2). One interpretation of this difference is that users were more satisfied with information retrieval using the optimized website; however, participants also chose understandable, useful, and valuable to describe the non-optimized site.
Figure 1.
Positive reaction cards selected by participants (n = 12)
Figure 2.
Negative reaction cards selected by participants (n = 12)
Comments made in the guided interviews offered a variety of suggestions for improving the site (select comments are available in Appendix B, online only). One of the tasks required participants to fill out the Ask-a-Librarian form. Participants were observed becoming impatient and frustrated with the many required fields. Their frustration was reflected in the comments gathered during the guided interview. For example, three participants asked to have forms pre-populated, and two asked to have the phone number field on the class registration form and the fax number field on the Ask-a-Librarian form removed. Others wanted to set up a favorites list and to have a more prominent home button.
Usage statistics
The HSL implemented Google Analytics to gather statistics on the use of the optimized mobile website. A content analysis of pageviews between September 2010 and April 2011 identified the top ten content pages viewed: hours, databases, email help, Ask-a-Librarian, Micromedex, MD Consult, library directions, PubMed, mobile technology resources, and classes. However, the search strings did not reveal a pattern in users' searching behaviors because they were too disparate to cluster. The statistics also showed use of web pages that were unavailable from the quick links and could only be accessed through the full site link. This information supported the benefits of the One Web recommendation to provide a full site link so that all resources are available on mobile devices. There were more than 2,000 “absolute unique visitors” to the optimized site, with the majority of visits coming from iPhone (31.2%) and Android (26.8%) devices, followed by iPod (6.8%), Blackberry (3.2%), and others, including iPad, Windows, Samsung, Nokia, and PalmOS.
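The pageview tally behind a top-ten list like the one above can be sketched as follows. The flat list-of-page-names input format is an assumption for illustration; in practice, Google Analytics reports are read from its interface or export files rather than computed by hand:

```python
from collections import Counter

def top_content(pageviews, k=10):
    """Tally page-name strings and return the k most viewed pages,
    mirroring the kind of content analysis described above."""
    return [page for page, _ in Counter(pageviews).most_common(k)]
```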
DISCUSSION
The HSL has implemented an optimized mobile website that provides quick links as well as a full site link to the library's rich content and services. The findings from this usability study indicate an improvement in effectiveness and efficiency of information retrieval when the optimized website is used. The smaller number of negative product reaction cards selected for the optimized website indicates a more satisfying user experience with the optimized content screen. However, due to the small number of faculty and staff in the sample, the results cannot be generalized to the nonstudent user population. Although the participants were encouraged to elaborate during the guided interviews, their comments were brief. Nevertheless, the Microsoft desirability toolkit provided valuable enhancement suggestions that would not have been obtained with a Likert-scale questionnaire.
Usage of the optimized mobile website has not increased since statistics collection began. The HSL will promote the mobile website by distributing a tutorial to incoming students through an electronic orientation program, by demonstrating its use in classes, and by promoting it via the library's blog, newsletter, and quick response (QR) code marketing. The development team is exploring options for obtaining continuous user feedback and is monitoring rapid advances in mobile technologies in order to improve the HSL's optimized mobile website.
Footnotes
Supplemental Appendix A, Appendix B, and Table 1 are available with the online version of this journal.
REFERENCES
- 1. International Data Corporation. Android rises, Symbian^3 and Windows Phone 7 launch as worldwide smartphone shipments increase 87.2% year over year, according to IDC [Internet]. Framingham, MA: The Corporation; c2011 [updated 7 Feb 2011; cited 10 Apr 2011]. <http://www.idc.com/about/viewpressrelease.jsp?containerId=prUS22689111>.
- 2. Pew Research Center. Mobile access 2010 [Internet]. Washington, DC: The Center; c2010 [updated 7 Jul 2010; cited 28 Mar 2010]. <http://www.pewinternet.org/~/media//Files/Reports/2010/PIP_Mobile_Access_2010.pdf>.
- 3. Epocrates. 5th Annual Future Physicians of America Survey [Internet]. San Mateo, CA: Epocrates; c2011 [updated Sep 2010; cited 3 Nov 2010]. <http://www.epocrates.com/company/mediaroom/mediaresources/statistics.html>.
- 4. Moore A. Detect mobile browsers: detect and redirect mobile browsers on your website [Internet]. [updated 2010; cited 1 Jun 2010]. <http://www.detectmobilebrowsers.mobi>.
- 5. World Wide Web Consortium. One web [Internet]. The Consortium; [updated 29 Jul 2008; cited 1 Jun 2010]. <http://www.w3.org/TR/mobile-bp/#OneWeb>.
- 6. Wichansky A. Usability testing in 2000 and beyond. Ergonomics. 2000 Jul;43(7):998–1006. doi: 10.1080/001401300409170.
- 7. International Organization for Standardization. 9241-11: ergonomic requirements for office work with visual display terminals (VDTs)-part 11: guidance on usability. The Organization; 1998.
- 8. Zhang D, Adipat B. Challenges, methodologies, and issues in the usability testing of mobile applications. Intl J Hum-Comput Interaction. 2005;18(3):293–308.
- 9. Travis D. Measuring satisfaction: beyond the usability questionnaire [Internet]. Userfocus; [updated 22 Jul 2009; cited 15 Apr 2011]. <http://www.userfocus.co.uk/articles/satisfaction.html>.
- 10. Barnum CM, Palmer LA. More than a feeling: understanding the desirability factor in user experience. Proceedings of the 28th International Conference Extended Abstracts on Human Factors in Computing Systems; Atlanta, GA; 10–15 Apr 2010. p. 4703–15.