Abstract
Background:
HIV TIDES — tailored interventions for self-care management of depressive symptoms for people living with HIV/AIDS (PLWHA) — provides assessment of depression and tailored education on self-care strategies to decrease the risk of developing clinical depressive disorders. The iterative refinement of the prototype is an important process during system development to ensure that the design of the system is easy to use and useful.
Methods:
Heuristic evaluation and usability testing were used to guide the iteration of HIV TIDES.
Results:
In the heuristic evaluation, three experts on human-computer interaction confirmed the system's compliance with the majority of usability concepts and current standards; however, a number of usability problems were identified. Refinements were made based on the experts' recommendations prior to usability testing. The usability testing included six PLWHA with various levels of computer experience. Data from this iterative testing informed the refinement of key pages and the development of new features.
Conclusions:
The final version of HIV TIDES consists of 73 messages. The average readability level of the messages is 6.0 based on the Flesch-Kincaid Grade Level and the average word count is 103.
Keywords: depressive symptom, HIV/AIDS, tailoring, self-care management, heuristic evaluation, usability testing
Summary Points:
What was known before the study?
Usability evaluation is a necessary step for improving system ease of use and usefulness during the system development process.
Heuristic evaluation with usability experts provides efficient feedback at a relatively low cost in the early stages of system development.
Usability testing explores usability problems from the perspective of system end users.
What has the study added to the body of knowledge?
In order to evaluate a tailoring system, testing scenarios should be carefully designed to reflect all functions that were included in the system.
Discussion with experts at the end of the heuristic evaluation session is a critical step in prioritizing the usability problems and exploring possible solutions for system iteration.
The involvement of PLWHA with various computer backgrounds in the usability testing enhances the likelihood that the system fits the needs of and is acceptable to both novice and experienced computer users.
1. Introduction
Depression has been reported as one of the most prevalent mental health problems in people living with HIV/AIDS (PLWHA) and has also been identified as a significant factor related to treatment non-adherence in this population. In a focus group needs assessment with HIV field workers [1], it was suggested that PLWHA have unmet mental health needs that continually change at different stages of disease progression. Healthcare providers must take an active role in introducing PLWHA to information on subjects such as recognizing high-risk symptoms, strategies for coping with psychiatric co-morbidity, and treatment options for mental health problems. This information should be tailored to individuals' specific needs, taking into account gender, age, primary language, level of literacy, and cultural background.
In order to decrease the levels of depressive symptoms and maintain the mental health status of PLWHA, HIV TIDES — tailored interventions for self-care management of depressive symptoms in PLWHA — was designed to provide education on the key elements of depression management and on following the recommended self-care strategies. The framework of HIV TIDES was developed based on the constructs of the Social Cognitive Theory [2], and revisions were made following the results of the needs assessment [1]. The system was created with Dreamweaver MX 2004 using HTML, JavaScript, and PHP, and was supported by a MySQL database.
During the system development, multiple sessions of usability evaluation for design iteration are necessary to ensure that the final prototype is easy to use and useful. Heuristic evaluation with experts and usability testing with system end-users are two methods frequently used at different stages to guide system modification.
Heuristic evaluation is an informal usability inspection technique developed by Nielsen and his colleagues [4] that is usually applied in the early stages of system development. A small number of experts, guided by a set of usability principles known as heuristics, evaluate whether user-interface elements, such as dialogue boxes, menus, navigation structure, and online help, conform to a set of accepted principles. The aim of usability testing, by contrast, is to observe system end users using the designed prototype in as realistic a situation as possible, in order to discover errors and areas for improvement [5]. Usability testing generally involves measuring how well test subjects respond in four areas: time, accuracy, recall, and emotional response. The mathematical model that Nielsen and Landauer [6] established for identifying usability problems indicates that five users can identify 85% of a system's usability problems, while at least 15 users are required to uncover virtually all of them.
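The Nielsen-Landauer model treats problem discovery as a geometric process: if each evaluator independently finds an average proportion L of the problems, then n evaluators are expected to find 1 − (1 − L)^n of them. A minimal sketch, assuming the commonly cited average detection rate L = 0.31 from their data:

```python
def proportion_found(n, detection_rate=0.31):
    """Expected share of usability problems found by n evaluators,
    following the Nielsen-Landauer model: 1 - (1 - L)^n."""
    return 1 - (1 - detection_rate) ** n

# With L = 0.31, five users are expected to find roughly 85% of the
# problems, and fifteen users nearly all of them.
```

This curve explains the sample sizes cited in the text: the marginal value of each additional evaluator drops quickly, so small iterative tests are more cost-effective than one large test.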
The aim of this study was to evaluate the design of HIV TIDES from the perspective of usability experts and end users. The research questions were:
What usability problems in the human-computer interaction and user interface design of the HIV TIDES prototype are identified by experts during heuristic evaluation?
What usability problems are identified by PLWHA while using the HIV TIDES prototype?
2. Methods
2.1. Heuristic Evaluation
2.1.1. Sample and Sample Size
Since all the participants in a heuristic evaluation are experts, a smaller sample size, from three to five participants, is considered to be acceptable [7]. Hence, three usability experts were invited to participate in the heuristic evaluation of HIV TIDES.
2.1.2. Concepts and Instruments
Nielsen's concepts for heuristic evaluation were used to guide the design of the heuristic evaluation sessions [4]. The concepts covered ten necessary aspects of human-computer interaction that should be considered in consumer-oriented interactive system development, including: visibility of system status; match between system and the real world; user control and freedom; consistency and standards; help users recognize, diagnose, and recover from errors; error prevention; recognition rather than recall; flexibility and efficiency of use; aesthetic and minimal design; and help and documentation [5]. Three to nine questions specifically targeting the needs for HIV TIDES development were selected for each usability concept from the heuristic evaluation references [5]. A total of 58 questions were verified by five doctoral level peer reviewers in related fields and then used to facilitate the evaluation process.
Experts were also asked to assign a severity rating to each usability factor that violated the usability concepts. The severity of the usability problems was categorized into five levels, as suggested by Nielsen and Molich [8], including: 0- no usability problem, 1- cosmetic problem only, 2- minor usability problem, 3- major usability problem, and 4- usability catastrophe.
2.1.3. Data Collection Procedure
Experts were not considered research subjects; consequently, IRB approval was not sought for this phase of the study. The participating experts were invited through e-mails. Heuristic evaluation was conducted at either the expert's office or a private office, depending on the expert's preference. Each expert received tasks that represented four of the five user types. Each user type was tested by at least two participating experts.
The evaluation sessions were audio taped with the experts' permission using a digital recorder. The investigator also took notes regarding the problems that each expert identified. The investigator reviewed the list of usability factors with the participating experts at the end of each session to confirm that all the interface design concerns were addressed and commented on and to identify possible solutions for the observed problems.
2.1.4. Data Analysis
All the experts' comments were merged after the completion of the three heuristic evaluation sessions. Descriptive statistical methods were used to analyze the severity ranking of the usability factors. The comments were summarized based on three categories developed by Preece [5]: navigation, access, and information design.
2.2 Usability Testing
2.2.1. Sample and Setting
Users were recruited from an HIV clinic in East Harlem, NYC. The inclusion criteria were HIV-positive and English literate. To represent the different levels of problems that end users might encounter while using HIV TIDES in the field, six PLWHA were recruited for the usability tests, including three sophisticated computer users who reported frequent usage of a computer and the Internet for more than six months, two intermediate computer users who had less than six months' experience with computers and the Internet, and one computer novice. The test sessions were carried out in the usability laboratory using Morae® software at Columbia University School of Nursing.
2.2.2. Instruments
Scenarios reflecting six tailoring paths and three supportive functions (“help,” “contact us,” and “privacy policy”) were created and commented on by an informatics expert before the usability testing sessions. The observations in this phase reflected the questions that are recommended by the usability Web site developed by the Department of Health and Human Services [9], including:
Do users complete a task successfully?
What paths do they take in trying?
Do those paths seem efficient enough to them?
Where do they stumble?
What words or paths are they looking for that are not currently on the site?
Since users with different levels of computer experience were recruited to perform different combinations of tasks, the user performance measures did not include time spent completing the task.
2.2.3. Data Collection Procedure
After receiving IRB approval, sample recruitment began through flyers posted and distributed in the HIV clinic. Potential participants were screened for computer experience during the first telephone encounter, and individual appointments were scheduled for qualified callers. Candidates were required to sign an informed consent form in order to enter the study.
Each participant received three tasks, associated with two user types and one supportive function, to get him/her started and to keep him/her focused during the test. A think-aloud protocol and non-participant observation were used for data collection. Participants were asked to think out loud while using the system. Participants' voices and computer screenshots were recorded using Morae® software for data analysis.
The investigator provided a brief orientation to the participant who had no prior computer experience, including instructions on the basics of using the keyboard and the mouse. For the rest of the participants, who had some computer experience, the investigator made suggestions for actions only when the participant seemed to stall, went around in circles, or requested assistance. The investigator also took notes on any incidents that occurred during the session. Participants received a cash honorarium ($20 per person) at the completion of the session.
2.2.4. Data Analysis
A descriptive summary of the video evidence was used to illustrate the problems that users encountered. The iteration of the prototype was based on the findings from these usability test sessions.
3. Results
3.1. Heuristic Evaluation
Three experts in interface design, human-computer interaction, and tailored health informatics were invited to participate in the heuristic evaluation. Each of the participating experts was asked to complete four of the six designed tasks. The average length of each heuristic evaluation session was around one hour.
3.1.1. Summary of the Severity Rating
Twenty-two usability problems were identified. Problems were found for all ten heuristic concepts; most were ranked as minor (n=8) or major (n=7). For “Match between system and the real world,” “Error prevention,” and “Flexibility and efficiency of use,” three usability problems were identified in each category (Table 1). Only one problem was found for “Recognition rather than recall.” Two usability problems were identified for each of the other six concepts.
Table I.
— Overall Severity Ranking of the Heuristic Evaluation (Range 0−4)
| Nielsen's Concepts | Number of Problems | Range of Severity Rating |
|---|---|---|
| Visibility of system status | 2 | 0−3 |
| Match between system and the real world | 3 | 0−3 |
| User control and freedom | 2 | 0−2 |
| Consistency and standards | 2 | 0−3 |
| Help users recognize, diagnose, and recover from errors | 2 | 0−3 |
| Error prevention | 3 | 0−2 |
| Recognition rather than recall | 1 | 0−1 |
| Flexibility and efficiency of use | 3 | 0−4 |
| Aesthetic and minimal design | 2 | 0−3 |
| Help and documentation | 2 | 0−3 |
| Total | 22 | 0−4 |
Among these 22 usability problems, eighteen were identified by only one expert; the remaining four were identified by two experts. Generally, the experts agreed on the severity of the problems that more than one of them identified. For only one factor, “Visibility of system status — Is the menu-naming terminology consistent with the user's task domain?”, did the experts differ on the problem's severity: while the second expert considered it a cosmetic problem, the third expert considered it a major one.
Comparing the number of problems identified by the three experts revealed a substantial difference among them. The first expert identified only three minor usability problems, while the second identified nine usability problems and considered three of them major. The third expert identified 17 of the 22 usability problems and rated seven of them as either major problems or usability catastrophes.
3.1.2. Analysis of the Comments
The usability problems identified by the experts were categorized into three groups: navigation, access, and information design [5]. Most of the comments concerned information design (n=16), followed by navigation (n=6) and access (n=4). Not all of the experts' comments could be mapped to Nielsen's heuristic concepts. One example, pertaining to system access, was “adding the time-out function for privacy protection.” Other comments suggesting revision of the system's content were likewise not covered by Nielsen's heuristics.
i. Navigation
Six usability problems related to navigation were identified by the experts (Table 2). Four of the problems were related to the concept of “Flexibility and efficiency of use.” The other two were related to “Help users recognize, diagnose, and recover from errors” and “Help and documentation.”
Table II.
— Heuristic Evaluation: Summary of Comments - Navigation
| Heuristic Concept | Comments | Change |
|---|---|---|
| Help users recognize, diagnose, and recover from errors | Highlight the error when the system has detected one | No |
| Flexibility and efficiency of use | Cut down long pages to avoid scrolling | Pending |
| | Allow users to explore other sections in the system | No |
| | Set the cursor to automatically move to the next data input field on the registration page | No |
| | Allow non-alcohol users to skip alcohol-related questions at user exclusion | Pending |
| Help and documentation | Add links to outside info for related topics | No |
Among the problems identified, only one was corrected immediately after the heuristic evaluation: the questions on the original user screening page were separated into two pages, so that individuals who report current alcohol use answer two follow-up questions that those who answer ‘no’ skip. Suggestions to highlight errors, to shorten the long questionnaires, and to allow users to explore other parts of the tailoring system were put on hold; if the same problems were observed in the usability testing sessions, the changes would be made. The experts also commented on the lack of hyperlinks to information on other Web sites. The investigator did not consider this a problem, since the system was intentionally designed not to direct users to information outside HIV TIDES. Finally, the experts wanted the cursor to move automatically to the next column during date-of-birth input; this change was not made because the function might confuse novice users.
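The revised screening flow amounts to simple skip logic. The sketch below illustrates the idea; the function and field names are hypothetical and not taken from the actual HIV TIDES implementation (which was written in PHP):

```python
def screening_pages(answers):
    """Return the ordered list of screening pages a user must complete.

    Hypothetical sketch of the revised skip logic: only users who answer
    'yes' to current alcohol use see the follow-up alcohol questions.
    """
    pages = ["screening_basic"]
    if answers.get("current_alcohol_use") == "yes":
        pages.append("screening_alcohol_followup")
    return pages
```

Splitting the screening across pages this way keeps non-drinkers from ever seeing questions that do not apply to them.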
ii. Access
Four identified usability problems were categorized as access problems (Table 3). Two problems related to “User control and freedom” and one related to “Flexibility and efficiency of use.” One comment, concerning a system time-out function to protect privacy and data confidentiality, did not fit into any of Nielsen's proposed concepts.
Table III.
— Heuristic Evaluation: Summary of Comments - Access
| Heuristic Concept | Comments | Change |
|---|---|---|
| User control and freedom | Allow users to change answers after data submission | No |
| | Allow users to go back and forward to different pages | No |
| Flexibility and efficiency of use | Make the system work for both IE and Netscape browser | No |
| NOS | Add time-out function for info protection | Yes |
Note: NOS = no other specified usability concepts.
The time-out function was added to the system immediately after the heuristic evaluation. System changes were not made to reflect the other comments regarding access. For tailoring purposes, system users were not allowed to change their answers after clicking the submission button, nor were they allowed to freely return to earlier pages after passing the primary and secondary tailoring assessments. HIV TIDES failed to load when the experts tried to open it in Web browsers other than Internet Explorer; given the limited time and resources, the system design focused on compatibility with Internet Explorer only.
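The idle time-out can be illustrated with a small sketch. The class name, the five-minute threshold, and the API are illustrative assumptions, not details reported for HIV TIDES:

```python
import time

class Session:
    """Hypothetical sketch of an idle time-out for privacy protection:
    a session expires after TIMEOUT seconds without user activity."""

    TIMEOUT = 300  # illustrative value: 5 minutes of inactivity

    def __init__(self, now=None):
        # Record the start of the session as the last activity.
        self.last_activity = time.monotonic() if now is None else now

    def touch(self, now=None):
        # Call on every user action to reset the inactivity clock.
        self.last_activity = time.monotonic() if now is None else now

    def expired(self, now=None):
        # True once the inactivity window has been exceeded.
        now = time.monotonic() if now is None else now
        return now - self.last_activity > self.TIMEOUT
```

On expiry a real system would also clear any on-screen assessment data and return the user to the login page, since the goal is to protect answers left visible on a shared computer.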
iii. Information Design
Sixteen usability comments were related to problems with information design, most of which were covered by six usability concepts (Table 4). Two suggestions to revise the system's content in order to protect users' privacy and to increase their willingness to complete the assessment did not fall under any of the ten usability concepts. Four problems related to the concept of “Error prevention” were significant flaws that were corrected right after the heuristic evaluation: adding a number-check function for the date of birth entry, replacing underlined hyperlinks with buttons for navigation, replacing underlined font with bold to avoid confusion, and highlighting important instructions in bold. Three problems relating to the concept of “Aesthetic and minimal design” were also addressed. Pictures were added to the messages on a one-page-one-picture basis; the human figures in the pictures were mostly Black or Hispanic to reflect the population composition of the target community. The instructions on the user login page were revised, and redundant messages were rewritten to leave more empty space on the page. One problem related to “Help and documentation” concerned typos found on the registration page; the other comment in this category suggested revising the assessment choice options.
Table IV.
— Heuristic Evaluation: Summary of Comments - Information Design
| Heuristic Concept | Comments | Change |
|---|---|---|
| Visibility of system status | Add page numbers or a visual cue on pages to indicate how many pages the user has gone through and can expect | Pending |
| Match between system and the real world | Change the yellow/green color at the assessment pages | No |
| | Need more information about the CES-D Assessment | Yes |
| | Move the assessment result to the top of each message | No |
| Consistency and standards | Enlarge the font size of the messages | No |
| Error prevention | Add a number-check function for the birth date entry | Yes |
| | Change the underlined sub-headings to avoid confusion | Yes |
| | Replace underlined links with buttons | Yes |
| | Highlight the description of choice options for each assessment tool | Yes |
| Aesthetic and minimal design | Revise the login instruction to avoid confusion | Yes |
| | Add more pictures to the pages | Yes |
| | Decrease the amount of content to create empty spaces | Yes |
| Help and documentation | Correct two typos on the registration page | Yes |
| | Simplify the long questionnaire instructions and the description of choice options | Yes |
| NOS | Remove "take a few minutes" from the message to avoid intimidating users | Yes |
| NOS | Remove the word "HIV" in the messages to protect privacy | Yes |
Note: NOS = no other specified usability concepts
The experts reported three usability problems related to “Match between system and the real world.” As a result, an explanation of the CES-D assessment was added. However, changes were not made to address the other two problems; the color preference for the interface design and the order of the paragraphs in each message were considered the evaluators' subjective opinions. In addition, although one expert suggested adding a visual cue to the pages to show the system status, the investigator decided not to make any change at this stage and to wait until more evidence was collected through usability testing.
At the end of the heuristic evaluation phase, the system iteration was based on 14 identified usability problems. Five of the identified problems were shelved to be confirmed by subsequent usability tests. These were related to information design (n=3), error correction (n=1), and user freedom (n=1). Seven problems were dismissed because the suggestions were inconsistent with tailoring design principles (n=3) and current system requirements (n=1), did not adhere to system consistency (n=1), or were identified as personal preferences of the experts (n=2).
3.2. Usability Test
Six PLWHA were recruited for usability testing (Table 5). Participants included one female and five males who represented novice (n=1), intermediate (n=2), and expert computer users (n=3). The four participants who reported current computer use indicated that they used a computer either at home or in the public library. Although most participants had prior computer experience, they were not familiar with the Internet and e-mail. Usability problems were summarized under i) basic computer operation skills, ii) response to the system messages, iii) system navigation, and iv) the design of the messages/instructions.
Table V.
— Background of the Usability Testing Participants
| Participant | Gender | Computer exp. | Frequency of use | Internet | E-mail | Test scenarios |
|---|---|---|---|---|---|---|
| 1 | M | No | N/A | N/A | N/A | 5, 6, help |
| 2 | M | 1−6 m. | No | No | No | 1, 5, privacy |
| 3 | F | 1−6 m. | <1hr/d | No | Yes | 4, 6, help |
| 4 | M | 6−12 m. | >8hr/d | Yes | Yes | 2, 3, contact |
| 5 | M | 6−12 m. | 4−6hr/wk | Yes | No | 1, 4, privacy |
| 6 | M | 6−12 m. | 2hr/wk | Yes | No | 2, 3, contact |
Note: N/A = Not applicable.
Usability problems with basic computer operation skills were related to 1) answering questions using the free text field, drop-down menu, and/or radio buttons; 2) changing answers in the free text field, drop-down menu, and/or radio buttons; 3) scrolling down on long pages; and 4) maximizing or minimizing the size of the window. Users also encountered problems when responding to the system messages, such as the error warning, the confirmation for printing demand, and the closing of pop-up windows. System navigation was a problem for individuals who did not have Internet experience, as these participants did not know that they were supposed to click on buttons to progress to the next page or to submit input data.
The rest of the usability problems were related to the design of the messages. Some users ignored the suggested format of date of birth on the registration page and many others felt that the instructions for goal setting to increase level of social activities were not clear. It was also suggested that excluded users who have current substance abuse problems should still have the opportunity to assess their current level of depressive symptoms.
Based on the results of the usability testing, the registration page of HIV TIDES was redesigned: the drop-down menu and the birth date input format were revised. The items for social activity goal setting were also split across two pages; to avoid confusion, the first page asks the user to pick activities that do not involve other people, while the next page asks him/her to choose activities requiring social interaction. To reduce scrolling, page lengths were shortened substantially, especially for the questionnaires. A short training program was designed to orient potential system users and to help them practice basic skills before starting HIV TIDES.
The revision of the content showed a decrease in the average word count from 109 to 103. Regarding the level of readability, there was a decrease from 6.6 to 6.0 based on the Flesch-Kincaid Grade Level provided by Microsoft Office Word 2003 (Table 6). Major changes were made to the questionnaires (word count from 240 to 147 and readability from 6.6 to 5.1). The total number of messages increased from 65 to 73.
Table VI.
— Summary of Changes in Average Readability and Word Count
| File type | Measure | Before HE | After HE | After UT |
|---|---|---|---|---|
| Intervention | # of Messages | 37 | 38 | 38 |
| | Readability | 6.4 | 6.8 | 6.4 |
| | Word count | 97.9 | 97.5 | 98.4 |
| Intervention/Tailored messages | # of Messages | 5 | 6 | 6 |
| | Readability | N/A* | N/A* | N/A* |
| | Word count | N/A* | N/A* | N/A* |
| Questionnaire | # of Messages | 6 | 7 | 14 |
| | Readability | 6.3 | 6.6 | 5.1 |
| | Word count | 275.8 | 239.7 | 146.9 |
| Support | # of Messages | 15 | 15 | 16 |
| | Readability | 5.49 | 6.2 | 6.1 |
| | Word count | 80.9 | 86.3 | 81.4 |
| Total | # of Messages | 63 | 65 | 73 |
| | Readability | 6.1 | 6.6 | 6.0 |
| | Word count | 111.9 | 109.2 | 102.7 |
Note: HE = Heuristic Evaluation; UT = Usability Testing.
* Readability and word count did not apply to tailored intervention messages.
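The readability values above come from the Flesch-Kincaid Grade Level, which is computed from sentence length and syllable counts as 0.39 × (words/sentences) + 11.8 × (syllables/words) − 15.59. The sketch below is a rough approximation: the vowel-group syllable counter is a crude heuristic, so its results will differ slightly from those of Microsoft Word's built-in statistics:

```python
import re

def count_syllables(word):
    """Rough vowel-group heuristic; dictionary-based tools are more accurate."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    # Drop a syllable for a silent trailing 'e' (e.g. "make"), but never below 1.
    if word.lower().endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def fk_grade(text):
    """Flesch-Kincaid Grade Level:
    0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words)
            - 15.59)
```

Short sentences of one-syllable words score at or below the early grade levels, which is why shortening the questionnaire messages lowered their readability score from 6.6 to 5.1.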
4. Discussion
According to Fowler [10], “Usability is the degree to which a user can successfully learn and use a product to achieve a goal.” An information application should be user-centered, in that it is based on knowledge of the target users, in particular their technological and physical capacities, their cultural context, and their information needs [11]. The central goal of formal usability testing is to determine the primary areas of difficulty in the current approach to experiment generation and to assess the overall level of a system's usability. In addition, usability testing can identify bugs and other design flaws, so that they may be corrected.
In this study, two usability evaluation methods were utilized to improve the design of HIV TIDES. The two methods, heuristic evaluation and usability testing, helped us to obtain both the experts' point of view and the system end users' real experiences interacting with the prototype. The iteration of the HIV TIDES development was based on the comments from each evaluation session. Some serious usability problems were corrected immediately after a single evaluation session without waiting for the completion of all scheduled sessions. Nielsen and Landauer [6] indicated that the best results in usability evaluations come from carrying out as many small tests as possible and that correcting most of the problems before the end of the usability tests will result in a more effective and pleasant experience for the later participants [12].
Lathan et al. [13] suggested that during usability evaluation experts tend to focus on “making things work,” or what is referred to as “functionality,” while end users are usually more interested in performance, or their ability to use the technology efficiently and effectively. Comparing the results from the heuristic evaluation to those of the usability testing, it was not surprising to observe that experts and end users are faced with different requirements and tend to focus on different sets of issues.
The study results suggested that experts reported more problems related to information design, but were less concerned with navigation and access. However, system end users in the usability testing had more concerns related to system navigation, access, and basic computer operating skills, such as how to move to the next page or how to change an answer, but had very few questions about information design.
Comments received in the usability evaluation sessions regarding the design of HIV TIDES were very similar to experiences reported by developers of other disease management systems [14-19]. It was proposed that navigation and information design should be clear, use frames intuitively, have one item per page, require minimal scrolling, and fit page content within the frame body [14]. Regarding the content composition to assist comprehension, Naismith and Stein [15] suggested that a continuum of strategies, such as using explanatory phrasing and providing glossaries, should be employed in written and verbal communication to bridge the gaps in understanding.
The design aspects identified by experts as inappropriate for seniors included small font size, too much information on one page, and a failure to provide instructions [16]. The experts suggested that older adults prefer a simple design with clear instructions, since a cluttered screen easily distracts them from performing tasks and makes it difficult to find the correct links; scrolling should also be avoided [17-19]. Some of these design principles can also be applied to individuals who have little knowledge of computer technology or limited computer experience, such as the system end users of HIV TIDES in this project. Participants in the usability testing also expressed difficulties in performing the scrolling function and in understanding the instructions for some tasks.
There were several limitations of this study. The heuristic evaluation was conducted first during the system iteration process to evaluate and refine HIV TIDES. This method is comparatively inexpensive and quick compared to other evaluation methods, because users and special facilities are not needed. However, it was a concern that experts might be biased toward the design of the system. One drawback of heuristic evaluation is that it does not provide design solutions; instead, it simply identifies usability problems based on heuristic principles. Also, it does not address the positive aspects of the design. Consequently, a brainstorming session was conducted by the investigator at the end of the process to generate design ideas. Comments were discussed at the end of each expert's session to explore possible solutions for the identified usability problems. General usability principles were also used as guidelines for fixing or redesigning HIV TIDES before the usability tests with end users. However, with a small number of experts, it may not be possible to capture all the problems that target users might encounter.
Usability tests with system end users were conducted after the heuristic evaluation to evaluate the design of HIV TIDES. The think aloud technique has limitations for usability testing. The main limitation is that thinking aloud while using a system feels unnatural to most users, and the tasks can feel harder to perform as a result. Different case studies have also shown that verbalization can affect users' normal performance [20-21]. In this study, inexperienced users especially had problems verbalizing what they were thinking. Sometimes they did not know what to do or what to say because they had limited prior experience using a computer. The investigator had to take an active role in assisting them by asking questions or suggesting the proper actions. A brief orientation on basic computer skills and system flow was necessary before they could start their individual usability testing sessions.
Another important limitation is that the think-aloud method requires substantial interpretation by the experimenter [5]. The investigator mitigated this by discussing each observed problem with participants as it was encountered and again in the debriefing session at the end, which made it possible to capture the major problems in the collected data. HIV TIDES was refined based on the results of the usability tests and was well accepted by PLWHA. Furthermore, a training session was designed especially for individuals with limited prior computer experience; it helped them learn and practice basic computer skills and familiarize themselves with the functions provided by HIV TIDES.
Confidentiality and privacy concerns are especially crucial in studies that recruit PLWHA or focus on mental health. In this study, these issues were addressed by using coded numbers for case identification and by destroying the audio and video recordings of the testing sessions, along with the transcripts, after completion of the study's final report.
5. Conclusion
The design process of HIV TIDES involved experts and system end users at different stages to ensure that the content and the user interface design of the prototype were appropriate. Findings from the usability evaluation sessions can help other designers of consumer informatics applications better understand the usability problems that end users with limited computer experience may encounter, and the need to provide proper training before introducing an information system. Involving novice computer users in the usability testing provided valuable information about the primary user interface design problems that limited their interactions with the system, and the system design was modified to reflect these users' needs. Usability experts and system end users provide valuable and complementary expertise that is essential for developing systems that are perceived to be both easy to use and useful.
Acknowledgement
This project was supported by T32 NR 007969: Reducing Health Disparities through Informatics and P20 NR 007799: Center for Evidence-based Practice in the Underserved.
References
- 1. Lai T, Bakken S. Caseworkers' perceptions of the mental health information needs of persons living with HIV/AIDS. Stud Health Technol Inform. 2006;122:30–35.
- 2. Bandura A. Social Learning Theory. Prentice-Hall; New Jersey: 1977.
- 3. Lai T, Jenkins M, Bakken S. Tailoring intervention for depressive symptoms in HIV-infected African American women. Proceedings of MedInfo; 2004; CD 1705.
- 4. Nielsen J. Heuristic evaluation. In: Nielsen J, Mack RL, editors. Usability Inspection Methods. John Wiley & Sons; New York: 1994.
- 5. Preece J. Online Communities: Designing Usability, Supporting Sociability. John Wiley & Sons; Chichester, UK: 2000.
- 6. Nielsen J, Landauer TK. A mathematical model of the finding of usability problems. Proceedings of ACM INTERCHI'93 Conference; 1993. pp. 206–213.
- 7. Nielsen J. Usability Engineering. Academic Press; Boston: 1993.
- 8. Nielsen J, Molich R. Heuristic evaluation of user interfaces. CHI '90 Conference Proceedings; 1990. pp. 249–256.
- 9. Department of Health and Human Services. Methods for designing usable web sites: conducting and using usability tests. 2004 [accessed October 20, 2004]. http://www.FirstGov.gov.
- 10. Fowler S. Appendix B: usability tests. In: GUI Design Handbook. McGraw-Hill; New York: 1997 [accessed December 2, 2005]. http://www.fastconsulting.com/appb.htm.
- 11. Monash University. ITS usability workshop. 2001 [accessed December 2, 2005]. http://www.its.monash.edu.au/web/slideshows/usability/all.htm.
- 12. Schneider W, Bolger DJ, Eschman A, Neff C, Zuccolotto AP. Psychology Experiment Authoring Kit (PEAK): formal usability testing of an easy-to-use method for creating computerized experiments. Behav Res Methods. 2005;37:312–323. doi: 10.3758/bf03192699.
- 13. Lathan CE, Sebrechts MM, Newman DJ, Doarn CR. Heuristic evaluation of a web-based interface for internet telemedicine. Telemed J. 1999;5:177–185. doi: 10.1089/107830299312140.
- 14. Ebenezer C. Usability evaluation of an NHS library website. Health Info Libr J. 2003;20:134. doi: 10.1046/j.1365-2532.2003.00450.x.
- 15. Naismith R, Stein J. Library jargon: student comprehension of technical language used by librarians. College and Research Libraries News. 1989;50:543–552.
- 16. Nahm ES, Preece J, Resnick B, Mills ME. Usability of health web sites for older adults: a preliminary study. Comput Inform Nurs. 2004;22:326–334. doi: 10.1097/00024665-200411000-00007.
- 17. Ellis RD, Kurniawan SH. Increasing the usability of online information for older users: a case study in participatory design. Int J Hum Comput Interact. 2000;12:263–276.
- 18. Lustig C, Tonev S, Hasher L. Visual distraction and processing speed. Cognitive Aging Conference; Atlanta, GA; 2000.
- 19. Morrell RW, Dailey SR. The Interactive AgePage online learning project. Second Biennial Conference: Older Adults, Health Information, and the World Wide Web; Bethesda, MD; 2001.
- 20. Berry DC, Broadbent DE. The role of instruction and verbalization in improving performance on complex search tasks. Behav Inf Technol. 1990;9:175–190.
- 21. Nisbett RE, Wilson TD. Telling more than we can know: verbal reports on mental processes. Psychol Rev. 1977;84:231–241.
