Abstract
Drchrono is an Electronic Healthcare Record (EHR) application designed specifically for the iPad. Described as portable, efficient and accessible anywhere, drchrono has several features that might be attractive to health care providers. However, EHRs have to conform to certain Federal Healthcare Information Technology Guidelines, which are evaluated in a series of 12 test procedures, defined by the Office of the National Coordinator for Health Information Technology (ONC). In this study, we evaluated Test Procedure for §170.302 (c) Maintain up-to-date problem list, in drchrono. The methodology for our evaluation was contained within Zhang’s unified framework for usability, using UFuRT, i.e. user, functional, representation and task analyses. Based upon the analysis, using Adobe Flex, we then designed a prototype that corrected or improved on perceived weaknesses in the functionality served by the test procedure. We applied the test procedure taxonomy to a prototypic modification of drchrono, and then repeated the UFuRT usability analysis. We also used a 14-item heuristic evaluation by each member of our informatics team. Our findings support a conclusion that UFuRT is a valuable tool to evaluate EHR usability and that an “up-to-date problem” list may be customized, according to healthcare provider preference.
Introduction
The Electronic Health Record (EHR) is a developing technology defined as a systematic collection of electronic health information about individual patients or populations. (1) It is a digital record format capable of being shared across different health care settings by being integrated into network-connected, enterprise-wide information systems. The modern EHR includes a plethora of data in various formats, including demographics, medical insurance data, medical history, medications and allergies, immunization status, laboratory test results, radiology images, and vital signs. This record should enable automation, streamlining of workflow, and increased patient safety, among other benefits. Accordingly, current spending and allocation of resources for EHR development is rapidly escalating. However, an important consideration for any EHR system is its “usability.” Usability refers to how usable a product is for a human user in terms of ease of use, ease of learning, subjective satisfaction, flexibility and customizability, and user error. This is the information design that allows users to easily navigate through the system and efficiently use all the features and functions necessary to meet meaningful use requirements. It is vital for an EHR system to have an intuitive user interface, in order to give all users, regardless of their technical background, confidence in the system – allowing them to perform their daily tasks with less effort. Good usability is achieved through user-centered design, using technology to fit user needs and characteristics rather than training users to adapt to the technology.
A more recent expansion of the EHR field has focused on hand-held or portable tablet computers, such as the Apple iPad. These systems give the user the advantages of portability with rapid EHR access in different environments. A California-based company, Drchrono, founded in 2009 by Daniel Kivatinos and Michael Nusimow, has developed one such system, drchrono. The product is relatively new, which prompted our investigation of its usability. The HHS Office of the National Coordinator for Health Information Technology (ONC) has defined several standards, implementation guides, and certification criteria, used within well-defined test procedures, to which current EHR systems must adhere. (2) One such test procedure, titled “Test Procedure for §170.302 (c) Maintain up-to-date problem list,” focuses attention on the patient’s “problem list,” or list of active diagnoses. We elected to evaluate drchrono by examining its usability with respect to test procedure §170.302 (c). (2)
Methodology
ANALYSIS OF DRCHRONO EHR
User Analysis:
The first step in applying the unified framework for usability analysis (UFuRT) proposed by Zhang et al (3) is the User Analysis, displayed in horizontal and vertical dimensions. For the use case of the drchrono iPad EHR system, the horizontal dimension focuses on the users’ profession, job role, or background, i.e., all healthcare professionals and related workers, including physicians, nurses, dentists, medical students, secretarial staff, data entry personnel, and project/office managers. The vertical dimension addresses the users’ characteristics (education, computer literacy, frequency of use, distraction in the work environment, and availability of time). Vertical and horizontal dimension categories were rated as High (H), Intermediate (I), or Low (L), as shown in Table 1.
Table 1:
Drchrono EHR System Users
| Users | Frequency of Use | Domain Knowledge | Available Time | Motivation | Work Environment | Software Knowledge | Computer Knowledge | System Support |
|---|---|---|---|---|---|---|---|---|
| Nurses | H | H | L | H | I | L | H | I | I |
| Med St | H | I | I | H | I | L | I | I | I |
| MD/DDS | I | H | L | I | L | L | I | I | I |
| Data Entry | H | L | H | H | I | H | I | I |
| Secretary | L | L | I | L | I | I | I | I |
| Manager | H | I | H | H | L | H | H | I |
| IT Person | H | L | H | H | L | H | H | I |
As shown in Table 1, our user analysis identified many different users, each with its own role in the system. The table makes it apparent that two completely different user backgrounds, IT and medical personnel, share a high frequency of system use. Our analysis also attempted to correlate users’ knowledge (computer or medical) with roles, motivation, time to learn, distractions, etc. Most conclusions can be drawn directly from the table; the important aspects are highlighted as follows.
Nurses and medical students differ in a few respects when using the system. Students may have more time to use the system, but they are still learning domain issues. Their motivation is high, but their work environment varies from low to intermediate distraction depending on their field of study or work and their assignments. The work environment of RNs can also vary by field of work or by hospital: some nursing positions, such as those in the intensive care unit, can be relatively busy, while others involve less distraction.
In contrast, physicians and dentists have low to intermediate interest in using EHR systems, because this may change their daily routine. They also have little work environment distraction and not much available time to use the system.
It should be noted that secretarial and data entry personnel have different roles in using the system. Secretaries, for the most part, are not responsible for entering data into the system; they deal with daily routine and domain issues, which is why they have low interest in using the system. Data entry personnel, on the other hand, have high motivation and use the system frequently, so they come to know each part of the system and do good work. IT personnel and managers differ in domain (medical) knowledge, but both have high motivation and use the system frequently, for administrative or technical reasons respectively.
Functional Analysis:
The next step was the Functional Analysis, in which we examined, from the bottom up, the interface’s functions for performing domain-necessary tasks and assessed probable areas for improvement. To summarize the method, several screens of the application were decomposed into basic widgets, and each widget was coded appropriately.
The three major screens identified were the Patient Screen, where the user selects and views summarized patient information (Figure 1); the Chart Screen, where the user selects a specific section of the patient’s chart, such as the problem section, and interacts with the information (Figure 2); and the Problem Screen, where the user updates an existing problem or creates a new patient problem (Figure 3).
Figure 1.
“Patient Screen” from Drchrono EHR
Figure 2.
“Chart Screen” from Drchrono EHR
Figure 3.
“Problem Screen” from Drchrono EHR
Each screen was decomposed into basic widget types. Each widget was classified as either an object (B) or an operation (P). Each widget was then further classified as either a Domain Function (Y), which pertains to the work itself, or an Overhead Function (N), which pertains to a system-level function, like closing the application. For example, the “Update” button widget that completes the creation or updating of a problem would be identified as an Operation and a Domain Function.
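As an illustration, this coding scheme can be tallied programmatically. The widget names and labels below are hypothetical examples in the spirit of the analysis, not drchrono’s actual widget inventory.

```python
# Hypothetical sketch of the widget-coding step: each widget on a screen is
# classified as an object ("B") or operation ("P"), and as a domain ("Y") or
# overhead ("N") function, then the per-screen totals are tallied.
from collections import Counter

# (widget name, B/P, Y/N) -- illustrative entries only
problem_screen_widgets = [
    ("Update button",      "P", "Y"),  # completes creating/updating a problem
    ("Problem name field", "P", "Y"),
    ("Status selector",    "P", "Y"),
    ("Screen title",       "B", "N"),  # system-level chrome
    ("Close button",       "P", "N"),
]

ops = Counter(kind for _, kind, _ in problem_screen_widgets)
funcs = Counter(kind for _, _, kind in problem_screen_widgets)

print(f"Operations: {ops['P']}, Objects: {ops['B']}")
print(f"Domain: {funcs['Y']}, Overhead: {funcs['N']}")
```

Repeating this tally per screen yields the summary totals reported in the functional-analysis tables.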
For each screen, pertinent widgets were identified (highlighted in red) and accounted for in the final assessment. By observation, the Patient Screen had more Objects than Operations and more Overhead Functions than Domain Functions. The Chart Screen had roughly equal numbers of Objects and Operations, and of Domain and Overhead Functions. Finally, the Problem Screen had more Operations than Objects and more Domain than Overhead Functions. Table 2 shows a total summary of the operations, objects, and types of functions.
Table 5.
Functional Analysis – Sunflower Prototype Summary Table
| Patient Screen | Chart Screen | Totals | |
|---|---|---|---|
| Operations | 4 | 14 | 18 |
| Objects | 3 | 8 | 11 |
| Domain Functions | 5 | 14 | 19 |
| Overhead Function | 2 | 8 | 10 |
Task Analysis:
GOMS (Goals, Operators, Methods, and Selection Rules) analysis was performed on drchrono. (4) With this method, each step required to complete a task, whether mental or physical, was recorded. The GOMS method was applied to the steps required to perform the use case outlined in the NIST guideline for each screen the user encounters. Pertaining to the use case, six major goals were identified:
Navigating to the problem list – upon start-up, the user must navigate to the Chart Screen to access the Problem List.
Record a problem through the toggle button – one method of recording a new problem is toggling the “Edit” button.
Record a problem through autosearch – the user has the option of typing the problem name into the autosearch field to initiate recording of a new problem.
Read a problem record from the “Active Problem” list – a tab button lets the user find and tap an existing problem of the patient to read it.
Read a problem record from the “Historical Problem” list – a tab button lets the user find and tap a problem from the patient’s historical problem list.
Update a problem record – the user navigates to a screen to make changes to an existing problem and saves the changes.
With each of these goals identified, a list of physical and mental tasks were identified and recorded.
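The per-goal recording of physical and mental steps can be sketched as follows; the step sequences here are hypothetical illustrations of the GOMS bookkeeping, not drchrono’s actual recorded counts.

```python
# Hypothetical GOMS-style tally: each goal is recorded as a sequence of steps
# labeled physical ("P", e.g. tap, type) or mental ("M", e.g. locate, decide).
goals = {
    "Navigate to problem list":    ["M", "P", "M", "P"],       # find patient, tap chart, ...
    "Record problem (autosearch)": ["M", "P", "P", "M", "P"],  # locate field, type, select, ...
    "Update problem record":       ["M", "P", "P", "P"],       # locate problem, edit, save
}

physical = sum(step == "P" for steps in goals.values() for step in steps)
mental = sum(step == "M" for steps in goals.values() for step in steps)
print(f"physical={physical}, mental={mental}, total={physical + mental}")
```

Summing across all six goals in this fashion produces the physical/mental task totals compared later in Table 7.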
Hierarchical Task Analysis (5) was also performed on drchrono. With consideration for the use case, the analysis revealed 43 nodes and 4 levels of depth for the various tasks needed to perform the problem list maintenance use case.
Heuristic (Representation) Analysis:
Representational analysis is a methodology for the study of the representational effect in distributed cognitive tasks (6) and attempts to find the best display for every operation. Heuristic analysis operates primarily at the level of representational analysis and is used to identify major usability problems of a product in a timely manner at reasonable cost. (7) Two evaluators used the 14 Nielsen–Shneiderman heuristics to perform our analysis. (8) There were six heuristic violations for the current drchrono system under the “Test Procedure for §170.302 (c) Maintain up-to-date problem list.” This is a relatively small number of violations, since we were dealing with minimal functionality. The problems we found and the solutions we propose, ordered by severity rating, are:
Appointment Date and Time – severity 3.5; violated heuristics 4 (Minimalist) and 5 (Memory Load). Entering the appointment date and time is an unnecessary step on the way to the problem list; the solution is to remove these steps and shorten the task.
Content Pane of Problem List – severity 3; violated heuristics 1 (Consistency) and 3 (Match the system and world). The edit bar does not indicate whether the user should type the problem code or number; the edit tab should clarify which is expected.
Left Pane and Content Pane – severity 2; violated heuristic 1 (Consistency). The left pane buttons and the content pane buttons should not differ; the colors should be kept the same for both.
Toggle Version of Drchrono – severity 2; violated heuristics 4 (Minimalist) and 5 (Memory Load). Too many steps (13) are required to enter a problem for a patient; this should be reduced in Sunflower.
Navigating the problem list from left pane to content pane – severity 2; violated heuristics 6 (Feedback) and 10 (Closure). There should be a clear indication that problem entry is complete; feedback must be given to the user.
Updating the content pane for the problem list – severity 1.5; violated heuristics 1 (Consistency) and 12 (Language). Updating a problem is easily confused with changing its status; the distinction should be clearer.
We did not construct a usability problem graph because of the limited functionality covered by Test Procedure §170.302. Instead, we chose a severity-ratings bar graph (Figure 5) and times of occurrence (Figure 6) across the fourteen heuristics, which better display the important usability problems for each metric.
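The per-heuristic occurrence counts and average severities plotted in Figure 6 can be derived directly from the six violations listed above, along these lines:

```python
# Derive per-heuristic statistics from the six violations listed above:
# each entry is (severity rating, list of violated heuristic numbers).
from collections import defaultdict

violations = [
    (3.5, [4, 5]),   # appointment date and time
    (3.0, [1, 3]),   # content pane of problem list
    (2.0, [1]),      # left pane vs content pane
    (2.0, [4, 5]),   # toggle version
    (2.0, [6, 10]),  # navigating left pane to content pane
    (1.5, [1, 12]),  # updating the content pane
]

by_heuristic = defaultdict(list)
for severity, heuristics in violations:
    for h in heuristics:
        by_heuristic[h].append(severity)

for h in sorted(by_heuristic):
    scores = by_heuristic[h]
    mean = sum(scores) / len(scores)
    print(f"heuristic {h}: occurrences={len(scores)}, mean severity={mean:.2f}")
```

For example, heuristic 1 (Consistency) is violated three times with a mean severity of about 2.17.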
Figure 5:
Severity of Heuristics Violated for drchrono EHR Test Procedure for §170.302
Figure 6:
Average Severity Rating (blue) and times of occurrence (red) of Heuristics Violated for Drchrono EHR Test Procedure
“SUNFLOWER” PROTOTYPE
In this section, the mock interface prototype is discussed, based on findings from the UFuRT analysis of drchrono’s problem list feature. Acknowledging that a tablet-based interface is best understood through an operational example that allows users to interact with the software, a live mockup was developed using Adobe Flex/Flash Builder for rapid deployment to a mobile device for testing and evaluation. Due to the licensing cost of iOS development, evaluation was performed on Android-based tablets. As noted earlier, the results of the previous evaluation influenced certain design decisions, making the prototype more streamlined and intuitive. Following the methodology of the previous example, the UFuRT usability framework was used to assess the prototype. Later sections compare the prototype and drchrono, including observations and lessons learned from developing interfaces for mobile devices.
ANALYSIS OF PROTOTYPE
User Analysis:
The user analysis that was applied to the drchrono software evaluation also applies to the Sunflower prototype. Our end users are the same, but there are a few differences concerning aspects of use.
With these goals achieved and with the reduction in cognitive load needed to use the prototype, we expect all users to be more motivated to use the system and to have more free time, since less effort is needed to do the same tasks. We also expect users to use the system more frequently, since it will be easier and more enjoyable to do so.
Functional Analysis:
As was done for drchrono, Sunflower was decomposed into the two primary screens the user encounters when performing the stated use case. Each screen’s interface was decomposed further into basic widgets, filtering out non-essential widgets that do not pertain to the problem list. The two primary screens are the Patient Screen, which, similar to the drchrono implementation, is the first screen end users see and where they find a patient whose records they wish to modify (Figure 6), and the Chart Screen, which aggregates the various patient-related data ranging from billing and medical history to problem lists (Figure 7).
Figure 7.
“Chart Screen” - Sunflower Prototype
For comparison, the relevant drchrono screens (i.e., Figures 1 and 2) are shown immediately next to Figures 6 and 7.
An additional screen, the Settings Screen (Figure 11), was incorporated specifically to customize the Chart Screen by removing functions that users would be less likely to need. For example, a clinician who is not responsible for billing may decide to toggle off the billing option.
Figure 11.
Settings Screen - Sunflower Prototype
Aside from the Settings Screen, the same procedure from the previous Functional Analysis was performed on Sunflower’s Patient Screen and Chart Screen – coding the essential widgets used in the problem list use case and denoting the function and operation types of each. Table 5 summarizes the findings for each screen, indicating a high number of domain functions in both the Patient Screen and the Chart Screen.
Task Analysis:
Similar to the drchrono evaluation, GOMS analysis was used to study the tasks involved in performing the use case for creating, updating, and reading items from a problem list. Unlike the drchrono evaluation, however, five major goals were identified for the use case scenarios: navigating to the problem list; recording a new problem; reading a problem from the active list; reading a problem from the historical list; and updating an existing problem.
Figure 9 summarizes the results, showing more physical than mental cognitive tasks.
Figure 9.
Cognitive Task Summary from GOMS Task Analysis for Sunflower Prototype
Additionally, Hierarchical Task Analysis of the prototype revealed 29 nodes and a depth of 3 levels.
Heuristic Analysis:
For the heuristic analysis of the prototype, two evaluators again used the 14 Nielsen–Shneiderman heuristics (8), as shown in Table 11. We resolved all of the usability problems described for drchrono, so there is no graph of heuristic problem ratings. We eliminated the appointment steps when navigating to the problem list, and we eliminated drchrono’s toggle version because it required too many steps. The customization in the prototype solves some of the usability problems and gives clinicians more control of the system. We attempted to follow usability principles by keeping the prototype’s functions as simple as possible; in general, clinicians use a limited amount of functionality most of the time, as the 80/20 rule, or Pareto Principle, states. (9)
COMPARISON OF DRCHRONO AND PROTOTYPE ANALYSES
Four of Shneiderman’s eight interface design guidelines apply to mobile devices without any explicit changes: (10)
i. Enable frequent users to use shortcuts. Time is more often critical to users of mobile devices, so, reducing the number of operations needed to perform repetitive tasks is important.
ii. Offer informative feedback for every operator action. Feedback should be substantial and clear.
iii. Design dialogs to yield closure, so that sequences of actions should give the user satisfaction of accomplishment and completion.
iv. Support internal locus of control. Design systems so that users feel they are controlling the mobile application and not vice versa.
The remaining four require modification for mobile devices: (10)
v. Consistency should be maintained between desktop and mobile applications because users may need to switch between them frequently.
vi. Reversal of actions is more difficult to achieve in mobile applications because devices have less memory and wireless connectivity may be unreliable.
vii. Error prevention and simple error handling need to be rapid and must take the physical design into consideration.
viii. Reduce short-term memory load, because the user has limited short-term memory and there are potentially more distractions. (11)
Some additional guidelines include:
ix. Design for multiple and dynamic contexts. Additional people, objects, and activities all vie for the user’s attention aside from the mobile application. Implement context-awareness and self-adapting functionality, whereby, as Schmidt says, “devices can see, hear, and feel.” (12)
x. Design for small devices, as technology advances and interaction techniques change.
xi. Design for limited and split attention, since mobile application users will be multitasking.
xii. Design for speed and recovery, since users may need to quickly change or access functions or applications.
xiii. Design for “top-down” interaction: presenting information through multilevel or hierarchical mechanisms is preferred, to prevent information overload on small screens. (13)
xiv. Allow for personalization of the device and applications, so that varying preferences among users are considered.
xv. Design for enjoyment, since emotion plays a large part in the interaction with a device; try to achieve a positive affective response. (10)
At the surface level, there have been significant improvements over drchrono in the Sunflower prototype as a result of the UFuRT analysis model. One is a reduction in the number of screen interactions, from three screens (Patient, Chart, and Problem) to two (Patient and Chart). Another is streamlined workflows for certain tasks: in drchrono there are two methods for recording a new problem, the toggle method and autosearch, while Sunflower has one. The following sections compare additional changes resulting from the UFuRT analysis framework.
Functional Analysis:
Comparing the totals for the two applications’ Patient and Chart screens reveals some important improvements. Aside from Domain functions, Sunflower has comparatively fewer Operations, Objects, and Overhead functions. In addition, Sunflower has more Operations than Objects, whereas drchrono has more Objects than Operations. Likewise, Sunflower has more Domain than Overhead functions, while drchrono has more Overhead than Domain functions. (Table 6)
Table 6.
Side-by-side comparison of summary totals (Patient and Chart screens) between drchrono and Sunflower prototype
| Drchrono | Sunflower | |
|---|---|---|
| Operations | 20 | 18 |
| Objects | 27 | 11 |
| Domain Functions | 20 | 20 |
| Overhead Functions | 27 | 10 |
Task Analysis:
While there were more physical than mental cognitive tasks in the problem list use case, the total number of tasks was significantly lower in Sunflower than in drchrono. Interestingly, the ratio of physical to mental tasks is 5 to 2 in both drchrono and Sunflower. (Table 7) Hierarchical task analysis revealed fewer nodes and a shallower depth in Sunflower than in drchrono. (Table 8)
Table 7.
Comparison of GOMS Task Analysis Results
| Drchrono | Sunflower | |
|---|---|---|
| Physical Tasks | 40 | 25 |
| Mental Tasks | 16 | 10 |
| Total Tasks | 56 | 35 |
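The 5-to-2 ratio noted above follows directly from the Table 7 counts; a quick arithmetic check:

```python
# Verify that both systems' physical:mental task counts (Table 7) reduce
# to the same 5:2 ratio, using gcd to reduce each pair.
from math import gcd

counts = {"drchrono": (40, 16), "Sunflower": (25, 10)}
ratios = {}
for system, (physical, mental) in counts.items():
    g = gcd(physical, mental)
    ratios[system] = (physical // g, mental // g)
    print(f"{system}: {physical // g}:{mental // g}")
```

Both pairs reduce to 5:2, confirming that Sunflower lowers the total workload without changing the physical/mental mix.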
Table 8.
Comparison of Hierarchical Task Analysis Results
| Drchrono | Sunflower | |
|---|---|---|
| Nodes | 43 | 29 |
| Depth | 4 | 3 |
Contextual Analysis:
The working environment in which a user interacts with both the application and the device on which it runs affects the ultimate usability of the application. A contextual analysis was performed on drchrono running on the iPad to ensure that both were compatible with the intended use environment and to identify any additional usability issues that might be addressed by our prototype. Twelve individuals representing three work roles (physician, nurse practitioner, and bedside nurse) were observed while interacting with drchrono running on the Apple iPad 2. They were given a short script instructing them to locate a test patient and then add, amend, and delete a problem from the problem list. Each individual enacted the script in three common work environments in a neonatal intensive care unit (NICU). Afterwards, the participants completed a survey that asked specifically about contextual issues associated with the application itself and the device on which it ran, plus some more general questions about usability. Participants were also allowed to make additional comments.
When asked to rate how comfortable they felt with the weight, screen size, and display qualities (color, brightness, contrast, resolution) on a scale from 1 to 5 (1 being most comfortable), the participants’ mean ratings for the iPad 2 in their own work environment were 2.08, 2.42, and 1.25, respectively. Many commented that the screen was smaller than those they currently used in the NICU, but that they did not feel like it unduly affected the usability of the application. They also reported that they were very satisfied with the WiFi access (mean of 1.08 on a satisfaction scale of 1 to 5, 1 being most satisfied) even when tested in areas of the NICU known for poor reception. The most common additional comment about the device was a concern for keeping it sterile, as it was a touch-responsive device that would be moved from patient to patient. Many other participants, however, pointed out that the wipe-able iPad display would likely be easier to keep clean than the hardware currently in use, i.e. standard keyboard and mouse.
The participants also reported that they were comfortable with the touch-screen interface and that the existing drchrono interface was relatively easy to navigate, with respective means of 1.17 on a comfort scale of 1 to 5 (1 being most comfortable) and 2.42 on an ease scale of 1 to 5 (1 being easiest). Overall, 83% (10 of 12) of participants reported that they would consider the application potentially helpful in their daily practice. When asked for additional comments or suggestions, some participants responded that they would like the system to allow additional customization, such as arranging the problems in the list by acuity or by body system.
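As a rounding check on the survey figures above: the per-respondent ratings below are hypothetical (only the reported means and the 10-of-12 share come from the study), but they show how a 12-person panel produces a mean of 2.42 and an 83% favorable share.

```python
# Illustrative check of the reported survey statistics: a 12-respondent
# rating list (hypothetical values) whose mean rounds to 2.42, and the
# 10-of-12 favorable share that rounds to 83%.
ease_ratings = [2, 3, 2, 3, 2, 2, 3, 2, 3, 2, 3, 2]  # seven 2s, five 3s
mean_ease = sum(ease_ratings) / len(ease_ratings)
favorable = 10 / 12

print(f"mean ease = {mean_ease:.2f}")  # 2.42
print(f"favorable = {favorable:.0%}")  # 83%
```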
CONCLUSION AND FUTURE DIRECTION
While the prospect of introducing mobile-based EHR applications appears attractive and lucrative, there are important challenges in designing interfaces for mobile touch applications. One consideration is the variety of form factors and dimensions. The user experience on a tablet differs somewhat from that on a mobile phone because of the greater screen real estate; an example is the split-navigator layout prevalent in tablet applications. This is further complicated by different manufacturers using different screen sizes, ranging from 5 to 10 inches, along with various screen DPIs.
Another lesson learned is that devices have their own idiosyncrasies, mainly due to their operating systems. For example, on Android devices a long-press gesture on a table list view presents the user with a pop-up of options, whereas Apple iOS does not offer that gesture for table list views. Using a platform-agnostic development tool such as Adobe Flex/Flash Builder restricted OS-specific gestures, since the tool takes a “one size fits all” approach to developing mobile applications; this may nonetheless be beneficial for usability and economic reasons.
Finally, another challenge to acknowledge is speed and reliable connectivity. Modern mobile applications use a cloud-based architecture, in which the app requests and receives data from a data server. How responsive an application is may influence users’ opinions of the software’s usability. Drchrono fetches data when logging in, retrieving a patient, or searching for an ICD problem based on user input. Firsthand experience revealed some lag, possibly from connectivity. Nonetheless, a routine that caches data locally may alleviate some of these responsiveness issues.
UFuRT is a valuable tool for helping interface designers mitigate user interface challenges, as evident in the results of this paper. While the focus was on one use case, namely retrieving, recording, and updating a problem list (§170.302 (c) Maintain up-to-date problem list), further analysis of drchrono against other use cases is recommended to show additional improvement. Another area yet to be explored is the use of gestures, unique to touch-screen devices, to improve the usability of mobile devices.
Figure 4:
Summary of GOMS Task Analysis for Drchrono EHR
Figure 6.
“Patient Screen” - Sunflower Prototype
Table 2.
Summary Table of Functional Analysis for Drchrono EHR
| Patient Screen | Chart Screen | Problem Screen | Totals | |
|---|---|---|---|---|
| Operations | 5 | 15 | 8 | 28 |
| Objects | 13 | 14 | 1 | 28 |
| Domain Functions | 5 | 15 | 7 | 27 |
| Overhead Functions | 13 | 14 | 2 | 29 |
REFERENCES:
- 1. Gunter TD, Terry NP. The Emergence of National Electronic Health Record Architectures in the United States and Australia: Models, Costs, and Questions in 2005. J Med Internet Res. 7. doi:10.2196/jmir.7.1.e3.
- 2. Department of Health and Human Services. 45 CFR Part 170, Health Information Technology: Initial Set of Standards, Implementation Specifications, and Certification Criteria for Electronic Health Record Technology; Final Rule. Jul 28, 2010.
- 3. Zhang J, Butler K. UFuRT: a work-centered framework and process for design and evaluation of information systems. HCI Int Proc. 2007:1–5.
- 4. Assessing performance of an Electronic Health Record (EHR) using Cognitive Task Analysis. International Journal of Medical Informatics. 2010;97:501–506. doi:10.1016/j.ijmedinf.2010.04.001.
- 5. Stanton N. Hierarchical Task Analysis: Developments, Applications and Extensions. School of Engineering and Design, Brunel University; Uxbridge, Middlesex, UK.
- 6. Zhang J, Norman D. Representations in Distributed Cognitive Tasks. Cognitive Science. 1994;18:87–122.
- 7. Nielsen J. Finding usability problems through heuristic evaluation. Proc. ACM CHI ’92; Monterey, CA; 3–7 May 1992. pp. 373–380.
- 8. Zhang J, Johnson TR, Patel VL, Paige DL, Kubose T. Using usability heuristics to evaluate patient safety of medical devices. Journal of Biomedical Informatics. 2003;36:23–30. doi:10.1016/s1532-0464(03)00060-1.
- 9. Bookstein A. Informetric distributions, part I: Unified overview. Journal of the American Society for Information Science. 1990;41:368–375.
- 10. Shneiderman B. Designing the User Interface: Strategies for Effective Human-Computer Interaction. Addison-Wesley; 1998.
- 11. Chan S, Fang X, Brzezinski J, Zhou Y, Xu S, Lam J. Usability For Mobile Commerce Across Multiple Form Factors. Journal of Electronic Commerce Research. 2002;3:187–199.
- 12. Schmidt A. Implicit Human Computer Interaction Through Context. Personal Technologies. 2000;4(2–3):191–199.
- 13. Brewster S. Overcoming the Lack of Screen Space on Mobile Computers. Personal and Ubiquitous Computing. 2002;6:188–205.












