Abstract
Despite the many diabetes applications available, the rate of use is low, which may be associated with design issues. This study examined app usability and compliance with heuristic design principles, guided by the Self-determination Theory of motivation. Four top-rated commercially available apps (Glucose Buddy, MyNetDiary, mySugr, and OnTrack) were tested for data recording, blood glucose analysis, and data sharing, functions important for diabetes competence, autonomy, and connection with a healthcare provider. Four clinicians rated each app’s compliance with Nielsen’s 10 principles and its usability using the System Usability Scale. All four apps lacked one task function related to diabetes care competence or autonomy. Experts rated app usability with the System Usability Scale as follows: OnTrack (61) and Glucose Buddy (60) received a “D,” and MyNetDiary (41) and mySugr (15) received an “F.” A total of 314 heuristic violations were identified. The heuristic principle violated most frequently was “Help and Documentation” (n = 50), followed by “Error Prevention” (n = 45) and “Aesthetic and Minimalist Design” (n = 43). The four top-rated diabetes apps ranged from “marginally acceptable” to “completely unacceptable” in usability. Future diabetes app design should target patient motivation and incorporate key heuristic design principles by providing tutorials with a help function, eliminating error-prone operations, and providing enhanced graphical or screen views.
Keywords: Diabetes app, Heuristic evaluation, mHealth, Motivation, Self-determination Theory, Usability
Diabetes is an increasingly prevalent chronic disease with significant morbidity and healthcare costs.1 Key strategies for better diabetes control include establishing a personalized care plan, adhering to medical nutrition therapy, monitoring blood glucose (BG), and taking medications.2 Smartphones can deliver a diabetes care plan in the form of an app to encourage patients to adopt healthy behaviors.3 Patients who used diabetes apps to track BG and carbohydrate (carb) intake had a greater diet adherence rate than those who used a paper log.4 Small clinical trials have shown that use of a diabetes app improved glycemic control, with HbA1c reductions of 0.4% to 1.9%.5–7 More than 1000 diabetes apps are available, but the rate of diabetes app usage is low because of disappointing app experiences.8 Common app functions include BG reading entry, carb intake entry, BG analysis, emailing BG logs, alert reminders, carb counting tools, social media access, and low-carb recipes. For some users, multiple app functions can make an app too complicated to use.9 A cluttered app screen of BG reports makes the reports difficult to view and understand.10 Users did not like apps that required multiple steps and too much time to use.11 Patient ratings of diabetes app usability range from poor to good; some users liked multiple functions, and others did not.10–12
Usability is defined by the International Organization for Standardization (ISO) 9241–11 as the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use.13 In this context, diabetes app usability includes the degree to which users, such as patients, are satisfied using the app to accomplish a particular task (eg, monitoring BG readings) effectively and efficiently. In one study, patients with diabetes tested the usability of a diabetes app and its accompanying Web site.10 They found 19 usability problem themes from 117 error incidents, with critical-severity problems for BG data import and review.10 Recent reviews of diabetes apps showed a wide range of usability, from acceptable to unacceptable.14,15 Most health apps, including diabetes apps, lack the application of health behavior theory in their design.16 To fill this critical gap, this study applied the Self-determination Theory (SDT), a health behavior theory of motivation and personality, as the basis for app design and usability evaluation.17
Motivation is a strong influence for patients with diabetes to engage in self-management behaviors18 and is enhanced by satisfaction of the inherent psychological needs of competence, autonomy, and relatedness.17 Interventions addressing these needs yield improved adherence to diet, medication, and exercise regimens.19 Applying SDT to diabetes self-management, competence refers to the patient’s desire to be confident in managing diabetes and keeping BG in the target range.20 Diabetes apps can target competence by displaying a report of BG patterns by “day” and by “meal,” which in turn increases the patient’s knowledge pertaining to diabetes numeracy and understanding of BG numbers. Autonomy means that patients desire empowerment through choices or options to change their behavior.20 Diabetes apps can facilitate autonomy by providing BG and carb intake pattern reports for daily meals; patients can see which meal options require better carb control and choose to change their diet or activities. Relatedness is the patient’s desire to care and be cared for.21 Patients adopt behaviors better when they feel supported by and connected to people they trust.20 In the mobile health technology setting, this can be the feeling of “connection or relatedness” when patients can communicate and share information with others. The goal of this study was to assess the usability of diabetes apps designed to help patients manage their diabetes, based on the SDT. The app functions examined were those targeting diabetes care competence, autonomy, and the data sharing necessary to connect with a healthcare provider.
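As an illustration only (not taken from the study), the sketch below shows how a diabetes app might aggregate BG readings by day of the week and by meal to produce the two report views linked above to competence and autonomy; the reading format, field names, and sample values are assumptions.

```python
from collections import defaultdict
from statistics import mean

# Illustrative sketch: group blood glucose (BG) readings by day of week
# and by meal, the two report views the text ties to competence and autonomy.
# The record structure and values below are assumed, not from the study.
readings = [
    {"day": "Mon", "meal": "breakfast", "bg_mgdl": 182},
    {"day": "Mon", "meal": "dinner", "bg_mgdl": 110},
    {"day": "Sat", "meal": "breakfast", "bg_mgdl": 201},
    {"day": "Sat", "meal": "dinner", "bg_mgdl": 145},
]

def summarize(readings, key):
    """Average BG per group (e.g., per day of week or per meal)."""
    groups = defaultdict(list)
    for r in readings:
        groups[r[key]].append(r["bg_mgdl"])
    return {group: round(mean(values)) for group, values in groups.items()}

print(summarize(readings, "day"))   # e.g., {'Mon': 146, 'Sat': 173}
print(summarize(readings, "meal"))  # e.g., {'breakfast': 192, 'dinner': 128}
```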
METHODS
Study Design
This study was an expert evaluation of top-rated diabetes apps on their compliance with heuristic design principles and usability satisfaction ratings. To assess app usability in the real world, we tested four diabetes apps commercially available to the public (MyNetDiary, MyNetDiary Inc., Marlton, NJ; Glucose Buddy, Azumio, Redwood City, CA; OnTrack Diabetes, Remedy Health, New York, NY; and mySugr, Roche Holdings AG, Encinitas, CA) selected from the “Best Diabetes Apps of 2016” listed in Healthline, a health forum.22 App selection criteria were based on the greatest number of available app functions in 2016 (Table 1). Three apps are available in both Android (Google, Mountain View, CA) and iOS (Apple, Cupertino, CA), but OnTrack Diabetes is available only in Android. To achieve even distribution among app testing platforms, MyNetDiary and Glucose Buddy were tested on an iPhone 6 (Apple, Cupertino, CA) and OnTrack Diabetes and mySugr on a Samsung Galaxy S5 (Samsung, Seoul, Korea). The University of Minnesota Institutional Review Board reviewed this project and determined that no approval was required because the study involved no human subject research.
Table 1.
Characteristics of Selected Apps
| App | OS Tested | Cost | Five-Star Rating^a | User Review | PII Login | CHO | ACT | Med | BG | BG Report: Day | BG Report: Meal | Email Report |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Glucose Buddy^b | iOS | $0 | 4.5 | 6512 | | ♦ | ♦ | ♦ | ♦ | | ♦ | ♦ |
| OnTrack | Android | $0 | 4.25 | 5997 | | ♦ | ♦ | ♦ | ♦ | | ♦ | ♦ |
| MyNetDiary^b | iOS | $8.99^c | 4.5 | 8 | ♦ | ♦ | ♦ | ♦ | ♦ | | ♦ | ♦ |
| mySugr^b | Android | $0 | 5 | 20 | ♦ | ♦ | ♦ | ♦ | ♦^d | ♦ | | ♦ |
Abbreviations: ACT, physical activity entry; App, application; CHO, carb entry; Day, day of the week; Med, medication entry; OS, Operating System; PII, personal identifiable information.
^a Rating and user review were obtained on June 10, 2016, from Google Play or iTunes, for the platform the app was tested on.
^b Available in both Android and iOS operating systems.
^c Monthly subscription for access to the food database.
^d Synced with the iHealth glucometer (iHealth Labs, Sunnyvale, CA) in 2016 and with Accu-Chek (Roche Diabetes Care, Basel, Switzerland) in 2017.
Heuristic evaluation tests individual app functions; experts detect a usability problem if the app fails to follow a set of heuristic or intuitive design principles and rate the severity of the violation.23 Four informatics and clinical experts were selected based on their clinical and methodological knowledge. Three to five single-domain experts can discover 74% to 87% of usability problems, whereas dual-domain experts can detect 81% to 90% of usability problems.23,24 We recruited two single-domain experts (one family physician and one diabetes nurse educator) and two dual-domain experts (two internists who are also informatics researchers).
Experts used a checklist of intuitive design principles modified for diabetes apps, originally adapted from Nielsen’s25 10 heuristics for evaluating healthy eating apps. Expert evaluators inspected the usability of seven app functions: (1) carb intake entry, (2) exercise activity entry, (3) insulin dose entry, (4) BG reading entry, (5) BG report by days of the week (important for competence, to help patients plan behavior change for each day), (6) BG report for each meal (important for autonomy, to help patients see which meal option to change), and (7) emailing a BG report (important for connection, to help patients share data and communicate with others about their diabetes).
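For illustration, a minimal sketch of how a single finding from such a review could be recorded; the field names and the example entry are assumptions rather than the study’s actual checklist format, and the severity coding follows the 1 (cosmetic) to 4 (catastrophic) scale used in Table 2.

```python
from dataclasses import dataclass

# Illustrative sketch only: one way to capture a single heuristic violation
# noted during expert review. Field names are assumptions, not the study's
# actual checklist layout.
@dataclass
class Violation:
    app: str
    function: str   # one of the 7 evaluated functions, e.g., "carb entry"
    principle: str  # one of Nielsen's 10 heuristics
    severity: int   # 1 = cosmetic ... 4 = catastrophic
    note: str       # evaluator's comment and suggested solution

example = Violation(
    app="OnTrack Diabetes",
    function="BG entry",
    principle="Error prevention",
    severity=4,  # severity here is illustrative, not the study's actual rating
    note="Accepts implausible values such as 1,000,000 mg/dL; add a range check.",
)
```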
Procedures
All evaluators received standardized training for heuristic evaluation and app testing procedures through a PowerPoint (Microsoft, Redmond, WA) presentation and a study manual in June 2016. All four apps (MyNetDiary and Glucose Buddy on iPhone; mySugr and OnTrack Diabetes on Samsung) were loaded with the same dummy data, which was created in consultation with a diabetes educator to simulate a 5-week clinical data set of carb intake, physical activity, medication use, and BG readings. Each evaluator had a random app testing order generated at Random.org (Randomness and Integrity Services Ltd., Dublin, Ireland). The evaluators spent about 20 minutes becoming familiar with each app. Next, they used the heuristic checklist to review for any violation and comment on a solution. In the last step, the evaluator completed a satisfaction questionnaire to rate the overall app usability. The average duration of app testing was 40 minutes, with a 45-minute break after two apps were tested. Within 2 weeks of completing the independent app evaluations (July 2016), all four evaluators and the first author met as a group to discuss and reach a consensus for the usability problem severity rating.
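A brief sketch of the randomization step, under the assumption that a simple random permutation per evaluator is an acceptable stand-in for the orders the study generated at Random.org; the evaluator labels are hypothetical.

```python
import random

# Sketch (assumption, not the study's actual procedure): generate an
# independent random app-testing order for each evaluator.
apps = ["Glucose Buddy", "MyNetDiary", "mySugr", "OnTrack Diabetes"]
evaluators = ["Expert 1", "Expert 2", "Expert 3", "Expert 4"]

orders = {evaluator: random.sample(apps, k=len(apps)) for evaluator in evaluators}
for evaluator, order in orders.items():
    print(evaluator, "->", order)
```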
Measurements
The heuristic checklist has 10 intuitive design principles, and the severity of each violation is rated as cosmetic, minor, major, or catastrophic (1–4). Overall app usability was rated with the System Usability Scale (SUS) to determine the user’s satisfaction with ease of use.26 The SUS is widely used for evaluating technology product usability and has good reliability (Cronbach’s α = .91) and good validity, with factor loadings of 0.3 and higher.27,28 The 10-item questionnaire is administered immediately after user interaction to rate satisfaction on a 5-point Likert-type scale from strongly disagree to strongly agree (0–4). Item responses are summed and multiplied by 2.5, yielding scores ranging from 0 to 100. System Usability Scale scores of 90 to 100 are excellent (“A” grade), 80 to 89 are good (“B” grade), and 70 to 79 are acceptable (“C” grade).29 Scores of 60 to 69 are high marginally acceptable (“D” grade), scores of 50 to 59 are low marginally acceptable (“F” grade), and scores less than 50 are unacceptable (“F” grade).29
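A minimal sketch of the SUS arithmetic and grade bands described above, assuming each item contribution has already been scored 0–4 as stated in the text; the function names are illustrative.

```python
# Sketch of the SUS scoring described above: sum the 10 item contributions
# (each 0-4) and scale to 0-100, then map to the cited grade bands.
def sus_score(item_scores):
    """Sum the 10 item contributions (0-4 each) and multiply by 2.5."""
    assert len(item_scores) == 10 and all(0 <= s <= 4 for s in item_scores)
    return sum(item_scores) * 2.5

def sus_grade(score):
    if score >= 90: return "A"   # excellent
    if score >= 80: return "B"   # good
    if score >= 70: return "C"   # acceptable
    if score >= 60: return "D"   # marginally acceptable
    return "F"                   # not acceptable

score = sus_score([3] * 10)
print(score, sus_grade(score))  # 75.0 C
```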
Data Analysis
Heuristic violations, usability problem severity ratings, and SUS scores were analyzed descriptively (means and SDs) in SAS version 9.4 (SAS Institute, Inc, Cary, NC). The heuristic violation rate was calculated by dividing the number of violations for one app function by the app’s total violations. The same approach was used to calculate the proportion of usability problems at each severity level. Usability problem descriptions were transcribed and cross-referenced with notes from field observations and group discussions. A summary of usability problems was compiled based on the types of app functions (data entry, report display, email report, and help function) and heuristic violations.
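The violation-rate calculation can be illustrated with a short sketch using the OnTrack Diabetes counts from Table 2; the dictionary layout is an assumption, and rounding may differ by a percentage point from the published values.

```python
# Sketch of the violation-rate calculation: each function's share of an
# app's total heuristic violations. Counts are the OnTrack Diabetes column
# of Table 2, used here as a worked example.
ontrack_violations = {
    "carb entry": 17,
    "activity entry": 13,
    "medication entry": 14,
    "BG entry": 8,
    "BG report for meals": 11,
    "email report": 14,
}

total = sum(ontrack_violations.values())  # 77
rates = {fn: round(100 * n / total) for fn, n in ontrack_violations.items()}
print(total, rates)  # e.g., carb entry -> 22%, email report -> 18%
```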
RESULTS
Overall Usability
All apps failed to reach an acceptable usability SUS rating of 70 or greater (Table 2): mySugr and MyNetDiary scored 15 and 41 (an “F” grade), and Glucose Buddy and OnTrack Diabetes scored 60 and 61 (a “D” grade). The average app usability evaluation time was 52 minutes. OnTrack Diabetes took the least amount of time (45 minutes) and had the highest overall satisfaction, followed by Glucose Buddy (46 minutes), MyNetDiary (58 minutes), and mySugr (59 minutes). MyNetDiary, Glucose Buddy, and OnTrack Diabetes lacked a BG analysis report sorted by days of the week, which is important to support patient competence in planning health behavior change, and mySugr lacked a BG analysis report sorted by meals, which is important to support patient autonomy in meal modification.
Table 2.
Usability Rating of Four Top-Rated Apps
Usability | OnTrack Diabetes | Glucose Buddy | MyNetDiary | mySugr |
---|---|---|---|---|
SUS (mean ± SD) | 61 ± 28 | 60 ± 19 | 41 ± 21 | 15 ± 5 |
Usability problem severity | ||||
Mean (maximum of 4) | 3.3 | 2.8 | 2.6 | 2.7 |
Catastrophic (n = 77) | 41 | 21 | 9 | 6 |
Major (n = 117) | 16 | 17 | 32 | 52 |
Minor (n = 108) | 18 | 33 | 26 | 31 |
Cosmetic (n = 12) | 2 | 3 | 6 | 1 |
Heuristic violation by function (count, %) | ||||
Total count (n = 314) | 77 | 74 | 73 | 90 |
Carb entry (n = 68) | 17 (22%) | 12 (16%) | 17 (24%) | 22 (25%) |
Activity entry (n = 65) | 13 (17%) | 17 (23%) | 15 (21%) | 20 (22%) |
Medication entry (n = 65) | 14 (18%) | 17 (23%) | 12 (16%) | 22 (25%) |
BG entry (n = 41) | 8 (11%) | 10 (14%) | 12 (16%) | 11 (12%) |
BG trends for days of the week (n = 13) | NA | NA | NA | 13 (14%) |
BG report for meals (n = 32) | 11 (14%) | 12 (16%) | 9 (12%) | NA |
Email report (n = 30) | 14 (18%) | 6 (8%) | 8 (11%) | 2 (2%) |
Heuristic violation per principle | ||||
Help and documentation (n = 50) | 16^b | 13^a | 4 | 12^b
Error prevention (n = 45) | 17^a | 11^b | 5 | 7
Aesthetic and minimalist design (n = 43) | 13^c | 8 | 5 | 15^a
Flexibility and efficiency of use (n = 37) | 2 | 10^c | 10^b | 12^b
Recognition rather than recall (n = 29) | 5 | 10^c | 7 | 5
Match between app and the real world (n = 26) | 5 | 5 | 9^c | 11^c
User control and freedom (n = 23) | 5 | 5 | 13^a | 8
Consistency and standards (n = 22) | 3 | 4 | 7 | 8
Feedback and visibility (n = 20) | 6 | 3 | 4 | 7
Help users recognize, diagnose, and recover from errors (n = 19) | 5 | 5 | 9 | 5
Abbreviation: NA, not available (lacking this function).
^a Ranked the highest heuristic violation within the app.
^b Ranked the second highest heuristic violation within the app.
^c Ranked the third highest heuristic violation within the app.
Heuristic Design Violations
A total of 314 heuristic violations or usability problems were identified (Table 2). More than half (62%) were very serious, with severity ratings of major (37%) or catastrophic (25%). OnTrack Diabetes had the highest SUS score (61) but ranked third in heuristic design compliance, with 77 violations. The least heuristic-compliant app was mySugr, which had both the greatest number of violations (n = 90) and the lowest SUS score (15). The heuristic principle violated most often was “Help and documentation,” followed by “Error prevention” and “Aesthetic and minimalist design.”
The four apps shared common major heuristic violations for “Help and documentation,” “Error prevention,” and “Aesthetic and minimalist design” (Table 3). According to the experts, self-help tips or resources were too long or not useful for end users who are adult patients with diabetes, and help functions were often not available. The apps had many error-prone conditions that allowed users to make mistakes easily and lacked an action confirmation step when a data entry was outside possible parameters. For example, OnTrack Diabetes allowed users to enter “1,000,000” mg/dL for a BG reading, and Glucose Buddy permitted entries such as “2” for a BG reading and “5000 units” for an insulin dose. The carb entry function was the most inefficient function: it had the highest number of heuristic violations, and experts found it to be the task most prone to errors. Only MyNetDiary had a food database, but it was not alphabetized, resulting in excessive scrolling and browsing. Another frequent violation involved the app screen, which was unattractive and simplistic in OnTrack Diabetes (users found it boring) or excessively busy in mySugr (users could not find icons). Small fonts and poor color contrast made it hard for users to see whether they were on the right screen.
Table 3.
Usability Problems Explained by Heuristic Violations
| App Functions | Usability Problem Examples | Heuristic Principles Violated |
|---|---|---|
| Entries for Carb, Activity, Med, and BG | Difficult to review completed entry | Feedback and visibility |
| | “Green apple” and “M” listed as meal | Match between app and the real world |
| | Difficult to edit entry and find back button | User control and freedom |
| | Allows any amount, eg, “1,000,000” | Error prevention |
| | “Check symbol” for SAVE is not standard | Consistency and standards |
| | Data entry requiring multiple steps | Recognition rather than recall |
| | Too many food choices, not in order | Flexibility and efficiency of use |
| | Busy screen with poor color contrast | Aesthetic and minimalist design |
| | No error message for wrongful entry | Help users recognize, diagnose, and recover from errors |
| BG reports for days and meals | Color for normal is no color | Consistency and standards |
| | Graph legend has too many colors | Aesthetic and minimalist design |
| | Report is hidden under “Diabetes Overview” | Flexibility and efficiency of use |
| | Meal report requires return to logbook | Recognition rather than recall |
| Email report | Hidden email tab in report or setting icon | Feedback and visibility |
| | Icon for previewing report is “export” | Match between app and the real world |
| | Cannot return to home page after email is sent | User control and freedom |
| | Export and email functions are mixed together | Consistency and standards |
| | Hard to find list to select report for email | Recognition rather than recall |
| Help function | Unavailable or not intuitive to use | Help and documentation |
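A hedged sketch of what the missing “error prevention” safeguards could look like for BG entry, using the implausible values cited above (1,000,000 and 2 mg/dL); the hard and soft limits are illustrative assumptions, not values from the study or clinical guidance.

```python
# Sketch of the "error prevention" idea: reject or ask the user to confirm
# BG entries outside plausible limits instead of silently accepting them.
PLAUSIBLE_BG_MGDL = (20, 600)   # hard limits: reject outside this range (assumed)
TYPICAL_BG_MGDL = (40, 400)     # soft limits: ask the user to confirm (assumed)

def check_bg_entry(value_mgdl):
    low, high = PLAUSIBLE_BG_MGDL
    if not (low <= value_mgdl <= high):
        return "reject", f"BG must be between {low} and {high} mg/dL."
    soft_low, soft_high = TYPICAL_BG_MGDL
    if not (soft_low <= value_mgdl <= soft_high):
        return "confirm", f"{value_mgdl} mg/dL is unusual. Save anyway?"
    return "accept", ""

print(check_bg_entry(1_000_000))  # ('reject', ...)
print(check_bg_entry(2))          # ('reject', ...)
print(check_bg_entry(350))        # ('confirm', ...)
```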
Expert evaluators suggested solutions to correct heuristic violations (Table 4), including the use of words familiar to patients such as “blood sugar,” allowing a “back” button, providing access to a food database, showing an error message when an entry is outside parameters, and using a pop-up reminder to confirm that an entry is complete. Blood glucose trend and pattern reports should not be hidden under a subheading; reports should be easily visible in larger fonts with good color contrast. The email function should be available on the home screen and should allow users to preview the report before sending. Help functions should include a quick search and a patient-friendly tutorial video.
Table 4.
App Design Recommendations by Experts Based on Nielsen’s Heuristics
| App Function | Future Design Recommendation | Nielsen’s Heuristics |
|---|---|---|
| Entries for Carb, Activity, Med, and BG | Feedback if the entry is complete or correct | Feedback and visibility |
| | Icon or wording that is intuitive to patients | Match between app and the real world |
| | Enable edit, offer “Back” button, give extensive food choices linked to an accurate food database | User control and freedom |
| | Set up data entry parameters, spell check, preset tabs for entry, and offer food database access | Error prevention |
| | Standard carb counting (gram or carb choice) | Consistency and standards |
| | Simplify data entry to 1–2 steps | Recognition rather than recall |
| | Link carb counting tool with food database | Flexibility and efficiency of use |
| | Simple interface and icons/graphics with good contrast and attractive style | Aesthetic and minimalist design |
| | Error message when data entry is outside parameters; pop-up reminder to confirm entry | Help users recognize, diagnose, and recover from errors |
| BG trends for days; BG trends for meals | Provide BG in-target-range and out-of-range frequency | Consistency and standards |
| | Stop light color zone for goal range; visible legend key for the days of the week and meals | Aesthetic and minimalist design |
| | Easy navigation with minimal required steps | Flexibility and efficiency of use |
| | Preview customizable reports before email | Recognition rather than recall |
| Email report | Email should be a main icon that is visible | Feedback and visibility |
| | Avoid jargon of data types “csv, html, xml” | Match between app and the real world |
| | Return to home screen after canceling email | User control and freedom |
| | Offer standardized list of data/reports to be shared | Consistency and standards |
| Help | Patient-friendly short tutorial and help function | Help and documentation |
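As a sketch of the “stop light color zone” and in-target-range frequency recommendations in Table 4: the 70 to 180 mg/dL target range and the specific color assignments are assumptions for illustration, not values specified by the study.

```python
# Sketch of the stop-light zone and in-range frequency recommendations.
TARGET_RANGE_MGDL = (70, 180)  # assumed target range for illustration

def color_zone(bg_mgdl, target=TARGET_RANGE_MGDL):
    low, high = target
    if bg_mgdl < low:
        return "red"      # below range
    if bg_mgdl > high:
        return "yellow"   # above range
    return "green"        # in range

def in_range_frequency(readings, target=TARGET_RANGE_MGDL):
    """Share of readings inside the target range, as a percentage."""
    in_range = sum(1 for bg in readings if target[0] <= bg <= target[1])
    return round(100 * in_range / len(readings)) if readings else 0

readings = [95, 182, 110, 201, 145, 62]
print([color_zone(bg) for bg in readings])
print(in_range_frequency(readings), "% in target range")  # 50 %
```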
DISCUSSION
After testing four top-rated diabetes apps guided by the SDT on motivation, clinician experts were concerned about poor usability. This overall finding is consistent with a previous study reporting that diverse patients lacked confidence, felt frustrated with app designs and navigation, and had less interest in using apps to support self-management.30 Two focus groups of older adult patients (n = 20) reported concerns about app usability problems and gave a low SUS score of 48.31 A crossover randomized trial of actual use of mySugr and OnTrack Diabetes (n = 92) reported SUS scores of 55 (an “F” grade, less than 60) and 68 (a “D” grade, between 60 and 69).32

Only mySugr displayed a BG report sorted by day of the week, which could help promote competence in diabetes care. Patients could benefit from seeing a BG report that shows on which days of the week they have out-of-range BG readings. mySugr displayed an average BG and the recorded BG values for each day but did not show trends, and the contents of this report were too small and cluttered, making it difficult to read. Although a BG analysis report for meals, which could help promote autonomy, was available in three apps (OnTrack Diabetes, Glucose Buddy, and MyNetDiary), the usability of this function was rated poor. Displayed reports in all the apps were hard to read: it was not clear what the color coding referred to or which color reflected abnormal BG readings. Other reports, such as carb intake, were not useful, because reporting 3 carbs for a meal is meaningful only if the report also shows whether carb intake was excessive and its effect on out-of-range BG levels.

Experts felt patients would have difficulty using the tested apps because usability ranged from marginally acceptable to completely unacceptable. This finding is consistent with another study of four commercial apps, including Diabetes Connect and InCheck, in which 10 patients performed 185 tasks and reported that most app functions were difficult to use.30 In other studies with patient raters, participants reported feeling frustrated when it took multiple steps to navigate between screens and complete tasks.10,11
Through heuristic evaluation, experts identified major usability problems. Without “help and documentation” for self-learning, it will be difficult for users to adapt to an app. A lack of “error prevention” in app design leads to frequent erroneous data entries that could produce faulty analysis reports. In this study, such error-prone conditions were twice as common in the data entry functions as in the functions for generating a BG analysis report or sharing a report through email. This finding is consistent with research that tested eight diabetes apps and reported significant problems with data entry functions.33 The carb entry function received the worst usability rating because it had the highest number of heuristic violations, consistent with prior studies that reported carb entry to be inefficient and to require too many keystrokes.33,34
Cluttered app screens violated the “aesthetic and minimalist design” heuristic, consistent with another study showing that diabetes apps do not display images clearly enough to facilitate learning.35 Blood glucose reports were in small fonts and cluttered; patients who tested a diabetes mHealth system reported the same problem, finding glucose data views difficult to understand.10 A useful diabetes app should create BG reports that facilitate learning (increasing diabetes numeracy) by displaying BG trends and patterns and clearly showing which BG readings are normal or abnormal. Blood glucose analysis reports were not readily available on the home screen and required multiple steps to find; OnTrack Diabetes required five clicks: report, logbook, action, view, and OK. This violates the “recognition rather than recall” heuristic, in that users should not have to remember much to navigate inside the app. Limited or no customizable font size or report enlargement violates “flexibility and efficiency of use,” because some users require larger fonts for reading.
Limitations of this study included testing each app on only one platform, experts who were not familiar with both operating systems, and use of a Wi-Fi connection only. Although three of the diabetes apps are available on both iOS and Android platforms, their app functions vary. For example, MyNetDiary has carb entry tabs on its iOS home screen, but they are hidden inside the “+” icon on its Android home screen. This may have biased ratings when an app was tested on the platform with which an expert was more familiar. However, new diabetes app users may have similar difficulties because of their lack of familiarity with the app rather than with the operating system. This study could be improved by testing each app on all available operating systems and using both Wi-Fi and cellular carrier networks to mimic the user environment. Cloud-based app data storage relies on Internet connection strength, and a lack of cellular network capacity may slow app operation and influence usability satisfaction.
The study findings are not applicable to apps that have automatic data upload. Automatic BG reading upload would eliminate manual data entry errors. Newer models of glucometer devices have Bluetooth features that can pair with a phone or use a separate device to link a phone and a glucometer (eg, Glooko [Mountain View, CA] is able to link a phone to 12 or more glucometers). During a pilot study, the Glooko sync device setup was very cumbersome, taking more than 30 minutes, and its sync features were not reliable after 3 months of use. Therefore, this study omitted apps with a glucometer sync feature.
This study has several important implications for future diabetes app design, development, and improvement. In particular, intuitive design must be used to maximize patient use. Blood glucose reports are important as a way to increase patient competence in diabetes care by identifying and presenting a patient’s personal BG trends. Patients can also benefit from seeing a BG report for all meals, which could promote autonomy and empower patients to make the best meal choices in line with physical activities and medication use. Following Nielsen’s25 10 heuristics principles could improve diabetes app design and promote better diabetes app use. Future app usability studies should include automatic data upload if it has adequate reliability.
CONCLUSIONS
The four top-rated diabetes apps evaluated by experts in this study have significant room to improve usability. Intuitive design that incorporates good tutorials, a robust help search function, elimination of error-prone conditions, and an aesthetic, minimalist design using best practices in graphics, color contrast, and report views will make apps patient friendly. App designers and developers should also incorporate health behavior theories to help motivate patients and provide education that links thinking about health behavior to taking action for better diabetes self-management. An app that simply provides data input and output functions is insufficient to engage patients.
Future research on patient adoption of an mHealth tool should include patient training needs, barriers during actual use, and facilitators of long-term use. Caregivers and clinicians play important roles in helping patients engage in positive health behaviors in self-management. Thus, input from caregivers and clinicians should be considered, such as (1) what caregivers perceive to be the pros and cons of app use at home and within their social environment; (2) what burden clinicians experience when accessing the app and using app data within a busy clinic workflow; and (3) what best practices, in terms of usability and interpretation of app data, clinicians can use to guide or update a patient’s diabetes treatment plan.
Acknowledgments
This study was supported by the Robert Wood Johnson Foundation Future of Nursing.
Footnotes
The authors have disclosed that they have no significant relationships with, or financial interest in, any commercial companies pertaining to this article.
References
- 1. Ozieh MN, Bishu KG, Dismuke CE, Egede LE. Trends in health care expenditure in U.S. adults with diabetes: 2002–2011. Diabetes Care. 2015;38(10): 1844–1851.
- 2. American Diabetes Association. Standards of medical care in diabetes 2016. Diabetes Care. 2016;39(January): S1–S109.
- 3. Heinrich E, Schaper N, de Vries N. Self-management interventions for type 2 diabetes: a systematic review. European Diabetes Nursing: The Official Journal of the Federation of European Nurses in Diabetes. 2010;7(2): 71–76.
- 4. Padhye NS, Wang J. Pattern of active and inactive sequences of diabetes self-monitoring in mobile phone and paper diary users. 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society. 2015;2015: 7630–7633.
- 5. Quinn CC, Shardell MD, Terrin ML, Barr EA, Ballew SH, Gruber-Baldini AL. Cluster-randomized trial of a mobile phone personalized behavioral intervention for blood glucose control. Diabetes Care. 2011;34(9): 1934–1942.
- 6. Waki K, Fujita H, Uchimura Y, et al. DialBetics: a novel smartphone-based self-management support system for type 2 diabetes patients. Journal of Diabetes Science and Technology. 2014;8(2): 209–215.
- 7. Orsama AL, Lähteenmäki J, Harno K, et al. Active assistance technology reduces glycosylated hemoglobin and weight in individuals with type 2 diabetes: results of a theory-based randomized trial. Diabetes Technology & Therapeutics. 2013;15(8): 662–669.
- 8. Research2Guidance. Diabetes app market report 2014 [online news release]. http://www.reportsnreports.com/reports/276919-diabetes-appmarket-report-2014.html. Published 2014. Accessed May 25, 2016.
- 9. Arnhold M, Quade M, Kirch W. Mobile applications for diabetics: a systematic review and expert-based usability evaluation considering the special requirements of diabetes patients age 50 years or older. Journal of Medical Internet Research. 2014;16(4).
- 10. Georgsson M, Staggers N. An evaluation of patients’ experienced usability of a diabetes mHealth system using a multi-method approach. Journal of Biomedical Informatics. 2016;59: 115–129.
- 11. Georgsson M, Staggers N. Quantifying usability: an evaluation of a diabetes mHealth system on effectiveness, efficiency, and satisfaction metrics with associated user characteristics. Journal of the American Medical Informatics Association. 2016;5–11.
- 12. Alanzi TM, Istepanian RSH, Philip N. Usability study of mobile social networking system among Saudi type 2 diabetes patients (SANAD). 2nd Middle East Conference on Biomedical Engineering. 2014;297–300.
- 13. Barnum CM. Usability Testing Essentials: Ready, Set…Test! Burlington, MA: Elsevier; 2011.
- 14. Fu H, McMahon SK, Gross CR, Adam TJ, Wyman JF. Usability and clinical efficacy of diabetes mobile applications for adults with type 2 diabetes: a systematic review. Diabetes Research and Clinical Practice. 2017;131: 70–81.
- 15. Veazie S, Winchell K, Gilbert J, et al. Rapid evidence review of mobile applications for self-management of diabetes. Journal of General Internal Medicine. 2018;33(7): 1167–1176.
- 16. Riley WT, Rivera DE, Atienza AA, Nilsen W, Allison SM, Mermelstein R. Health behavior models in the age of mobile interventions: are our theories up to the task? Translational Behavioral Medicine. 2011;1(1): 53–71.
- 17. Ryan RM, Deci EL. Self-determination Theory and the facilitation of intrinsic motivation, social development, and well-being. The American Psychologist. 2000;55(1): 68–78.
- 18. Williams GC, Freedman ZR, Deci EL. Supporting autonomy to motivate patients with diabetes for glucose control. Diabetes Care. 1998;21(10): 1644–1651.
- 19. Johnson VD. Promoting behavior change: making healthy choices in wellness and healing choices in illness-use of Self-determination Theory in nursing practice. The Nursing Clinics of North America. 2007;42(2): 229–241.
- 20. Ryan R, Patrick H, Deci E, Williams G. Facilitating health behaviour change and its maintenance: interventions based on Self-determination Theory. European Health Psychologist. 2008;10: 2–5.
- 21. Deci EL, Ryan RM. Intrinsic Motivation and Self-determination in Human Behavior. New York: Plenum Publishing Co; 1985.
- 22. Schaefer A. The Best Diabetes Apps of 2016. http://www.healthline.com/health/diabetes/top-iphone-android-apps#2. Published 2016. Accessed June 1, 2016.
- 23. Nielsen J. Finding usability problems through heuristic evaluation. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems—CHI ’92. 1992;373–380.
- 24. Molich R, Nielsen J. Improving a human-computer dialogue. Communications of the ACM. 1990;33(3): 338–348.
- 25. Watkins I, Kules B, Yuan X, Xie B. Heuristic evaluation of healthy eating apps for older adults. Journal of Consumer Health on the Internet. 2014;18(2): 105–127.
- 26. Brooke J. SUS: a quick and dirty usability scale. In: Jordan PW, Thomas B, Weerdmeester BA, McClelland IL, eds. Usability Evaluation in Industry. https://hell.meiert.org/core/pdf/sus.pdf. Published 1996. Accessed March 21, 2018.
- 27. Bangor A, Kortum PT, Miller JT. An empirical evaluation of the System Usability Scale. International Journal of Human-Computer Interaction. 2008;24(6): 574–594.
- 28. Lewis JR, Sauro J. The factor structure of the System Usability Scale. Lecture Notes in Computer Science. 2009;5619: 94–103.
- 29. Bangor A, Kortum P, Miller J. Determining what individual SUS scores mean: adding an adjective rating scale. Journal of Usability Studies. 2009;4(3): 114–123.
- 30. Sarkar U, Gourley GI, Lyles CR, et al. Usability of commercially available mobile applications for diverse patients. Journal of General Internal Medicine. 2016;1417–1426.
- 31. Ye Q, Boren SA, Khan U, Simoes E, Kim MS. Experience of diabetes self-management with mobile applications: a focus group study among older people with diabetes. European Journal for Person Centered Healthcare. 2018;6(2): 262–273.
- 32. Fu HN, Adam TJ, Konstan JA, Wolfson JA, Clancy TR, Wyman JF. Influence of patient characteristics and psychological needs on diabetes mobile app usability in adults with type 1 or type 2 diabetes: crossover randomized trial. JMIR Diabetes. 2019;4(2): e11462.
- 33. Martin C, Flood D, Sutton D, Aldea A, Harrison R, Waite M. A systematic evaluation of mobile applications for diabetes management. International Federation for Information Processing. 2011;466–469.
- 34. Garcia E, Martin C, Garcia A, Harrison R, Flood D. Systematic analysis of mobile diabetes management applications on different platforms. Symposium of the Workgroup Human-Computer Interaction and Usability, Conference Proceedings, Austrian Computer Society, Information Quality in e-Health. 2011;379–396.
- 35. Caburnay CA, Graff K, Harris JK, et al. Evaluating diabetes mobile applications for health literate designs and functionality. Preventing Chronic Disease. 2015;12: E61.