Abstract
Mobile applications that facilitate each stage of the advance care planning process (i.e., obtaining knowledge, contemplating options, and acting on decisions) may be one effective way to support patient-centered care and patient autonomy. The purpose of the current review was to identify and evaluate advance care planning mobile applications for patients. Our specific aim was to examine app features, design quality, content, and readability. We searched the Apple iOS and Google Play stores using keywords developed in conjunction with an academic librarian. Two coders with expertise in palliative care applied guidelines from a previous review and used a consensus coding procedure. We also calculated a Flesch Reading Ease score for each app. Nine apps met criteria and could be evaluated. Overall, apps are limited in features and poor in terms of design quality, layout, and functionality. Regarding content, most apps emphasize making decisions or taking action about advance care planning: 6 apps permit users to document a preferred decision maker, and 6 apps offer a mechanism to distribute and share advance care planning documentation. Three apps focus on knowledge about advance care planning, and only 4 support contemplation about advance care planning. Apps range in terms of readability, from very difficult to fairly easy. This review identifies limitations in features, design quality, and content of existing advance care planning mobile apps. We present recommendations based on the results of this review for the development of future advance care planning apps.
Keywords: advance care planning, behavior change, decision making, mHealth, mobile applications, knowledge
Background
Advance care planning (ACP) is a pillar of patient-centered care.1 However, it is a complex process of behavior change2,3 that involves a series of steps, including obtaining information, contemplating options, and discussing—then documenting—plans for future care.4,5 Interventions and programs facilitate steps in the ACP process, including contemplation (e.g., The Conversation Project6) and documentation of values and care preferences (e.g., Five Wishes7). Other evidence-based programs (e.g., PREPARE for Your Care8,9) address multiple components of the process by providing knowledge and permitting electronic documentation of care preferences that can be shared with others. However, many of these interventions and programs are limited to paper and pencil or online formats and have not utilized the mobile health (mHealth) delivery system.
mHealth applications (i.e., apps that are downloadable on phones and tablets) are an additional method to support patients with ACP. Transdisciplinary research indicates that mHealth is feasible and acceptable in a range of patient populations,10,11 and that patients appreciate the autonomy granted by mHealth resources.12 Transitioning opportunities for ACP to mobile devices may increase completion rates13 by removing perceived obstacles, such as feelings of discomfort14 and concerns about accessibility of decisions across health care organizations.15 Thus, mobile apps that support ACP may empower patient participation in care. However, in order to leverage mobile technology for ACP, it is imperative that the technology supports users through each step of the ACP behavior change process,4,16 promotes inclusivity across a wide range of patients and laypersons,17 and uniformly disseminates content that is consistent with the consensus definition of ACP.16
There is limited information regarding the quality and content of existing ACP mobile apps. Greenle and colleagues17 descriptively summarized the basic features and specifications of palliative care apps for use by patients. However, the findings from this review did not address issues of usability, design, or ACP content. Additionally, van der Smissen and colleagues18 reviewed web-based ACP interventions, but did not include mobile app-based interventions. Given the limitations of previous work and the burgeoning development of end-of-life related mHealth apps, the goal of this review was to evaluate mobile apps for ACP in terms of their features, design quality, ACP content, and level of accessibility.
Methods
Mobile Application Selection
We searched the Apple app store and Google Play store during April 2020 and identified 118 apps using the following keywords, which were developed in conjunction with an academic librarian: advance care, advance care planning, advance care plan, advance directive, health care proxy, living will, do not resuscitate/DNR, do not intubate/DNI, power of attorney/POA, and durable power of attorney/dPOA. Basic data about the apps—including their names and app store descriptions—were extracted first in order to determine whether they met eligibility criteria for the review. Importantly, we only included apps that were specifically created and designed to be used for ACP, or to “support adults at any age or stage of health in understanding their personal values, life goals, and preferences regarding future medical care.”1,16 In order to be included, apps had to be available in English, publicly available for download in the United States, designed for patient/layperson use, and not have an annual fee. Apps were excluded if they were not available in English, were designed for use by providers only, did not reference a component of the definition of ACP16 in their description, required an annual fee, were specific to a non-U.S. country, or were a legal app not specific to ACP (Figure 1). For apps that met all eligibility criteria (n = 10), we extracted additional data, including year of release/update, cost, number of downloads, star rating, and developer information (Table 1). Any mobile app that also had an affiliated website was only evaluated based on content directly available on the app itself, not based on website content that was linked from the app. After beginning the coding process, it became evident that one app (Advance Directives) was not functional. We attempted to contact the developer 2 times but did not receive a reply. 
As a result, we include descriptive information about this app in Table 1 only, given that it could not be coded for content, design, or functionality.
Figure 1.
Flow diagram for app selection and screening procedure.
Table 1.
General App Specs and App Features.
Mobile App | ACP Tools | Advance Directives* | BIDMC Health Care | Contingency Plan—Personal | MedStar CR |
---|---|---|---|---|---|
General Specs | |||||
Interface | Both | iOS | Both | iOS | Both |
Release/Update | 2009 | 2015 | 2019 | 2017 | 2017 |
Star Rating | 5 | NER/NED | 5 | NER/NED | 4.8 |
Privacy | ✓ | 99 | ✓ | ||
Company | Nous Foundation Inc. | VJ Periyakoil (via Stanford Medicine) | Beth Israel Deaconess Medical Center, Inc. | Contingency Plan (Gabrielle McColl) | Medstar Health Inc |
Free to Download | ✓ | ✓ | ✓ | ✓ | ✓ |
Additional Costs | 99 | ✓ | |||
Website | ✓ | 99 | ✓ | ||
Features | |||||
Personal information NOT required | ✓ | – | ✓ | ||
Personalizes documents | – | ✓ | ✓ | ✓ | |
Tracks/reports user progress | – | ✓ | ✓ | ||
Contains links to external webpages | a | – | ✓ | ✓ | |
Provides feedback | – | ||||
Contains videos | ✓ | – | |||
Contains graphics | ✓ | – | |||
Mobile App | My Dot Mediq | My Directives | My Health Proxy | My Living Will | PaperHealth |
General Specs | |||||
Interface | iOS | iOS | Android | Android | iOS |
Release/Update | 2018 | 2019 | 2013 | 2014 | 2015 |
Star Rating | NER/NED | 4.7 | NER/NED | 2 | 5 |
Privacy | ✓ | ✓ | ✓ | ✓ | |
Company | My Med Link | MyDirectives.com | Plead Fast, LLC | Plead Fast, LLC | William Palin |
Free to Download | ✓ | ✓ | $4.99 | $4.99 | ✓ |
Additional Costs | |||||
Website | ✓ | ||||
Features | |||||
Personal information NOT required | ✓ | ✓ | ✓ | ||
Personalizes documents | ✓ | ✓ | ✓ | ✓ | ✓ |
Tracks/reports user progress | |||||
Contains links to external webpages | ✓ | ✓ | |||
Provides feedback | |||||
Contains videos | |||||
Contains graphics |
NER/NED = not enough ratings or not enough downloads.
a = downloads of videos redirect to an outside link on Android.
99 = unable to determine.
* = App features could not be rated because the app was not functional at the time of the review.
Mobile Application Coding
The coding procedure was a modified version of the coding guidelines used by van der Smissen and colleagues18 in their review of ACP websites. Full descriptions of each item rated in the review are available in the Appendix.
Application features.
We defined features as app attributes that support user experience or content. Coders rated features related to user experience (e.g., Is the app accessible without registering/entering personal information? Does the app display information about progress?) and app content (e.g., presence of videos and graphics) using items adapted from the CONSORT-EHEALTH Group Checklist.19 Features were rated as absent or present.
Application quality.
Quality was defined in terms of user friendliness and aesthetic design. We measured quality using items from the Mobile App Rating Scale20 (i.e., MARS) functionality (4 items) and aesthetics (3 items) subscales. The MARS is designed to evaluate mHealth apps and includes functionality items such as, “How accurately/fast do the app functions and components work?” and aesthetics items such as, “How high is the quality/resolution of graphics used for buttons/icons/menus/content?” We selected items from the functionality and aesthetics subscales only, because the engagement subscale of the MARS was not appropriate for the subject matter (e.g., “Is the app entertaining to use?”) and items from the information subscale were not content-specific (see following section). Each quality item was rated on a 5-point scale from 1 (inadequate) to 5 (excellent), with detailed descriptions anchoring each point of the scale for each specific item.
Application content.
We operationalized ACP content as app information that provides education about or scaffolds the ACP process. We adapted 20 items from the ACP Engagement Survey21 to evaluate content. The original, 84-item survey comprises a Process Measures section and an Action Measures section. For the current review, 14 items from the knowledge and contemplation subscales of the Process Measures section were used to measure whether each app “provided knowledge about” or “encouraged contemplation about” various ACP topics (e.g., who should/can be a medical decision maker). These items are usually rated on a 5-point Likert-type scale (e.g., “How well informed are you about who can be a medical decision maker?”) but were rephrased into a yes/no format (i.e., “Does the app provide information about who can be a medical decision maker?”) for the purpose of the app rating. We did not include items from the self-efficacy and readiness Process Measures subscales because those items reflect intraindividual processes (i.e., “feeling confident” and “feeling ready”) that cannot be objectively evaluated in a yes/no format via information on an app. We selected 6 additional items from the Action Measures section to rate each app in terms of its support of ACP decisions (e.g., choosing a medical decision maker, indicating what conditions make life not worth living). These items are already rated using a yes/no format, so they were only rephrased to correspond to the app rating procedure (e.g., “Does the app allow the user to choose and document a medical decision maker?”).
Application readability.
We evaluated average readability using the Flesch Reading Ease score22,23 for all apps with content native to the app. Scores are calculated from average sentence length and the average number of syllables per word. Scores range from 0 to 100, with higher numbers indicating greater ease of readability. Scores under 50 are very difficult or difficult to read (college level or higher), 50-60 are fairly difficult to read, 60-70 are easily understood by early high school age students, 70-90 are fairly easy to read, and 90-100 are very easy to read. We manually typed a sample of approximately 150 words of text from each app with text native to the app (i.e., that was not simply accessible via link or part of a fillable form embedded in the app) into an online readability calculator (https://readabilityformulas.com/free-readability-formula-tests.php), which applies the Flesch Reading Ease formula to produce a score.
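For reference, the Flesch Reading Ease computation described above can be sketched in a few lines of Python. This is a minimal illustration of the standard Flesch (1948) formula with a simple vowel-group syllable heuristic; the online calculator used in this review may count syllables somewhat differently, so exact scores can diverge by a few points, and the sample text below is a hypothetical ACP-style passage, not text drawn from any of the reviewed apps.

```python
import re

def count_syllables(word: str) -> int:
    # Rough heuristic: count groups of consecutive vowels,
    # subtracting one for a common silent trailing "e".
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and not word.endswith(("le", "ee")) and n > 1:
        n -= 1
    return max(n, 1)

def flesch_reading_ease(text: str) -> float:
    # Flesch (1948): 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

# Hypothetical ACP-style sample text for illustration only.
sample = ("You can choose a person to make medical decisions for you. "
          "This person is called a health care proxy.")
score = flesch_reading_ease(sample)
print(round(score, 1))  # short words and sentences land in the 70-90 band
```

As the example suggests, short sentences of mostly one- and two-syllable words score in the “fairly easy to read” (70-90) range, which is the level recommended for patient-facing materials.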
Consensus Coding Procedure
Two masters- or PhD-level coders (authors MM and EK) with expertise in palliative care and ACP developed the coding guidelines over a series of several meetings. Next, the 2 coders independently rated 2 apps using the guidelines, then met in person to identify and discuss discrepancies. During the meeting, coders compared ratings item by item, presenting rationale and evidence for each rating. At this time, guidelines for each coding element were defined in order to be applied moving forward. A consensus was reached for all ratings on the first 2 apps. The coders then rated another app independently, following the same consensus coding procedure to discuss discrepancies and present rationale for ratings. Because no definitions were adjusted after the review of the third app, the coders reviewed the remaining apps in groups of 2, then met to resolve discrepancies and establish a consensus rating for each item until all apps had been consensus coded by both raters. Apps that were available on both the Apple iOS and Google Play stores were coded in both formats when significant discrepancies in app formatting were discovered.
Results
Application Features and Quality
A complete summary of results is available in Tables 1 to 3. Of the 9 apps that could be coded in full, 7 were free to download and use (Table 1). Generally, apps were limited in features: for example, only one app contained videos and graphics. Five apps contained links to external web pages with related content, though one of these links did not load, and one led to a site that was no longer functional. Eight apps attempted to offer some degree of personalization of documents (e.g., adding a name to a DNR order), but one app did not accurately auto-populate user information into the personalized document (Table 1).
Table 3.
Advance Care Planning Content.
Mobile App | ACP Tools | BIDMC Health Care | Contingency Plan—Personal | MedStar CR | My Dot Mediq | My Directives | My Health Proxy | My Living Will | PaperHealth |
---|---|---|---|---|---|---|---|---|---|
Knowledge | |||||||||
Describes who can be a medical decision maker | |||||||||
Describes qualities associated with being a good medical decision maker | ✓ | ||||||||
Describes types of decisions medical decision maker could make | ✓ | ✓ | ✓ | ||||||
Defines meaning of “flexibility” for medical decision makers | |||||||||
Provides information about what type of flexibility can be afforded to medical decision makers | |||||||||
Provides information about questions one can ask provider about decision-making | ✓ | ||||||||
Contemplation | |||||||||
Encourages user to think about medical decision maker | ✓ | ✓ | |||||||
Encourages user to ask someone to be medical decision maker | ✓ | ✓ | |||||||
Encourages user to think about whether certain health situations make life worth living | ✓ | ✓ | ✓ | ||||||
Encourages user to think about preferred care during sickness or near end of life | ✓ | ✓ | |||||||
Encourages user to think about how much flexibility they would give to medical decision maker | |||||||||
Encourages user to talk with medical decision maker about advance care planninga | ✓ | ✓ | |||||||
Encourages user to talk with medical provider about advance care planninga | ✓ | ||||||||
Encourages user to talk with family and friends about advance care planninga | ✓ | ✓ | |||||||
Decision/Action | |||||||||
User can document preferred medical decision maker | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | |||
Recommends distribution and sharing of advance care planning documentation | ✓ | ✓ | ✓ | ✓ | |||||
Provides mechanism for distribution and sharing of advance care planning documentation | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | 99 | ||
Allows user to decide and document health situations that make life not worth living | ✓ | ✓ | ✓ | ||||||
App permits user to decide and document preferences for medical care in sickness or near the end of life | ✓ | ✓ | ✓ | ✓ | |||||
App permits user to decide and document how much flexibility they would want to give to medical decision maker | ✓ | ✓ | ✓ |
99 = unable to determine.
a = Derived from Item 11 from the ACP Engagement Survey.21
Ratings of quality ranged both between and within apps. In terms of functionality, approximately half of the apps scored a 2 on the performance item; however, more than half earned a 4 or 5 for ease of use, navigation, and gestural design (Table 2). There was also within-app variability in functionality, as demonstrated by ACP Tools, which differed significantly across the iOS and Android (note 1) platforms, and by My Directives and My Health Proxy, which both received the highest rating for ease of use but scored below average in terms of overall performance. For example, while the videos on the ACP Tools app were of high quality in content and production, they were very difficult to access within the app due to poor user design and app functionality. Specifically, the videos were not native to the app and required several additional steps to download, find, and play on a mobile device. As another example, My Health Proxy appeared to have high functionality, with easy fill-in-the-blank screens enabling users to specify the name of a designated health care proxy, but information did not save and output consistently, and there was limited ability to navigate within the app to correct or change the personalized data a user enters.
Table 2.
App Quality and Readability.
Mobile App | ACP Tools (iOS) | ACP Tools, (Android) | BIDMC Health Care | Contingency Plan—Personal | MedStar CR | My Dot Mediq | My Directives | My Health Proxy | My Living Will | Paper Health |
---|---|---|---|---|---|---|---|---|---|---|
Functionality | ||||||||||
Performance | 4 | 2 | 2 | 5 | 2 | 4 | 2 | 2 | 4 | 2 |
Ease of use | 4 | 1 | 5 | 4 | 3 | 3 | 5 | 5 | 5 | 2 |
Navigation | 4 | 1 | 5 | 5 | 3 | 4 | 5 | 5 | 4 | 2 |
Gestural design | 4 | 2 | 5 | 4 | 4 | 4 | 5 | 5 | 4 | 2 |
Aesthetics | ||||||||||
Layout | 3 | 1 | 5 | 4 | 2 | 4 | 4 | 4 | 4 | 2 |
Graphics | 3 | 1 | 4 | 4 | 1 | 3 | 4 | 4 | 3 | 2 |
Visual appeal | 3 | 1 | 3 | 4 | 2 | 2 | 3 | 3 | 3 | 2 |
Readability | ||||||||||
Flesch Reading Ease score | 75a | 75a | 68 | 55 | b | b | b | 56 | 60 | 48 |
In terms of aesthetics, layout was generally in the average to good (i.e., 3-4) range, with 5 apps earning ratings of 4. Most apps scored in the poor to average range for visual appeal, with 8 apps earning ratings between 1 and 3. Use and diversity of graphics across apps were mixed. Overall, most apps worked “well enough” but demonstrated little aesthetic appeal based on the MARS subscale criteria.
Application Content
App content in the domain of knowledge was largely absent. Six apps contained no knowledge-based content about ACP. Among the 3 apps with knowledge content, only one (ACP Tools) contained more than one key piece of information (Table 3). App content regarding contemplation about ACP was also sparse, with only 4 apps containing any contemplation-related content. The most common contemplation content encouraged users to think about whether certain health situations make life worth living. No apps encouraged users to think about the flexibility they might afford to a medical decision maker (Table 3). Apps contained the most content surrounding ACP decisions and actions: 6 apps permitted users to document a preferred medical decision maker, and 6 apps offered a mechanism for the distribution and sharing of ACP documentation. All 9 apps that could be evaluated contained some content related to making decisions or taking action (Table 3).
Application Readability
Due to limited native text on several apps, we were only able to calculate reading ease scores for 6 apps. Overall, readability ranged widely, with one app scoring in the “very difficult to read” range, 2 in the “fairly difficult” range, 2 at the high school reading level, and one in the “fairly easy to read” range. We also identified errors in key ACP terminology on more than one app (e.g., “advanced care planning” rather than “advance care planning”; Table 2).
Discussion
The current review examined features, quality, content, and readability of mobile apps for ACP and exposes several problems with apps currently available in the marketplace. Importantly, ACP apps are limited in both quality and scope and cater to individuals ready to complete advance directives rather than to those hoping to learn about the ACP process. Apps provide limited information about key components of ACP (e.g., understanding the role of the medical decision maker, flexibility that can be granted to the medical decision maker) and do not emphasize contemplation of ACP topics (e.g., deciding what conditions would make life not worth living).
Contrary to the field’s understanding of ACP as a process of behavior change,16 most apps in our review focus primarily on documenting a health care decision maker or documenting basic preferences for care (i.e., intubation, resuscitation). While these parts of the ACP process are essential—and possibly prioritized, because they result in a clear deliverable that supports future care decisions and preferences—they are generally considered to be successive to other steps involving communication and contemplation.4 Mobile app developers may have focused on this stage of the ACP process purposefully, expecting that conversations and contemplation about such important decisions happen “off” the app, in doctors’ offices or around dinner tables.24 Nevertheless, these assumptions suppose that users have had the opportunity to complete previous stages of the process offline, which is not always the case due to common barriers to ACP.25,26 More importantly, the incomplete range of ACP topics covered on mobile apps does a disservice to individuals in earlier stages of behavior change, who may look to these apps simply to learn more about ACP. Instead, across most apps, users are expected to jump directly into decision-making processes without necessarily understanding why these decisions matter, how much thought should precede them, or the significance associated with making such decisions.
Furthermore, given the range of available guidelines for mHealth technology development, the design of the apps included in our review was unexpectedly poor. However, our findings are consistent with other mHealth research27 suggesting that app design quality and content are not perfectly correlated, and they therefore raise important questions about barriers (e.g., cost28) that may preclude apps from performing well in both domains. Among the problems we encountered were broken links, typos, fillable forms that did not fill correctly, inconsistent swipe functions across pages, and non-zoomable screens. Moreover, there was a general imbalance between quality and content: apps with high-quality content (e.g., ACP Tools) had severe design problems, and apps with excellent design features (e.g., Contingency Plan) were lacking in content. In ACP apps, content is useless if the user interface renders that content difficult to access, and aesthetics are negligible if content is absent. Given that no single app rated highly in both design and content, we cannot recommend any of the apps evaluated in the current review for use.
Our review also exposed that several ACP mobile apps are not truly mobile apps, but instead, platforms with links to preexisting ACP websites. Rather than contain elements of ACP natively in the app, much of the content was accessible only on the associated website. This raises usability concerns, as apps that link to websites may not load when the user is not connected to the internet. Additionally, accessing a website from a mobile phone often results in a user experience that may not be optimized, as the screen is smaller, formatting is distorted, and features may not load properly. Given that older adults especially report greater variability in comfort with technology,29,30 usability must be prioritized in the development of future ACP apps. Furthermore, special consideration should be given at the inception of mobile app development to whether the ACP content of interest ought to be available as an app or, instead, if it is better suited for a website-only format. Despite the limitations associated with website-only programs (e.g., cannot be used offline), they may offer greater accessibility to certain target audiences and more financial flexibility to program developers.
Conclusions and Recommendations
Currently available ACP mobile apps have incomplete ACP content and lack essential design features that promote user friendliness. However, mobile apps are an important resource to enhance the patient experience and promote advance directive completion among laypeople. As a result, there is a strong need to develop rigorous, soundly designed apps grounded in the ACP research evidence base. The results of this review offer several recommendations for app developers when creating and launching ACP mobile apps (Table 4). Most importantly, all ACP mobile apps should provide (1) basic information about what ACP is, including that it is a process of behavior change that can be broken down into a series of steps, and (2) a description of what part of the ACP process the app is intended to support (e.g., providing information, supporting decision making). We also suggest that these guidelines be referenced in revisions of current ACP mobile apps—especially in the domains of quality and content—and that developers offer clear disclaimers about apps that are only intended to support a single part of the ACP process (e.g., making a decision without suggesting contemplation) so that users can decide up front whether the app will be useful based on their stage of the ACP process.
Table 4.
Guidelines for Mobile Apps for Advance Care Planning.
Domain | Key to include in mobile app |
---|---|
Features | • Provide a complete description of app “goals” for user (e.g., produce a deliverable, obtain knowledge) • Graphics that are easy to see, sensible for advance care planning content • Consistency in swipes/taps across pages • Consistency in video links across app versions (i.e., iOS, Android) • Saves user information that is entered during the first time the app is used • All content is available on the app (app limits links to websites) |
Quality | • Zoomable screen • Large font • Consistency of buttons and swipes throughout app |
Content | Knowledge • Provide user with information about app content and capabilities • Provides information about range of individuals who can be a health care proxy (e.g., partner, adult child, close friend) • Provides reasons why a person might be chosen to be a health care proxy (e.g., understands person’s values, wishes) • Definitions of common medical orders (i.e., DNR, DNI, POLST) and care options (i.e., palliative care, hospice care) for individuals with serious illness Contemplation • Encourages thought about choosing a health care proxy • Encourages thought about medical care values and preferences Decision/Action • Permits user to document a decision-maker • Permits user to share documented decisions with care partners, health care providers, and legal representatives |
Accessibility | • Eighth grade reading level or lower • Define key terms that may be outside the range of eighth grade reading level • Provide culturally appropriate synonyms for terms that may not otherwise be understood by certain diverse audiences • Offer text-to-speech option • Limit user cost to download and to store or share app information |
Acknowledgments
The authors would like to thank Tina Lu for her support with manuscript formatting.
Funding
The authors received no financial support for the research, authorship, and/or publication of this article.
Appendix
App Features.
Item | Criterion | Code |
---|---|---|
Personal Info | App is accessible without registering/entering personal information | 0 = no; 1 = yes; 99 = cannot find/unsure |
Personalization of Documents | App can receive input to do things like answer questions or personalize documents | 0 = no, user cannot enter information into the app at all (which would mean that it cannot be personalized), OR information entered by the user is not auto-populated into any documents/saved on the app; 1 = yes, user can enter info into the app and that info is used to auto-populate documents/saved on the app; 99 = cannot find/unsure |
Tracks and reports user progress | App displays information about progress | 0 = no, does not tell user how much work is remaining or how much they have completed; 1 = yes, indicates to user how much work remains and how much they have already completed given what is available on the app; 99 = cannot find/unsure |
Contains links to external webpages | App contains hyperlinks to external webpages | 0 = no, no links to ANY external webpages or websites; 1 = yes, includes links to the website that is associated with the app itself (e.g., MyDirectives links to its site so that users can fill out forms that feed back into the app); 99 = cannot find/unsure |
Contains videos | App contains video content | 0 = no; 1 = yes; 99 = cannot find/unsure |
Contains graphics | App contains graphics/infographics | 0 = no; 1 = yes; 99 = cannot find/unsure |
App Quality.
Mars Subscale | Criterion | Code |
---|---|---|
Functionality | Performance: How accurately/fast do the app features (functions) and components (buttons/menus) work? | 1 = App is broken; no/insufficient/inaccurate response (e.g., crashes/bugs/broken features). 2 = Some functions work, but lagging or contains major technical problems. 3 = App works overall; some technical problems need fixing/slow at times. 4 = Mostly functional with minor/negligible problems. 5 = Perfect/timely response; no technical bugs found/contains a “loading time left” indicator |
Functionality | Ease of use: How easy is it to learn how to use the app; how clear are the menu labels/icons and instructions? | 1 = No/limited instructions; menu labels/icons are confusing; complicated. 2 = Usable after a lot of time/effort. 3 = Usable after some time/effort. 4 = Easy to learn how to use the app (or has clear instructions). 5 = Able to use app immediately; intuitive; simple |
Functionality | Navigation: Is moving between screens logical/accurate/appropriate/uninterrupted; are all necessary screen links present? | 1 = Different sections within the app seem logically disconnected and random/confusing; navigation is difficult. 2 = Usable after a lot of time/effort. 3 = Usable after some time/effort. 4 = Easy to use, or missing a negligible link. 5 = Perfectly logical, easy, clear, and intuitive screen flow throughout, or offers shortcuts |
Functionality | Gestural design: Are interactions (taps/swipes/pinches/scrolls) consistent and intuitive across all components/screens? | 1 = Completely inconsistent/confusing. 2 = Often inconsistent/confusing. 3 = OK with some inconsistencies/confusing elements. 4 = Mostly consistent/intuitive with negligible problems. 5 = Perfectly consistent and intuitive |
Aesthetics | Layout: Is arrangement and size of buttons/icons/menus/content on the screen appropriate, or zoomable if needed? | 1 = Very bad design; cluttered; some options impossible to select/locate/see/read; device display not optimized. 2 = Bad design; random; unclear; some options difficult to select/locate/see/read. 3 = Satisfactory; few problems with selecting/locating/seeing/reading items, or minor screen-size problems. 4 = Mostly clear; able to select/locate/see/read items. 5 = Professional, simple, clear, orderly, logically organized; device display optimized; every design component has a purpose |
Aesthetics | Graphics: How high is the quality/resolution of graphics used for buttons/icons/menus/content? | 1 = Graphics appear amateur; very poor visual design—disproportionate, completely stylistically inconsistent. 2 = Low-quality/low-resolution graphics; low-quality visual design—disproportionate, stylistically inconsistent. 3 = Moderate-quality graphics and visual design (generally consistent in style). 4 = High-quality/resolution graphics and visual design—mostly proportionate, stylistically consistent. 5 = Very high-quality/resolution graphics and visual design—proportionate, stylistically consistent throughout |
Aesthetics | Visual appeal: How good does the app look? | 1 = No visual appeal; unpleasant to look at; poorly designed; clashing/mismatched colors. 2 = Little visual appeal—poorly designed, bad use of color, visually boring. 3 = Some visual appeal—average; neither pleasant nor unpleasant. 4 = High level of visual appeal—seamless graphics; consistent and professionally designed. 5 = As above, plus very attractive, memorable, stands out; use of color enhances app features/menus |
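Following the MARS convention (Stoyanov et al.20), a subscale score is the mean of its item ratings. The sketch below illustrates this scoring step; the ratings shown are hypothetical, not results from the review:

```python
from statistics import mean

# Hypothetical MARS item ratings (1-5) for a single app, keyed by
# subscale and item name from the table above. Values are illustrative only.
ratings = {
    "Functionality": {
        "Performance": 3,
        "Ease of use": 4,
        "Navigation": 3,
        "Gestural design": 4,
    },
    "Aesthetics": {
        "Layout": 2,
        "Graphics": 3,
        "Visual appeal": 3,
    },
}

# MARS subscale score = mean of the subscale's item ratings.
subscale_scores = {name: mean(items.values()) for name, items in ratings.items()}
# e.g., subscale_scores["Functionality"] == 3.5
```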
App Content.
Measure/Subscale | Item (item number21) | Criterion | Code |
---|---|---|---|
Process/Knowledge | Describes who can be a medical decision maker (1) | App provides information about who can be a medical decision maker | 0 = no; 1 = yes; 99 = cannot find/unsure |
Process/Knowledge | Describes qualities associated with being a good medical decision maker (2) | App provides information about qualities associated with being a good medical decision maker | 0 = no; 1 = yes; 99 = cannot find/unsure |
Process/Knowledge | Describes types of decisions medical decision maker could make (3) | App provides information about what types of decisions medical decision makers could be responsible for in the future | 0 = no; 1 = yes; 99 = cannot find/unsure |
Process/Knowledge | Defines meaning of “flexibility” for medical decision makers (55) | App defines the meaning of “flexibility” for medical decision makers | 0 = no; 1 = yes; 99 = cannot find/unsure |
Process/Knowledge | Provides information about what type of flexibility can be afforded to medical decision makers (56) | App provides information about what “flexibility” medical decision makers can be afforded | 0 = no; 1 = yes; 99 = cannot find/unsure |
Process/Knowledge | Provides information about questions one can ask provider about decision making (74) | App provides user with information about questions they can ask their medical provider to help with medical decision making | 0 = no; 1 = yes; 99 = cannot find/unsure |
Process/Contemplation | Encourages user to think about medical decision maker (4) | App encourages user to think about their medical decision maker | 0 = no; 1 = yes; 99 = cannot find/unsure |
Process/Contemplation | Encourages user to ask someone to be medical decision maker (5) | App encourages user to ask someone to be their medical decision maker | 0 = no; 1 = yes; 99 = cannot find/unsure |
Process/Contemplation | Encourages user to think about whether certain health situations make life not worth living (21) | App encourages user to think about whether certain health situations make life not worth living | 0 = no; 1 = yes; 99 = cannot find/unsure |
Process/Contemplation | Encourages user to think about preferred care during sickness or near end of life (38) | App encourages user to think about preferred care during sickness or near end of life | 0 = no; 1 = yes; 99 = cannot find/unsure |
Process/Contemplation | Encourages user to think about how much flexibility they would give to medical decision maker (57) | App encourages user to think about how much flexibility they would give to their medical decision maker | 0 = no; 1 = yes; 99 = cannot find/unsure |
Process/Contemplation | Encourages user to talk with medical decision maker about advance care planning (22) | App encourages user to talk with their medical decision maker about advance care planning | 0 = no; 1 = yes; 99 = cannot find/unsure |
Process/Contemplation | Encourages user to talk with medical provider about advance care planning (23) | App encourages user to talk with their medical provider about advance care planning | 0 = no; 1 = yes; 99 = cannot find/unsure |
Process/Contemplation | Encourages user to talk with family and friends about advance care planning (24) | App encourages user to talk with other family and friends about advance care planning | 0 = no; 1 = yes; 99 = cannot find/unsure |
Action/Decision | User can document preferred medical decision maker (11) | App permits user to document preferred medical decision maker | 0 = no; 1 = yes; 99 = cannot find/unsure |
Action/Decision | Recommends distribution and sharing of advance care planning documentation (11) | App recommends distribution/sharing of advance care planning documentation | 0 = no; 1 = yes; 99 = cannot find/unsure |
Action/Decision | Provides mechanism for distribution and sharing of advance care planning documentation (11) | App provides mechanism for distribution/sharing of advance care planning documentation | 0 = no; 1 = yes; 99 = cannot find/unsure |
Action/Decision | User can decide and document health situations that make life not worth living (36) | App permits user to decide and document which health situations would make life not worth living | 0 = no; 1 = yes; 99 = cannot find/unsure |
Action/Decision | User can decide and document preferences for medical care in sickness or near the end of life (53) | App permits user to decide and document preferences for medical care in sickness or near the end of life | 0 = no; 1 = yes; 99 = cannot find/unsure |
Action/Decision | User can decide and document how much flexibility they would give to medical decision maker (64) | App permits user to decide and document how much flexibility they would want to give to their medical decision maker if that person has to make medical decisions for the user | 0 = no; 1 = yes; 99 = cannot find/unsure |
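The consensus coding procedure for these content items can be sketched as follows: two coders independently assign 0 (no), 1 (yes), or 99 (cannot find/unsure) to each item, and mismatches are flagged for discussion until agreement is reached. The item keys and codes below are hypothetical shorthand, not data from the review:

```python
# Independent codes from two coders for a single app (illustrative values).
coder_a = {"document_decision_maker": 1, "share_documentation": 1, "define_flexibility": 0}
coder_b = {"document_decision_maker": 1, "share_documentation": 99, "define_flexibility": 0}

# Items where the coders disagree are referred to consensus discussion.
disagreements = [item for item in coder_a if coder_a[item] != coder_b[item]]
# disagreements == ["share_documentation"]
```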
Footnotes
Declaration of Conflicting Interests
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
With the exception of ACP Tools, the user interface was similar across the iOS and Android versions of each app. Because the user interface and user experience of ACP Tools differed so substantially between platforms for the coders, we provide 2 sets of ratings for that app.
References
- 1.Institute of Medicine. Dying in America: Improving Quality and Honoring Individual Preferences Near the End of Life. Washington, DC: The National Academies Press; 2015.
- 2.Ernecoff NC, Keane CR, Albert SM. Health behavior change in advance care planning: an agent-based model. BMC Public Health. 2016;16(1):193.
- 3.Prochaska JO, DiClemente CC. Transtheoretical therapy: toward a more integrative model of change. Psychotherapy (Chic). 1982;19(3):276–288.
- 4.Fried TR, Bullock K, Iannone L, O’Leary JR. Understanding advance care planning as a process of health behavior change. J Am Geriatr Soc. 2009;57(9):1547–1555.
- 5.Sudore RL, Heyland DK, Barnes DE, et al. Measuring advance care planning: optimizing the advance care planning engagement survey. J Pain Symptom Manage. 2017;53(4):669–681.
- 6.Institute for Healthcare Improvement. Conversation Project website. Updated 2020. Accessed May 15, 2020. http://theconversationproject.org
- 7.Aging with Dignity. Five Wishes website. Updated 2020. Accessed May 15, 2020. https://fivewishes.org/Home
- 8.Sudore RL, Knight SJ, McMahan RD, et al. A novel website to prepare diverse older adults for decision making and advance care planning: a pilot study. J Pain Symptom Manage. 2014;47(4):674–686.
- 9.Sudore RL, Boscardin J, Feuz MA, McMahan RD, Katen MT, Barnes DE. Effect of the PREPARE website vs an easy-to-read advance directive on advance care planning documentation and engagement among veterans: a randomized clinical trial. JAMA Intern Med. 2017;177(8):1102–1109.
- 10.Loh KP, Ramsdale E, Culakova E, et al. Novel mHealth app to deliver geriatric assessment-driven interventions for older adults with cancer: pilot feasibility and usability study. JMIR Cancer. 2018;4(2):e10296.
- 11.Lyons EJ, Swartz MC, Lewis ZH, Martinez E, Jennings K. Feasibility and acceptability of a wearable technology physical activity intervention with telephone counseling for mid-aged and older adults: a randomized controlled pilot trial. JMIR Mhealth Uhealth. 2017;5(3):e28.
- 12.Bellur S, DeVoss C. Apps and autonomy: perceived interactivity and autonomous regulation in mHealth applications. Commun Res Rep. 2018;35(4):314–324.
- 13.Yadav KN, Gabler NB, Cooney E, et al. Approximately one in three US adults completes any type of advance directive for end-of-life care. Health Aff (Millwood). 2017;36(7):1244–1251.
- 14.De Vleminck A, Pardon K, Beernaert K, et al. Barriers to advance care planning in cancer, heart failure and dementia patients: a focus group study on general practitioners’ views and experiences. PLoS One. 2014;9(1):e84905.
- 15.Lund S, Richardson A, May C. Barriers to advance care planning at the end of life: an explanatory systematic review of implementation studies. PLoS One. 2015;10(2):e0116629.
- 16.Sudore RL, Lum HD, You JJ, et al. Defining advance care planning for adults: a consensus definition from a multidisciplinary Delphi panel. J Pain Symptom Manage. 2017;53(5):821–832.
- 17.Greenle MM, Morgan B, Sayani S, Meghani SH. Identifying mobile apps targeting palliative care patients and family members. J Palliat Med. 2018;21(10):1380–1385.
- 18.van der Smissen D, Overbeek A, van Dulmen S, et al. The feasibility and effectiveness of web-based advance care planning programs: scoping review. J Med Internet Res. 2020;22(3):e15578.
- 19.Eysenbach G; CONSORT-EHEALTH Group. CONSORT-EHEALTH: improving and standardizing evaluation reports of Web-based and mobile health interventions. J Med Internet Res. 2011;13(4):e126.
- 20.Stoyanov SR, Hides L, Kavanagh DJ, et al. Mobile app rating scale: a new tool for assessing the quality of health mobile apps. JMIR mHealth uHealth. 2015;3(1):e27.
- 21.Sudore RL, Stewart AL, Knight SJ, et al. Development and validation of a questionnaire to detect behavior change in multiple advance care planning behaviors. PLoS One. 2013;8(9):e72465.
- 22.Flesch R. A new readability yardstick. J Appl Psychol. 1948;32(3):221–233.
- 23.Prabhu AV, Crihalmeanu T, Hansberry DR, et al. Online palliative care and oncology patient education resources through Google: do they meet national health literacy recommendations? Pract Radiat Oncol. 2017;7(5):306–310.
- 24.Glover TL, Wolz A, Meiring AM, Resnick W, Clarizio AM, Baron-Lee J. Advance care planning: senior center members’ perceptions of a death over dinner event. J Palliat Med. 2018;21(7):892.
- 25.Blackwood DH, Walker D, Mythen MG, Taylor RM, Vindrola-Padros C. Barriers to advance care planning with patients as perceived by nurses and other healthcare professionals: a systematic review. J Clin Nurs. 2019;28(23-24):4276–4297.
- 26.Simon J, Porterfield P, Bouchal SR, Heyland D. ‘Not yet’ and ‘Just ask’: barriers and facilitators to advance care planning—a qualitative descriptive study of the perspectives of seriously ill, older patients and their families. BMJ Support Palliat Care. 2015;5(1):54–62.
- 27.Jones C, O’Toole K, Jones K, Bremault-Phillips S. Quality of psychoeducational apps for military members with mild traumatic brain injury: an evaluation utilizing the mobile application rating scale. JMIR mHealth uHealth. 2020;8(8):e19807.
- 28.Zhang M, Cheow E, Ho CS, Ng BY, Ho R, Cheok CCS. Application of low-cost methodologies for mobile phone app development. JMIR mHealth uHealth. 2014;2(4):e55.
- 29.Gell NM, Rosenberg DE, Demiris G, LaCroix AZ, Patel KV. Patterns of technology use among older adults with and without disabilities. Gerontologist. 2015;55(3):412–421.
- 30.Lee C, Coughlin JF. PERSPECTIVE: Older adults’ adoption of technology: an integrated approach to identifying determinants and barriers. J Prod Innov Manage. 2014;32(5):747–759.