Abstract
Importance: Patient-reported outcome measures (PROMs) are used in rehabilitation to evaluate outcomes. We integrated a new PROM for transition-age youth with intellectual and/or developmental disabilities (IDD), the Pediatric Evaluation of Disability Inventory–Patient-Reported Outcome (PEDI–PRO), with a computer-delivered survey platform (Accessible Testing Learning and Assessment System) to enhance cognitive accessibility.
Objective: To evaluate the usability of the PEDI–PRO software and to investigate its reliability and acceptability to transition-age youth with IDD.
Design: Clinical field testing and a survey; repeated-observation test–retest design.
Setting: Clinicians evaluated the PEDI–PRO’s usability in school and health care contexts; research staff conducted reliability and acceptability testing in natural settings.
Participants: Occupational therapists (n = 12) and physical therapists (n = 2) administered the PEDI–PRO to 39 youths with IDD. Fifty-five transition-age youth with IDD (M age = 19.7 yr) completed the PEDI–PRO twice.
Outcomes and Measures: Clinicians completed the System Usability Scale (SUS) and provided open-ended feedback. Youth provided feedback via a brief survey.
Results: The mean SUS rating was 84.00 (SD = 11.68), exceeding the industry standard. Intraclass correlations ranged from .80 to .83 across the three PEDI–PRO domains. Internal consistency (α) was .86–.90 across domains. Youth reported that they liked the accessibility features: interface images, button sounds, read-aloud audio, and rating category choices (M = 88.8%, SD = 5.1%).
Conclusions and Relevance: The PEDI–PRO supported transition-age youth with IDD to reliably report perceived functional performance. The accessible software was favorably perceived by both clinicians and youth.
What This Article Adds: Design features of the PEDI–PRO make it easy to use in practice with transition-age youth with IDD. The PEDI–PRO’s cognitively accessible administrative design, including step-by-step instructions for teaching PROM use and a self-reflective questioning technique, could serve as a training model for this and other PROMs.
Evaluating outcomes from the perspectives of clients with disabilities is one way to operationalize occupational therapy’s value for and commitment to client-centered practice. Patient-reported outcome measures (PROMs) are increasingly used at the population level with children and adults to evaluate health and rehabilitation outcomes. Given that 3.8 million children and youth with health needs, including those with intellectual and/or developmental disabilities (IDD),¹ receive rehabilitation services such as occupational therapy and physical therapy (Child and Adolescent Health Measurement Initiative, 2017), evaluating the impact of rehabilitation services on children’s and youths’ everyday lives by means of PROMs is critical.
Most PROMs, however, are not accessible for youth with IDD. The nature of cognitive impairment varies widely in the population with IDD, from global impairments experienced by youth with intellectual disability (IQ < 70) to executive functioning impairments among youth with autism spectrum disorder. For youth with IDD, the cognitive demands required for self-reports pose a challenge. Youth must comprehend the question, evaluate their own experiences related to the question, and determine which response category optimally matches that evaluation. This process requires youth to execute cognitive processes involving attention, working memory, long-term memory, and judgment (Beddow, 2012; Fujiura, 2012). In addition, many youth with IDD and related cognitive impairments may have limited literacy skills (Conners, 2003). Most PROMs, which are predominantly paper-and-pencil based, therefore remain inaccessible to youth with IDD.
Technology is increasingly used to enhance the accessibility of educational tests (Beddow, 2012), but this vision has yet to be fully realized in the development of PROMs of function and participation for youth with IDD. Cognitive accessibility is present when PROM design anticipates variability in youths’ cognitive abilities and, to the greatest extent possible, reduces cognitive demands and supports cognitive processes to enable youth with a range of cognitive abilities to interpret and respond to assessment items as intended (Kramer & Schwartz, 2017a). Examples of cognitively accessible design strategies in technology include streamlined program flow, the use of multimedia to maximize content comprehension and independent operation, and error minimization techniques (e.g., auditory feedback, confirmation buttons; Davies et al., 2017). The Accessible Testing Learning and Assessment System (ATLAS) is a proprietary survey platform designed by AbleLink Smart Living Technologies (Colorado Springs, CO). The platform incorporates cognitive accessibility design strategies and has the capacity to administer questions and Likert responses. There is evidence that when adults with IDD use ATLAS, they have significantly fewer errors and shorter assessment times than with a traditional written assessment format (Davies et al., 2017; Stock et al., 2004). Adults with IDD have also independently used the ATLAS system to report satisfaction with medical appointments (White et al., 2015). In this study, we built on the ATLAS survey engine to design a PROM software system for youth with IDD.
Description of the Software System
The Pediatric Evaluation of Disability Inventory–Patient-Reported Outcome (PEDI–PRO) is a PROM designed for transition-age youth with IDD ages 14–22 yr. It has three domains—Daily Activities (23 items), Social/Cognitive (18 items), and Mobility (16 items)—that assess the functional performance of discrete tasks required to participate in three everyday life situations: getting ready in the morning, going to a restaurant, and working at a job. Youth use a 3-point response scale to report whether tasks are very easy, a little easy, or a little hard. Participation in these everyday life situations is crucial for youth with IDD who are preparing for transition to adulthood. Items and the response scale were developed with direct input from a team of youth collaborators, other transition-age youth, and clinicians (Kramer & Schwartz, 2018). Previous research also established that youth with IDD interpreted the initial PEDI–PRO items and response scale in the intended manner (Kramer & Schwartz, 2017b). However, additional research is needed to evaluate the usability and accessibility of these items when administered on an accessible PROM software platform; hence, we designed this study.
PEDI–PRO and Cognitive Accessibility
We previously described (Kramer & Schwartz, 2017a) an evidence-based and theoretically driven framework for operationalizing cognitive accessibility in PROM design, and we followed this framework to design the three components of the PEDI–PRO software: content, layout, and administration procedures (Beddow et al., 2010; Dolan et al., 2013; Kramer & Schwartz, 2017a).
Content is accessible when youth with IDD interpret PROM items in the intended manner. PROM design must comprehensively consider the semantic and pragmatic components of language that convey meaning. The PEDI–PRO facilitates content accessibility by referencing specific locations or everyday life situations when asking about function; using familiar, short words (e.g., talk and listen instead of communicate); and using simple sentence structure and verbs. The PEDI–PRO also enhances cognitive accessibility by including a realistic three-dimensionally rendered image with each item. Images may facilitate accurate interpretation of items when they depict essential visual information necessary for comprehension (Centers for Disease Control and Prevention, 2009; Mencap, 2000). We designed the image specifications described in Table 1 to facilitate accurate item interpretation.
Table 1.
Image Specifications for the PEDI–PRO
| Image Feature | Specification | Example Images |
| --- | --- | --- |
| People | | “Wait for my turn to talk to the waiter or waitress.”; “Type on a computer keyboard.” |
| Body positioning | Depict functional trunk, limb, hand, foot, and head and neck postures while sitting, standing, walking, and manipulating and carrying objects. | “Open a taped box with scissors.”; “Walk up stairs to the next floor.” |
| Background and props | | “Slide into a booth.”; “Put books, videos, papers, or files in alphabetical order.” |
Note. PEDI–PRO = Pediatric Evaluation of Disability Inventory–Patient-Reported Outcome.
Reading and navigating a PROM involves perceptual, visual–motor, and short-term memory demands (Harniss et al., 2007; Kramer & Schwartz, 2017a). The PEDI–PRO’s layout reduces these visual perception demands by presenting only one item per screen, placing item text directly above the corresponding image, and placing response choices adjacent to each item. The PEDI–PRO also enhances accessibility by using a sans serif font with a regular typeface (no italics), maintaining more than 35% white space on the screen, and maximizing visual contrast (Dolan et al., 2013; Friedman & Bryen, 2007; Kramer & Schwartz, 2017a; Mencap, 2000).
Designing administration procedures that can adapt to each youth’s preferences and needs can enhance motivation and attention during PROM completion, subsequently improving the quality of the responses (Dolan et al., 2013; Friedman & Bryen, 2007; Kramer & Schwartz, 2017a). PROM administration procedures must also anticipate youth respondents who have varying sensory, motor, or communication abilities. The PEDI–PRO software operates on both desktop and mobile platforms using standard operating systems (e.g., Apple, Microsoft), which allows clinicians to incorporate youths’ preferred accessories or adaptive devices (e.g., mouse, touch stylus, eye control software) as needed. The software automatically reads all text and directions to reduce literacy and visual demands. It supports motivation by incorporating encouraging phrases throughout administration and reminds youth that there is no “right” answer. The PEDI–PRO also provides examples and practice questions to teach youth how to use the software and response categories as independently as possible. Finally, administration includes the use of think-aloud questions to enhance youth reflection and recall and ensure youth are using the PEDI–PRO in the intended manner. Three types of questions are used during administration:
What is the item about? (“How do you [item]?”)
How do youth perform the activity? (“Is it hard or easy to [item]? Why?”)
What response matches the youth’s performance? (“Tell me why you picked [selected response].”)
If youths’ responses suggest that they do not understand how to match PEDI–PRO response categories to their performance of the assessed task, the example and practice sections are readministered.
Study Aims
In this study, we aimed to evaluate clinicians’ perspectives of the PEDI–PRO software’s usability in practice environments and to establish the reliability and acceptability of the PEDI–PRO software. We hypothesized that the PEDI–PRO software’s cognitively accessible design would support its usability in practice and facilitate reliable responses by youth with IDD.
Method
Design
We used a two-part design. First, we conducted a preliminary evaluation of the clinical usability of the PEDI–PRO software in practice environments with clinical field testing and a survey. Second, we used a repeated-observation test–retest design with controlled administration to evaluate the reliability of youths’ responses when using the PEDI–PRO software. All study procedures were reviewed and approved by the Boston University institutional review board before recruitment and data collection.
System Usability Scale
Participants
We recruited clinicians through professional networks, social media, and a database of participants from a previous study of the use of youth PROMs (Kramer & Schwartz, 2018). The inclusion criteria were being an occupational therapist or physical therapist and being able to administer the prototype to three English-speaking youth with IDD ages 14–22 yr. A total of 14 clinicians participated, representing different disciplines and practice environments (Table 2). The clinicians administered the PEDI–PRO to 39 youth with IDD (Table 3).
Table 2.
Characteristics of Field Clinicians Completing the System Usability Scale (n = 14)
| Characteristic | n (%) |
| --- | --- |
| Professional background | |
| Occupational therapist | 12 (85.7) |
| Physical therapist | 2 (14.3) |
| Clinical practice setting | |
| School | 9 (64.3) |
| Outpatient rehabilitation | 4 (28.6) |
| Other | 1 (7.1) |
| Years of practice experience, M (SD) | 16.3 (11.0) |
| Gender identityᵃ | |
| Female | 11 (78.6) |
| Male | 2 (14.3) |
| Racial identityᵃ | |
| White | 13 (92.9) |
| Non-White | 0 |

ᵃData not reported by 1 participant; percentages calculated out of 14.
Table 3.
Characteristics of Youth (n = 39) Administered the PEDI–PRO by Field Clinicians
| Characteristic | n (%) |
| --- | --- |
| Age group, yr | |
| 14–16 | 21 (53.8) |
| 17–18 | 12 (30.8) |
| 19–22 | 6 (15.4) |
| Developmental disabilityᵃ | |
| Autism spectrum disorder | 18 (46.2) |
| Down syndrome | 4 (10.3) |
| Cerebral palsy | 3 (7.7) |
| Intellectual disability | 14 (35.9) |
| Other developmental disability | 8 (20.5) |
| IQ group | |
| 40–49 | 12 (30.8) |
| 50–59 | 10 (25.6) |
| 60–70 | 17 (43.6) |
| Racial identity | |
| White | 24 (61.5) |
| African American or Black | 1 (2.6) |
| Asian | 2 (5.1) |
| Hispanic | 8 (20.5) |
| Multiple races | 3 (7.7) |
| Other racial identity | 1 (2.6) |

Note. PEDI–PRO = Pediatric Evaluation of Disability Inventory–Patient-Reported Outcome.
ᵃClinicians reported multiple disability diagnoses for some youth.
Procedures
Clinicians downloaded a beta version of the PEDI–PRO software and were provided with standardized administration directions and a brief, 5-min informational video. Clinicians administered the PEDI–PRO to three youth with IDD and then completed an online survey. Limiting the administrations to three youth ensured that the usability evaluation would capture the process of learning to use the software (McLellan et al., 2012). Clinicians could choose to administer the PEDI–PRO to any client ages 14–22 yr with IDD.
The System Usability Scale (SUS; Bangor et al., 2008) was used to evaluate usability. The SUS is a 10-item instrument and an industry standard in software usability evaluation. Clinicians indicate the degree to which they disagree or agree with each statement on a scale ranging from 1 (strongly disagree) to 5 (strongly agree). The survey administered to clinicians also included two open-ended questions regarding usability: “Describe what would make it hard to adopt the PEDI–PRO and what would make it easier” and “Provide additional comments about the usability of the PEDI–PRO.” Clinicians received an honorarium after completing the survey.
Analysis
The SUS scoring algorithm reverse-scores the negatively worded items so that a standard score can be calculated (Usability.gov, n.d.). Scores range from 0 to 100, with higher scores indicating greater usability; the accepted industry benchmark for a usable system is a mean score of 68 or higher (Bangor et al., 2008). We content-coded open-ended feedback as a strength or challenge related to content, layout, or administrative accessibility.
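For readers unfamiliar with the SUS algorithm, the standard transformation (Brooke, 1996) can be sketched as follows; this is a generic illustration of the published scoring rules, not code from the study:

```python
def sus_score(responses):
    """Standard SUS scoring: ten 1-5 Likert responses -> a 0-100 score.
    Odd-numbered items are positively worded (item score = response - 1);
    even-numbered items are negatively worded (item score = 5 - response).
    The summed item scores are multiplied by 2.5 to scale to 0-100."""
    if len(responses) != 10:
        raise ValueError("The SUS has exactly 10 items.")
    item_scores = [
        (r - 1) if i % 2 == 1 else (5 - r)
        for i, r in enumerate(responses, start=1)
    ]
    return sum(item_scores) * 2.5
```

For example, a clinician who answered 5 (strongly agree) on every positively worded item and 1 (strongly disagree) on every negatively worded item would receive the maximum score of 100.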
Interface Reliability and Acceptability
Participants
We recruited youth with IDD ages 14–22 yr from a large metropolitan area in New England. Inclusion criteria were as follows: youth with IDD ages 14–22; impaired intellectual functioning as indicated by an IQ of less than 85, executive function impairments secondary to IDD, or both; and English speaking. A total of 55 youth completed the repeated (test–retest) observations (Table 4).
Table 4.
Test–Retest Participants (n = 55)
| Characteristic | n (%) |
| --- | --- |
| Age, yr, M (SD) | 19.7 (1.7) |
| Gender identity | |
| Female | 22 (40.0) |
| Male | 33 (60.0) |
| Developmental disability | |
| Autism spectrum disorder | 17 (30.9) |
| Down syndrome | 7 (12.7) |
| Cerebral palsy | 4 (7.3) |
| Other IDD | 27 (49.1) |
| IQ range | |
| 30–39 | 1 (1.8) |
| 40–49 | 6 (10.9) |
| 50–59 | 9 (16.4) |
| 60–70 | 13 (23.6) |
| 71–85 | 10 (18.2) |
| >85 | 10 (18.2) |
| IQ not availableᵃ | 6 (10.9) |
| Ethnic identityᵇ | |
| Hispanic or Latinx | 4 (7.3) |
| Not Hispanic or Latinx | 48 (87.3) |
| Racial identityᶜ | |
| White | 35 (63.6) |
| Asian | 8 (14.5) |
| African American or Black | 4 (7.3) |
| Multiple races | 3 (5.5) |
| Other racial identity | 4 (7.3) |

Note. Percentages may not total 100 because of rounding. IDD = intellectual and/or developmental disabilities.
ᵃPercentile scores were provided for 3 youths; all were <10% across domains. ᵇData missing for 3 youths; percentages do not add to 100%. ᶜData missing for 1 youth; percentages do not add to 100%.
Procedures
Trained assessors (six staff—two registered occupational therapists, three occupational therapy doctoral students, and one undergraduate student) administered the PEDI–PRO following the standardized administration directions. The PEDI–PRO was administered in familiar, low-distraction environments, such as classrooms, libraries, and school offices. Youth first completed the PEDI–PRO’s standardized teaching and example items. Specific to this study, the three everyday life situations were next administered in a random order. Youth could take a break as needed between each everyday life situation. This process was repeated 1–2 wk later; the administration order of everyday life situations was counterbalanced across Observations 1 and 2. At the end of the second administration, youth completed a survey about specific features of the PEDI–PRO (ratings of “like” or “don’t like”) and provided open-ended feedback. The survey was designed in collaboration with the team of youth with IDD who partnered in the PEDI–PRO’s development (Kramer & Schwartz, 2018) and was designed to be accessible. One youth did not complete the second session because of illness; the reported results are for those who completed both sessions. Youth received an honorarium for their participation.
Analysis
Item responses were organized into Daily Activities, Social/Cognitive, and Mobility domains. We used two methods to evaluate response reliability. First, we calculated the percentage of exact agreement for each youth, that is, the percentage of items for which the youth selected the same response category across both observations. We calculated the mean percentage of exact agreement for each domain in two ways: (1) for all respondents and (2) for all respondents with more than 50% exact agreement. Second, we calculated the consistency of domain scores obtained across observations. We used a Rasch rating scale model to obtain youth interval-level scores from the original Likert response scale (Tennant & Conaghan, 2007). Rasch item difficulty estimates were first calculated separately for Observation 1 and Observation 2; only one item, in the Daily Activities domain, had significantly different difficulty estimates across observations, and it was removed from the analysis. We then used Observation 1 item difficulty estimates to calculate youth domain scores for both observations on each domain. An absolute-agreement intraclass correlation coefficient (ICC[3,1]) was used to investigate the reliability of domain scores across testing occasions; ICCs of .75 or higher are considered reliable (Koo & Li, 2016).
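The two reliability indices described above can be sketched in Python. The ICC function below computes a single-measures, two-way, absolute-agreement ICC from ANOVA mean squares, one plausible reading of the reported model; it is an illustration with names of our choosing, not the authors' analysis code, which may have used a dedicated statistics package:

```python
import numpy as np

def percent_exact_agreement(obs1, obs2):
    """Percentage of items with the identical response category at both observations."""
    obs1, obs2 = np.asarray(obs1), np.asarray(obs2)
    return 100.0 * float(np.mean(obs1 == obs2))

def icc_absolute_agreement(scores):
    """Single-measures, two-way, absolute-agreement ICC from ANOVA mean squares.
    scores: (n_subjects, k_occasions) array of domain scores."""
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    ss_rows = k * np.sum((scores.mean(axis=1) - grand) ** 2)    # between subjects
    ss_cols = n * np.sum((scores.mean(axis=0) - grand) ** 2)    # between occasions
    ss_err = np.sum((scores - grand) ** 2) - ss_rows - ss_cols  # residual
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + (k / n) * (ms_cols - ms_err)
    )
```

Because the model penalizes absolute disagreement, a youth whose Observation 2 scores are uniformly shifted from Observation 1 lowers the ICC even when the rank ordering of subjects is preserved.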
Results
System Usability Scale
The mean SUS rating was 84.00 (SD = 11.68), which exceeds the industry standard of 68 (Bangor et al., 2008). A median rating of strongly agree was provided for the positively worded SUS questions “I thought the PEDI–PRO was easy to use” and “I would imagine that most people would learn to use the PEDI–PRO very quickly.” A median rating of strongly disagree was provided for the negatively worded SUS questions “I found the PEDI–PRO unnecessarily complex,” “I think that I would need the support of a technical person to be able to use the PEDI–PRO,” “I thought there was too much inconsistency in the PEDI–PRO,” “I found the PEDI–PRO very cumbersome to use,” and “I needed to learn a lot of things before I could get going with the PEDI–PRO.”
Clinicians reported that youth liked the images and that the images supported comprehension (e.g., content accessibility). Only one clinician reported that the images were “not realistic at all” (ID 13). The most frequently reported content accessibility challenge occurred when items referred to a functional task that was unfamiliar to an individual youth. For example, clinicians reported that some youth did not use cellphones or debit cards or did not complete specific types of work tasks (e.g., carrying heavy boxes). In these instances, clinicians observed that youth had a difficult time selecting the appropriate response category.
Clinicians reported that the PEDI–PRO navigation buttons (e.g., response choices, “next” button) were “super easy to use” (ID 10). Clinicians generally observed that youth navigated the software independently. However, numerous clinicians reported that they had to slow youth down to keep them from moving too quickly between items.
Clinician feedback was overwhelmingly positive regarding the adaptability of the PEDI–PRO to individual user needs. Several clinicians indicated that the availability of the software across platforms (e.g., Apple/iPad and Microsoft/desktop) enhanced the usability. The automatic text-to-speech was well received by youth; one clinician explained, “Clients could understand the auditory instructions and [were] able to use a mouse to choose the button” (ID 27). Youth could also choose to input their responses using a variety of methods: touchscreen, mouse, or instructing the administrator to select the response. A majority of clinicians reported that youth were highly motivated to respond on the tablet or computer.
Several clinicians took advantage of flexibility allowed in the PEDI–PRO administration protocol to provide extra support and time. Clinicians shared, “It took extra time and verbal cueing to make sure they were picking the best response” (ID 29) and “The young adults did not seem to mind going through the questions, but doing all three sections at one sitting was tiresome. So being able to take breaks is a nice option” (ID 15). One clinician commented that she needed to provide support during administration “because of a lack of confidence—they are not used to making choices” (ID 13).
Interface Reliability and Acceptability
The test–retest reliability of domain scores, when considering the 95% confidence interval (CI), ranged from moderate to good (Koo & Li, 2016): Daily Activities, ICC(3,1) = .81, 95% CI [.68, .89]; Social/Cognitive, ICC(3,1) = .83, 95% CI [.67, .91]; and Mobility, ICC(3,1) = .80, 95% CI [.68, .88]. The mean percentage of exact agreement in each domain showed that the majority of youth consistently selected the same response to each item across observations: Daily Activities, M = 78.8%, SD = 14.2%; Social/Cognitive, M = 66.5%, SD = 18.3%; and Mobility, M = 69.9%, SD = 21.7%. We observed 3, 9, and 8 youths with percentage of exact agreement scores of less than 50% in the Daily Activities, Social/Cognitive, and Mobility domains, respectively. Seven youths had less than 50% agreement in only one domain; these youths ranged in age (15.4–21.5 yr), IQ score (45–80), and disability (autism spectrum disorder, n = 3; Down syndrome, n = 2; cerebral palsy, n = 1; general IDD, n = 1). Two youths had less than 50% agreement across all three domains, and 5 youths had less than 50% agreement across two domains. Of these youths, 1 had an IQ score in the 30–39 range, and 2 had IQ scores in the fourth percentile or lower (2 youths did not have IQ information).
Almost all youth reported that they liked the following PEDI–PRO features: interface images, button sounds, item read-aloud audio, and rating category choices (range of “like” ratings = 84%–98%; M = 88.8%, SD = 5.1%). Youths’ open-ended feedback indicated the videos and pictures supported their understanding and “show[ed] you [what] you want to do” (ID T338); some youth suggested using higher quality computer animation or images of real people. The audio was also well received because it made “it easy in case you don’t like reading” (ID T349). Most youth shared that they liked the rating category choices; some youth reported it could be difficult to select a response category, and others thought the choices were easy to use. Several youth suggested adding an additional fourth response choice of “very hard.”
Discussion
Our findings provide initial support for our hypothesis that a cognitively accessible PEDI–PRO software design would enhance usability in practice and facilitate reliable responses by youth with IDD. The SUS ratings far exceeded industry standards for usability, with clinicians reporting median ratings of strongly agree for the PEDI–PRO’s overall ease of use and the ability to learn how to use it quickly. Our reliability results provide initial evidence that the PEDI–PRO produces stable scores: the Daily Activities, Social/Cognitive, and Mobility domain scores had good reliability that matched or exceeded the reliability reported for other self-reports for adults with IDD (Bredemeier et al., 2014; McGillivray et al., 2009; Stancliffe et al., 2014). In combination, these results suggest that the PEDI–PRO’s operationalization of cognitively accessible content, layout, and administrative design features results in a PROM that can feasibly be used in rehabilitation practice with transition-age youth with IDD.
The ultimate goal of a PROM is to incorporate the direct perspectives of youth with IDD into intervention planning and rehabilitation outcome assessment. Even with the cognitively accessible PEDI–PRO software, clinicians’ feedback suggests that self-evaluation can be challenging for some youth with IDD. Clinicians reported that youth could most easily self-evaluate when PEDI–PRO items assessed familiar and relevant activities. These findings point to the need for robust item pools (i.e., sets of items) that can be individualized to a youth’s age, experiences, and interests. Delivering PROMs via software makes this more feasible than with traditional paper-and-pencil assessment formats. A user interface that allows users to preselect relevant content at the participation level (e.g., employment, school), activity level (e.g., does the youth use a debit card?), or body structure and function level (e.g., does the youth use a mobility device?) could provide optimal individualization and thus enhance not only accessibility but also validity of PROM results. The PEDI–PRO also provides a “skip” option to allow youth to skip questions that are not applicable, but youth may require additional training to understand how and when to use this response. Future PEDI–PRO research should incorporate expanded item pools and administrative interfaces that allow for this type of individualization.
To further explore the utility of the PEDI–PRO as an intervention planning tool, we used an additional indicator of reliability, percentage of exact agreement for responses to each item. Clinicians may use youth responses to individual PEDI–PRO items to identify goals for intervention and evaluate progress. The PEDI–PRO functional domains had 72.1%–80.8% exact agreement across observations. We did observe a pattern in which 3 youths with poor agreement across multiple domains were in the lowest IQ range. However, we note that our sample included other youth with low IQ scores who had exact agreement above 50% (7 youths with an IQ of 30–49; 9 youths with an IQ of 50–59). Thus, these findings provide further support that with cognitively accessible PROM design, many youth with IDD can provide consistent evaluations of their performance.
Limitations and Future Research
This study used a small convenience sample. As such, we were not able to analyze relationships between reliability and youth characteristics such as age and IQ. Clinicians may not have provided in-depth feedback about all features of the PEDI–PRO as a result of our broad, open-ended questions, and we did not collect data about each clinician’s familiarity and comfort with technology. Future research will examine the reliability of the PEDI–PRO with a larger sample and additional items. Future work will be necessary to identify how to select and deliver individualized item sets in a feasible and standardized manner using computer technology and item response theory approaches.
Implications for Occupational Therapy Practice
When clinicians consider using PROMs in practice with youth with IDD, they are faced with instruments that may not be accessible by youth with cognitive impairments (Schwartz et al., 2018). When clinicians have PROMs that are inaccessible, they express uncertainty about whether modifications to enhance accessibility will have a negative impact on the assessment’s psychometric properties (Kramer et al., 2012). In contrast, the clinicians and youth in this study reported that, although they encountered challenges, youth with a range of abilities could use the PEDI–PRO. The findings of this study, although specific to the PEDI–PRO, highlight the promise of accessible PROM software in rehabilitation practice. Notably, the results of this study have the following implications for occupational therapy practice:
The PEDI–PRO’s cognitively accessible administrative design includes step-by-step instructions for teaching PROM use and a self-reflective questioning technique that was feasibly implemented in practice. This design could serve as a training model for this and other PROMs.
Across clinicians and youth, the most notable accessibility design features of the PEDI–PRO were the use of images to support item comprehension, intuitive navigational structure, and automatic text-to-speech. Other assessment developers may find that incorporating these features will enhance the usability and reliability of PROMs.
Conclusion
Our results provide initial evidence that the PEDI–PRO software is easy to use in clinical practice, and the interface can be used in a reliable manner by transition-age youth with IDD. The PEDI–PRO, with its cognitive accessibility features, appears to support reliable reporting of daily activities, social–cognitive, and mobility function by youth with IDD, in contrast to many paper-and-pencil assessment tools that have historically been used with this population. Therefore, inclusion of cognitive accessibility design features may be critical for the adoption and effective utilization of PROMs for people with IDD.
Acknowledgments
We thank the Pediatric Evaluation of Disability Inventory–Patient-Reported Outcome (PEDI–PRO) Youth Team who partnered with us in the design of the PEDI–PRO: Alice, Brendan Durkin, Jacob, Mariana Vetoulis-Acevedo, and Sierra Wheaton-Williams (names used with permission). We also thank the clinicians and youth who generously gave their time to participate in this study. We recognize the valuable contributions of research assistants Kimberly Greenberg, Brice Hounshel, Tara Loeper, and Reed Kotz. The research reported in this article was supported by the Eunice Kennedy Shriver National Institute of Child Health and Human Development of the National Institutes of Health (Grant R41HD090772). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. AbleLink will pursue commercialization opportunities for the PEDI–PRO after the research phases.
Footnotes
In this article, we use the term IDD to refer to any type of disability with onset before age 22 yr that leads to an individual needing support in three or more major life areas, as defined by the Developmental Disabilities Assistance and Bill of Rights Act of 2000 (Pub. L. 106-402).
Contributor Information
Jessica M. Kramer, PhD, OTR/L, is Associate Professor, Department of Occupational Therapy, University of Florida, Gainesville. At the time of the study, Kramer was Associate Professor, Department of Occupational Therapy, Boston University, Boston, MA; jessica.kramer@phhp.ufl.edu
Ariel E. Schwartz, PhD, OTR/L, is Postdoctoral Fellow, Center for Psychiatric Rehabilitation, Boston University, Boston, MA.
Daniel K. Davies, MA, is Founder and President, AbleLink Smart Living Technologies, Colorado Springs, CO.
Steven E. Stock, MPA, is Vice President, AbleLink Smart Living Technologies, Colorado Springs, CO.
Pengsheng Ni, MD, MPH, is Research Associate Professor, Biostatistics and Epidemiology Data Analytics Center, Department of Health Law, Policy and Management, Boston University, Boston, MA.

[Figure: Sample PEDI–PRO items]
Wait for my turn to talk to the waiter or waitress.
Type on a computer keyboard.
Open a taped box with scissors.
Walk up stairs to the next floor.
Slide into a booth.
Put books, videos, papers, or files in alphabetical order.