Abstract
Background:
Patient and family engagement is thought to improve the quality and relevance of child health research. We developed and evaluated the usability of Patient Engagement 101, an e-learning module designed to strengthen the patient-oriented research readiness of health care professionals, researchers, trainees and other stakeholders.
Methods:
The development of Patient Engagement 101 was co-led by a parent and a researcher and overseen by a diverse multistakeholder steering committee. The module was refined and evaluated using a mixed-methods usability testing approach with 2 iterative cycles of semistructured interviews, observations and questionnaires. We collected module feedback by way of semistructured interviews, the validated System Usability Scale, and satisfaction, knowledge and confidence questionnaires. Thematic coding of transcripts and field notes, informed by team discussions, guided the module revisions.
Results:
Thirty end-users completed usability testing (15 per cycle). In each cycle, we modified the module's content, learner experience, learner-centred design and aesthetic design. Participants were highly satisfied, and System Usability Scale scores indicated the module had the best imaginable usability. Participants' knowledge test scores and confidence to engage in patient-oriented research, but not their self-rated knowledge, increased substantially after module completion.
Interpretation:
Codevelopment with patients and caregivers, and refinement through comprehensive end-user testing, resulted in a training resource with exceptional usability that improved knowledge and confidence to engage in patient-oriented research in child health. Patient Engagement 101 is openly available online, and the methods used to develop and evaluate it may facilitate the creation and evaluation of similar capacity-building resources.
Plain language summary:
Why we did this research
Research teams that include patients and family members can create research that is more useful and of better quality. However, some researchers do not know how to work with patients and families. We wanted to teach researchers how to involve patients and families on their team.
What we did
Our team built an online course called “Patient Engagement 101.” We wanted this course to teach researchers the knowledge, skills, and attitudes they need to involve patients in research. To test how well the course worked, we asked 15 people to complete it and tell us what they learned, how satisfied they were with the training, and how confident they felt to involve patients in the future. We changed the course based on their feedback and asked another 15 people to complete it to see if it improved.
What we found
People who completed the new course were happy with it and thought that it was user-friendly. They also said that they felt confident and knowledgeable to work with patients and families in the future.
Take-home message
Our team created a useful online course for clinicians and researchers who want to learn how to involve patients and family members on their research team. To take the course, please visit www.porcch.ca.
Patient-oriented research aims to improve health outcomes by engaging patients and the public as active partners in all stages of research.1 Partnering with patients and their caregivers in child health research can add value and applicability to the research, improve study design, increase its relevance and broaden dissemination.2–4 However, patient engagement involves unique challenges, such as recruiting diverse partners, managing power imbalances and skill gaps, merging varying perspectives and priorities, and sustaining engagement throughout the research process, all of which require additional competencies of clinician-investigators and other research team members.2,5,6 Ensuring all stakeholders are trained and supported to partner effectively is critical to expanding patient-oriented research capacity.7
Tailored training for researchers has been associated with several benefits, including a deeper understanding of patient-oriented research, improved awareness of relevant tools and strategies, and greater self-efficacy to engage patients and carry out specific engagement activities.8–10 Although surveys of researchers show broad interest in patient engagement–related training, several barriers, such as lack of time, unavailability of training opportunities and uncertainty regarding the relevance of patient engagement to their research, have been identified.5 In addition, there is a need for pediatric-specific resources that address the complexities of patient engagement in a child health context.2,11 To help bridge this gap, we developed the Patient-Oriented Research Curriculum in Child Health (PORCCH), an open-access online curriculum to strengthen capacity in patient-oriented research in child health, with specialized modules for different stakeholder groups.12 Online learning (e-learning) confers several benefits compared with in-person learning, such as scalability and cost-effective dissemination, remote and asynchronous learning, and the capacity for learners to tailor education to their learning needs.13 Usability, which denotes the effectiveness, efficiency and satisfaction with which users interact with a system for a particular purpose, is key to the evaluation of e-learning quality.14

Our study aimed to develop, refine and evaluate the usability of Patient Engagement 101, the PORCCH e-learning module intended to explain key principles, review practical aspects and best practices, and stimulate research readiness for patient engagement among health care professionals, researchers, trainees and other stakeholders. The module was codeveloped through a shared responsibility between clinicians, researchers, and patients and caregivers, in accordance with the International Association for Public Participation (IAP2) definition of collaboration.15
Methods
Study design and setting
We used a mixed-methods usability testing approach, with 2 cycles of semistructured interviews, observations and questionnaires, to develop, refine and evaluate the module. This approach, which has been employed previously for e-learning resources,16,17 involves implementing a design, identifying issues and areas for improvement through end-user testing and thematic analysis of feedback, and making modifications iteratively.18 Our study was conducted at an academic children’s hospital in Toronto, Ontario, with virtual meetings and phone calls used to connect steering committee members and participants residing in other provinces. This report follows the Guidance for Reporting Involvement of Patients and the Public (GRIPP2) reporting checklist.19
Development of Patient Engagement 101
Patient Engagement 101 comprises 2 complementary parts. Part 1 (Foundations of Patient Engagement) includes an overview of international initiatives promoting patient engagement, values and goals of patient engagement, and key elements of effective engagement. Part 2 (Patient Engagement in Practice) discusses practical aspects, unique challenges and various methods of patient engagement. The module was created with Storyline 360 (Articulate Global Inc.), an e-learning development application, following Mayer’s principles of multimedia design20 and plain language writing recommendations.21 Each part takes about 30 minutes to complete and includes interactive tools, video vignettes, assessment exercises, links to additional resources and certificates of completion.
Participants
Through study advertisements distributed by way of newsletters and email lists, English-speaking health care professionals, researchers, research staff, trainees, patients and family members were recruited from the Canadian Child Health Clinician Scientist Program, CHILD-BRIGHT Network, Strategy for Patient-Oriented Research (SPOR) SUPPORT Units and family advisory networks across Canada. A maximum variation purposive sampling approach was employed to ensure diversity in the testing group, particularly with respect to participants’ role (e.g., researcher, clinician-researcher, patient or caregiver), patient engagement in research experience and geographic location.22,23 Researchers and clinician-researchers, the module’s target audience, were predominantly sampled, alongside a smaller number of patients and caregivers for the complementary knowledge and perspectives of these key stakeholders.
Study procedures
Usability testing sessions were carried out by a research assistant with graduate training in psychology (G.A.M.), either in person or over the phone, between February and May 2020. At baseline, participants completed a demographic form (Appendix 1A, available at www.cmajopen.ca/content/10/4/E872/suppl/DC1) and a questionnaire on confidence to engage in patient-oriented research, developed according to Bandura’s framework for constructing self-efficacy scales (Appendix 1B).24 In addition, participants completed a multiple-choice knowledge test on patient engagement, designed to assess the “knows how” level (i.e., interpretation, application of knowledge) of Miller’s framework for assessing levels of clinical competence25 (Appendix 1C), and self-rated their knowledge of patient engagement on a 5-point Likert scale. These questionnaires were pilot-tested by 3 child health researchers or clinician-researchers to verify clarity and content validity.
Participants subsequently undertook a usability testing session during which they were asked to complete Patient Engagement 101 and think aloud about what they liked and disliked about the module. At various points, the research assistant asked questions to clarify participants’ think-aloud comments, elicit their understanding of the module and solicit suggestions for improvement. Field notes were also recorded.
After completing Patient Engagement 101, participants engaged in a semistructured interview (about 20 min) designed to elicit their perceptions of the module’s usability (Appendix 1D). Participants then completed the same confidence questionnaire, knowledge test and overall knowledge rating as at baseline. In addition, they completed an e-learning satisfaction questionnaire (Appendix 1E) and the System Usability Scale (SUS), a validated 10-item questionnaire for evaluating user satisfaction with technologies.26 System Usability Scale scores range from 0 to 100, with scores of 71.4, 85.5 and 90.9 corresponding to overall usability ratings of “good,” “excellent” and “best imaginable,” respectively.27
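As an illustration of how raw SUS responses translate into these scores, the sketch below (in R, the analysis language used in this study) applies the standard published scoring rule, in which odd-numbered items contribute (rating − 1), even-numbered items contribute (5 − rating), and the sum is multiplied by 2.5; the function names are illustrative, and the adjective mapping simply matches a score to the nearest of the anchors quoted above.27

```r
# Standard SUS scoring rule (Brooke, 1996): 10 items rated 1-5;
# odd-numbered (positively worded) items contribute (rating - 1),
# even-numbered (negatively worded) items contribute (5 - rating),
# and the summed contributions are multiplied by 2.5 to give 0-100.
sus_score <- function(ratings) {
  stopifnot(length(ratings) == 10, all(ratings %in% 1:5))
  odd  <- ratings[seq(1, 9, by = 2)] - 1
  even <- 5 - ratings[seq(2, 10, by = 2)]
  2.5 * sum(odd, even)
}

# Adjective anchors quoted above (reference 27); a score is matched
# to its nearest anchor for a rough overall usability label.
sus_adjective <- function(score) {
  anchors <- c(good = 71.4, excellent = 85.5, "best imaginable" = 90.9)
  names(anchors)[which.min(abs(anchors - score))]
}

sus_score(c(5, 1, 5, 2, 4, 1, 5, 1, 5, 2))  # 92.5
sus_adjective(92.0)                          # "best imaginable"
```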
After the first usability testing cycle, changes were made to the module prototype based on themes identified through content analysis of the usability testing sessions, field notes and questionnaires. A second cycle was then conducted to garner any further suggestions.
Patient engagement
Development of Patient Engagement 101 was co-led in equal partnership by a parent (F.B.) and a researcher with expertise in patient-oriented research (C.M.). Module co-leads met monthly over 9 months to identify and collate relevant peer-reviewed and grey literature, integrate it into the curriculum and draft the module. Feedback on module content, design and development was provided by a diverse steering committee that included 2 clinician-researchers, 2 SPOR SUPPORT Unit leads, 3 parent partners, a knowledge translation expert, an educational researcher and 2 instructional design experts. The parents on the steering committee and the parent co-lead all had lived experience with a child with a long-term health condition and were selected from established family advisory networks at 2 children’s hospitals. During usability testing, the entire study team met 2–3 times per cycle to clarify and critique module feedback and discuss and refine the evolving themes and framework.
Data analysis
Recorded testing sessions were transcribed verbatim, de-identified and coded using a qualitative data analysis program to facilitate data organization and analysis (Dedoose, SocioCultural Research Consultants). After each testing cycle, 2 coders (G.A.M., L.A.) read the transcripts to identify preliminary codes regarding usability (e.g., satisfaction, efficiency, learnability and errors), which were refined through systematic, iterative coding and sorting using the constant comparison method, and then grouped into usability-related themes and subthemes.28 We used the thematic framework from a previous e-learning usability study17 and published usability attributes29–32 to guide the analysis. To enhance the trustworthiness of the findings,33 we encouraged reflexivity by having the team of patients, clinicians and nonclinician researchers question and challenge each other’s assumptions in dialogue throughout the analysis. Team members not directly involved in coding reviewed the emerging themes and drew on their expertise in education, child health and research to help clarify and critique the findings. We purposively selected a sample size of 15 participants per usability testing cycle, a number considered sufficient from a usability testing perspective to achieve thematic saturation.34,35
Statistical analysis
We summarized data from the demographic, satisfaction, confidence and knowledge questionnaires using means, standard deviations (SDs) and proportions. Paired t tests were used to assess changes in confidence and knowledge before and after module completion (α = 0.05, 2-sided). Quantitative analyses were conducted in R version 4.0.0 (R Core Team).
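As a minimal sketch of this analysis, the R code below computes the summary statistics and runs a 2-sided paired t test; the `pre` and `post` vectors are hypothetical paired confidence scores (0–100) for 15 participants, included purely for illustration, not study data.

```r
# Hypothetical paired confidence scores (0-100) for 15 participants;
# illustrative values only, not the study data.
pre  <- c(42, 55, 61, 70, 38, 66, 73, 58, 49, 80, 62, 57, 45, 68, 77)
post <- c(85, 90, 88, 95, 79, 92, 96, 87, 83, 98, 91, 89, 84, 93, 97)

# Mean and SD of the pre-post difference, as reported in Table 4
mean(post - pre)
sd(post - pre)

# Two-sided paired t test at alpha = 0.05
t.test(post, pre, paired = TRUE, alternative = "two.sided")
```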
Ethics approval
The study received approval from the Research Ethics Board at The Hospital for Sick Children (Toronto).
Results
Usability testers for Patient Engagement 101 comprised 14 clinician-researchers in child health, 12 researchers in child health, 2 former pediatric patients and 2 caregivers (Table 1). We found that module completion time was similar across cycles, with a mean completion time of 65 (SD 12) minutes in cycle 1 and 71 (SD 12) minutes in cycle 2.
Table 1:
Participant characteristics
| Characteristic | No. of participants* in cycle 1,† n = 15 | No. of participants* in cycle 2,† n = 15 |
|---|---|---|
| Primary role | ||
| Caregiver | 1 | 1 |
| Child health clinician-researcher | 7 | 7 |
| Child health researcher | 6 | 6 |
| Patient | 1 | 1 |
| Sex | ||
| Female | 14 | 15 |
| Male | 1 | 0 |
| Geographic region | ||
| Alberta | 3 | 1 |
| British Columbia | 0 | 2 |
| Manitoba | 0 | 3 |
| Newfoundland and Labrador | 0 | 1 |
| Nova Scotia | 3 | 2 |
| Ontario | 9 | 6 |
| Highest education level completed | ||
| College or university | 2 | 1 |
| Masters | 3 | 9 |
| MD or PhD | 10 | 5 |
| Have previously engaged in patient-oriented child health research | ||
| No | 4 | 6 |
| Yes | 11 | 9 |
| Have used e-learning before | ||
| No | 1 | 0 |
| Yes | 14 | 15 |
| Comfort level using a computer, mean ± SD‡ | 4.7 ± 0.5 | 4.9 ± 0.4 |
| Comfort level using the Internet, mean ± SD‡ | 4.9 ± 0.4 | 4.7 ± 0.5 |
Note: SD = standard deviation.
* Unless specified otherwise.
† The e-learning module was refined through 2 iterative cycles of usability testing and module revisions based on participant feedback, with 15 different usability testers per cycle.
‡ Rated on a 1 (do not know how to use it) to 5 (extremely comfortable) Likert scale.
E-learning module usability
Qualitative analysis identified 4 key themes related to usability (outlined below). Illustrative quotes and example module changes are presented in Table 2 and Appendix 1F.
Table 2:
Usability testing feedback and corresponding module changes*
| Topic | Quote | Corresponding module change |
|---|---|---|
| Content | ||
| Quantity: the amount of information contained in the module or repetition of information | “It was very well-organized and logically sequenced. I liked that the different methods of engagement were linked to the spectrum of engagement, which helped deepen my understanding of it.” (P5, C1) | – |
| Relevance: the relevance of the module to its intended users | “As a patient, I feel very well represented in this module and I feel like my mom can empathize with a lot of what [one of the caregivers] was saying [about barriers to] getting involved in research studies. It’s literally, do you have the time to have a full-time job, be a caregiver, and do something extra on top of it.” (P14, C1) “I do think that unless you’re already doing co-design, and maybe if you are doing co-design, I still think it’s a good refresher or there are things that you can learn.” (P11, C1) | – |
| Understandability: readability, use of plain language, explanation of important terminology, etc. | “You guys outlined CIHR and PCORI earlier, but I am not familiar with them, so I’d have to look them up, do you have a link on them?” (P13, C1) “I’m glad you had that pop-up [definition] for lived experience — to define that was really good.” (P22, C2) | Added a new tool box with descriptions and links for more information on each national initiative to promote patient and public involvement in research. |
| Usefulness: how useful the information is or who the information would be useful for | “[Partners] need to get compensated in a way that’s meaningful to them. Everybody likes money, but sometimes for youth volunteer hours are more valuable. Co-authorship has value for some, no value whatsoever for others.” (P10, C1) “I think the examples given around compensation are more reimbursements, but the definition of compensation says it is not to be confused with covering expenses.” (P6, C1) “A question I see coming up is the difference between doing a focus group where it is a qualitative study and people are participants versus partners. I think that focus groups, described that way, blurs that line a little bit.” (P6, C1) | Updated the glossary definition of compensation to clarify the distinction between covering expenses and compensation for engagement, and added additional resources on patient partner remuneration. Modified the definition of focus groups to describe them exclusively as a method for capturing partner insights to inform the design or conduct of a research project, rather than as a form of qualitative data collection. |
| Learner experience | ||
| Preference for information access: users’ preferences regarding information access | “Where is Research 101? Does it tell you why you got [the knowledge check question] wrong?” (P13, C1) | Changed the method of feedback for the “select all that apply” knowledge check questions, from a reference to another module to an explicit explanation of the correct and incorrect responses. |
| Satisfaction: user satisfaction with the module | “I think you get a lot out of it for the relatively small [time] investment that you’re making to go through it, especially with the tool box takeaways as well.” (P9, C2) | – |
| Module length: time required to complete the module | “Something like this is a lot more palatable for somebody who feels that maybe they want to learn a little bit more, but they’re not going to commit multiple days or even half a day on something.” (P20, C2) | – |
| Engagement: how engaged users are as they proceed through the module | “I think it’s good to have [users click to advance the module] because it’s like little bite-sized pieces and then you have to actively move it forward. It’s like protection against inattention.” (P3, C1) | – |
| Attributes of learner-centred design | ||
| Ease of use: how users perceive the ease of use and functionality of the module | “Some people get a bit anxious about how much time it’s going to take [...] just reassure people that it’s short and won’t take them that long.” (P27, C2) | Clearly posted the estimated time required to complete both parts of Patient Engagement 101 on the PORCCH website. |
| Intuitive design: the ease at which users know what to do next | “I guess I kind of forgot about the tool boxes, or maybe didn’t pay close enough attention to that navigation slide, so I thought, ‘oh what’s this here for’.” (P1, C1) | Added additional audiovisual prompts to provide navigational support and remind users about the interactive features. |
| Design aesthetic | ||
| Navigation: the ability of the user to easily move around the module | “I wasn’t sure how to get out of that section. I noticed there was a Previous Slide button, but I thought to myself ‘Do I have to click it 4 times to get back?’” (P11, C1) | Added audiovisual navigational prompts to places in the module that participants found confusing or difficult to navigate. |
| Visual assets: videos, graphics and animations in the module | “I found that video really illuminating, as there were a lot of elements that the [parent partner] talked about that I see are lacking in our study, but which might influence changes to protocol or engagement practices for future studies.” (P18, C2) | – |
Note: C = usability cycle, CIHR = Canadian Institutes of Health Research, P = participant, PCORI = Patient-Centered Outcomes Research Institute, PORCCH = Patient-Oriented Research Curriculum in Child Health.
* The e-learning module was refined through 2 iterative cycles of usability testing and module revisions based on participant feedback, with 15 different usability testers per cycle.
Content
We categorized the content domain into subthemes of quantity, completeness, quality and trustworthiness, relevance, understandability and usefulness. Participants thought the module included the most important topics relevant to patient engagement and discussed them in a manner that was complex enough to appeal to clinicians and researchers yet understandable by interested youth or adults without a formal research background. Prompted by participants’ gaps in understanding and requests for additional information during testing, we supported learning by adding more glossary terms (e.g., tokenism, patient-reported outcome measures), external resources (e.g., budget tools, patient engagement plan templates and patient-oriented research vignettes) and a new slide highlighting unique considerations for patient engagement in child health research (Figure 1).
Figure 1:
A new slide and an additional resources tool box on the unique considerations for patient and family engagement in child health research were added to part 1 of the module, based on comments from participants in cycle 1.
Learner experience
Learner experience comprised subthemes of satisfaction, module length, motivation, engagement and preference for information access. Participants liked the brevity of the module and felt it would appeal to a wide audience in child health research. They also liked having to click frequently to advance the module, which helped sustain engagement. Participants varied in how quickly they wanted to proceed through the module, with some speed-reading and others frequently pausing to reflect. Speedier participants were sometimes frustrated by sections that required them to wait (e.g., latencies before interactive elements became clickable); we modified these to be immediately interactive.
Participants valued the tool boxes with additional resources summarizing patient engagement tools and publications in plain language, with links to original materials, for providing extra information on key topics without detracting from the experience of those who chose not to view the tool box resources.
Attributes of learner-centred design
Attributes of learner-centred design encompassed ease of use, intuitive design and learnability. Overall, participants found Patient Engagement 101 to be user friendly. They appreciated its division into 2 complementary parts, as well as the ability to pause and resume the module at a later date. Although most participants quickly learned how to use the navigational controls and interactive features, we identified a few sections that were not sufficiently intuitive, as well as a need for subsequent reminders about the additional resource tool boxes.
Design aesthetic
Aspects of design aesthetic included multimedia components, module features, navigation and visual assets. Most participants enjoyed listening to the module’s narration, which was recorded by a professional voice talent, as it slowed their progression through the module and provided natural opportunities between slides to reflect on the material and its applications to their own research. Participants appreciated the module’s navigational aids (e.g., colour-coded sections, animated checkmarks to show completed sections), which facilitated navigation and permitted greater focus on the content.
Visual assets in the module (e.g., diagrams, animations, videos) were appreciated for adding diversity to the learning materials. In particular, the videos of stakeholders’ experiences were well received by participants for being engaging and providing powerful insights regarding perspectives, motivations and challenges to patient-oriented research.
In cycle 1, a few participants inadvertently skipped sections by double-clicking a button that required only a single click. To prevent this, we throttled the click rate in the module so that any number of rapid successive clicks registered as a single click, and this issue did not recur. Other minor issues, such as language-related errors and audio-volume imbalances, were also identified and subsequently fixed.
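The module was built in Storyline 360, and the exact mechanism used is not described here; as a language-agnostic sketch of the throttling idea (written in R for consistency with the other examples, with illustrative names and window length), a click is accepted only if a minimum interval has passed since the last accepted click:

```r
# Minimal click-throttle sketch: once a click is accepted, further
# clicks within `window_secs` are swallowed, so a rapid double-click
# registers as a single click.
make_click_throttle <- function(window_secs = 0.5) {
  last_accepted <- -Inf
  function() {
    now <- as.numeric(Sys.time())
    if (now - last_accepted < window_secs) {
      return(FALSE)  # repeat click ignored
    }
    last_accepted <<- now
    TRUE  # click registers and the slide advances
  }
}

on_click <- make_click_throttle()
on_click()  # TRUE: first click advances the module
on_click()  # FALSE: an immediate double-click is ignored
```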
Overall, SUS scores for Patient Engagement 101 were high (cycle 1: mean 92.0 [SD 6.0]; cycle 2: mean 91.5 [SD 8.6]), corresponding to best imaginable usability.
E-learning evaluations of Patient Engagement 101 were positive (Table 3). In cycle 2, the mean overall satisfaction with the education was 4.7 (SD 0.5) out of 5.
Table 3:
E-learning module feedback
| Question* | Cycle 1,† mean ± SD, n = 15 | Cycle 2,† mean ± SD, n = 15 |
|---|---|---|
| I learned something new | 4.2 ± 0.8 | 4.1 ± 1.0 |
| The information I received was easy to understand | 4.9 ± 0.4 | 4.5 ± 0.5 |
| I received the right amount of information | 4.6 ± 0.7 | 4.4 ± 0.5 |
| My questions were answered | 4.4 ± 0.9 | 4.3 ± 0.6 |
| The goals of the session were clear | 4.7 ± 0.6 | 4.6 ± 0.5 |
| The length of time it took to finish the e-learning module was good | 4.6 ± 0.6 | 4.3 ± 0.8 |
| The e-learning module was easy to use | 4.8 ± 0.4 | 4.6 ± 0.5 |
| Overall, how satisfied were you with your education?‡ | 4.8 ± 0.4 | 4.7 ± 0.5 |
| Overall, how enjoyable was your education?‡ | 4.5 ± 0.5 | 4.5 ± 0.5 |
Note: SD = standard deviation.
* Rated on a 1 (low agreement) to 5 (high agreement) Likert-type scale.
† The e-learning module was refined through 2 iterative cycles of usability testing and module revisions based on participant feedback, with 15 different usability testers per cycle.
‡ Rated on a 1 (not at all) to 5 (very) Likert-type scale.
In each cycle, participants’ knowledge test scores and confidence to engage in patient-oriented research increased significantly after completing Patient Engagement 101 (p < 0.05, Table 4). Self-reported knowledge of patient-oriented research did not change significantly (p > 0.05).
Table 4:
Pre- and postmodule completion differences in confidence to engage in patient-oriented research, knowledge test scores and self-reported knowledge
| Outcome | Participant premodule rating, mean ± SD | Participant postmodule rating, mean ± SD | Difference, mean ± SD | p value |
|---|---|---|---|---|
| Confidence* | ||||
| Cycle 1† | 65.9 ± 24.7 | 90.3 ± 7.6 | 24.4 ± 22.3 | < 0.001 |
| Cycle 2† | 71.1 ± 18.9 | 90.5 ± 5.6 | 19.5 ± 17.2 | < 0.001 |
| Knowledge test scores‡ | ||||
| Cycle 1† | 14.5 ± 1.6 | 15.7 ± 1.3 | 1.2 ± 1.4 | < 0.01 |
| Cycle 2† | 14.7 ± 1.8 | 15.6 ± 1.3 | 0.9 ± 1.4 | < 0.05 |
| Self-reported knowledge§ | ||||
| Cycle 1† | 3.7 ± 1.1 | 3.8 ± 1.2 | 0.1 ± 0.5 | > 0.05 |
| Cycle 2† | 3.5 ± 0.8 | 3.9 ± 0.7 | 0.4 ± 0.8 | > 0.05 |
Note: SD = standard deviation.
* Possible scores range from 0 to 100, with higher scores indicating greater confidence to carry out patient engagement.
† The e-learning module was refined through 2 iterative cycles of usability testing and module revisions based on participant feedback, with 15 different usability testers per cycle.
‡ Possible scores range from 0 to 17, with higher scores indicating greater knowledge of patient engagement.
§ Rated on a 1 (do not know anything about patient-oriented child health research) to 5 (extremely knowledgeable) Likert-type scale.
Interpretation
Patient Engagement 101 was codeveloped with patients and caregivers and refined through comprehensive end-user testing, resulting in an open-access, e-learning resource with exceptional usability that substantially increased knowledge and confidence to engage in patient-oriented research in child health. In the first 12 months (June 2021 to May 2022), the PORCCH website (www.porcch.ca) had over 50 000 unique site visitors, with over 380 users enrolled in Patient Engagement 101.
Patient Engagement 101 adds to a growing landscape of effective capacity-building resources on patient and public involvement for researchers and other stakeholders. In-person and online workshops have traditionally been a popular approach to providing support.8–10,36,37 In the United Kingdom, a program of workshops to build confidence and skills, delivered at a large biomedical research centre, attracted more than 700 attendees across 72 workshops over 5 years.10 Pre- and postworkshop surveys showed marked increases in attendees’ understanding of, and confidence in, carrying out engagement activities. INNOVATE Research delivered a youth engagement capacity-building intervention by way of a 1-day workshop, run at 3 Canadian academic research institutions.8 Six months later, attendees reported greater familiarity and self-efficacy to engage youth, and more frequent engagement of youth in their teaching and as conference copresenters.8
Longer-term initiatives, such as engagement-related training embedded into graduate programs,38,39 coaching programs and communities of practice,40 and standalone accredited programs,41 are emerging. A 2020 study evaluated a 1-year studentship program for graduate health sciences students in Alberta that included funding, specialized training, and self-selected networking and mentorship opportunities.39 Awardees’ final impact narrative reports were analyzed to identify program benefits, which included the development of engagement-related skills and leadership, collaborations and partnerships, new perspectives on patient-oriented research and modifications to their research projects and career goals. In a 2018 study, these same authors also evaluated an 18-week, 56-hour blended learning program intended to integrate patient-oriented research into clinical trials, initially completed by 22 clinical trials staff.41 At program completion, 15 learners reported changes to increase patient engagement in their trials.
These findings, in concert with the improvements in knowledge and confidence after completion of Patient Engagement 101, suggest that brief training opportunities are useful to disseminate best practices and increase researchers’ confidence to carry out engagement-related activities. However, deep integration of patient engagement into a research program may require more intensive capacity-building opportunities that incorporate mentorship, networking opportunities and practical guidance.42 Although participants’ knowledge test scores increased after module completion, their self-reported knowledge did not, which suggests that the knowledge test may not have captured all relevant domains or that participants had unperceived needs related to patient-oriented research.43
The development and provision of institutional training to build capacity in patient engagement requires substantial financial and human resources,37 which can result in tensions between patient-oriented research values such as inclusivity and operational concerns in relation to cost recovery.44 It should also be noted that the training resources in the peer-reviewed literature likely represent the tip of the iceberg of relevant training materials, given that capacity development is a cornerstone of national frameworks on patient-oriented research.1,36 To efficiently build capacity in patient-oriented research, there is a growing need to coherently evaluate training materials, identify high-quality resources and determine how best to integrate them into larger curricula.7
Limitations
Some limitations to our study should be noted. The sample size, although sufficient from a usability testing perspective, was not large enough to permit investigation of the effect of participant characteristics on usability ratings or other outcomes. Participants, who were recruited through pediatric research and family advisory networks for their familiarity with patient-oriented research, may not fully represent the intended end users of Patient Engagement 101. In addition, the self-report measures used to evaluate the impact of the module on knowledge and confidence were chosen to minimize participant burden. Evaluating long-term outcomes, such as whether completion of Patient Engagement 101 is associated with greater engagement of patients and caregivers in research, in a larger and more representative sample that reflects the diversity (e.g., education, ethnicity, gender identity, sex, disability) of those involved in patient-oriented research, would be useful. Finally, PORCCH is available only in English at present; however, French translation efforts have begun.
Lessons learned from patient involvement
Patients and caregivers on the steering committee contributed important insights on key topics, such as the representativeness of patient partners, the plurality of perspectives in child health (e.g., patient, parent, caregiver, siblings, grandparents, teachers), and how to build and sustain authentic and meaningful partnerships. They also provided feedback on the content and design of module prototypes, and helped interpret the findings and review emerging themes throughout testing. Codeveloping a patient-oriented research curriculum with patients and caregivers enhanced the quality and credibility of the curriculum.37 The parent co-lead of the module (F.B.), who was a member of the family advisory network at an academic children’s hospital, began graduate work in patient engagement during the module development, has subsequently completed a PhD and has been hired as a patient and family engagement in research coordinator at the same hospital.
Conclusion
Patient Engagement 101 — part of PORCCH, an open-access, online curriculum — may be useful in a variety of educational contexts, including graduate curricula, research institute onboarding programs and professional development for researchers who are new to patient-oriented research in child health. In addition, the methods used to create and evaluate PORCCH may help guide the development and evaluation of other online resources. It is our hope that Patient Engagement 101 will help build capacity in patient-oriented research in child health among health care professionals, researchers, trainees and other stakeholders.
Acknowledgement
The authors wish to thank Pathways Training and eLearning for their e-learning support.
Footnotes
Competing interests: None declared.
Contributors: Francine Buchanan and Colin Macarthur are co–senior authors. Both of these authors contributed equally to the work. Catharine Walsh, Nicola Jones, Francine Buchanan and Colin Macarthur conceived and designed the study. Catharine Walsh, Graham McCreath and Veronik Connan acquired the data. Catharine Walsh and Graham McCreath drafted the manuscript. All of the authors analyzed and interpreted the data, revised the manuscript critically for important intellectual content, gave final approval of the version to be published and agreed to be accountable for all aspects of the work.
Data sharing: Study data are available upon request by contacting the corresponding author (Catharine Walsh).
Supplemental information: For reviewer comments and the original submission of this manuscript, please see www.cmajopen.ca/content/10/4/E872/suppl/DC1.
Funding: The study was funded by a Canadian Institutes of Health Research Strategy for Patient-Oriented Research (SPOR) — Patient-Oriented Research Collaboration Grant no. 397481 (matching funds were provided by CHILD-BRIGHT, the BC SUPPORT Unit, the Canadian Child Health Clinician Scientist Program, and the Ontario Child Health SUPPORT Unit). Catharine Walsh holds an Early Researcher Award from the Ontario Ministry of Research and Innovation. The funder had no role in the design and conduct of the study, decision to publish, or preparation, review or approval of the manuscript.
References
- 1. Strategy for patient-oriented research: patient engagement framework. Ottawa: Canadian Institutes of Health Research; [accessed 2022 Apr. 13]. Available: https://cihr-irsc.gc.ca/e/documents/spor_framework-en.pdf.
- 2. Flynn R, Walton S, Scott SD. Engaging children and families in pediatric health research: a scoping review. Res Involv Engagem. 2019;5:32. doi: 10.1186/s40900-019-0168-9.
- 3. Rouncefield-Swales A, Harris J, Carter B, et al. Children and young people’s contributions to public involvement and engagement activities in health-related research: a scoping review. PLoS One. 2021;16:e0252774. doi: 10.1371/journal.pone.0252774.
- 4. Shen S, Doyle-Thomas KAR, Beesley L, et al. How and why should we engage parents as co-researchers in health research? A scoping review of current practices. Health Expect. 2017;20:543–54. doi: 10.1111/hex.12490.
- 5. Crockett LK, Shimmin C, Wittmeier KDM, et al. Engaging patients and the public in health research: experiences, perceptions and training needs among Manitoba health researchers. Res Involv Engagem. 2019;5:28. doi: 10.1186/s40900-019-0162-2.
- 6. Frisch N, Atherton P, Doyle-Waters MM, et al. Patient-oriented research competencies in health (PORCH) for researchers, patients, healthcare providers, and decision-makers: results of a scoping review. Res Involv Engagem. 2020;6:4. doi: 10.1186/s40900-020-0180-0.
- 7. Strategy for patient-oriented research: capacity development framework. Ottawa: Canadian Institutes of Health Research; 2015 [accessed 2022 Apr. 13]. Available: https://cihr-irsc.gc.ca/e/documents/spor_capacity_development_framework-en.pdf.
- 8. Hawke LD, Darnay K, Brown M, et al. INNOVATE Research: impact of a workshop to develop researcher capacity to engage youth in research. Health Expect. 2020;23:1441–9. doi: 10.1111/hex.13123.
- 9. McKenzie A, Alpers K, Heyworth J, et al. Consumer and community involvement in health and medical research: evaluation by online survey of Australian training workshops for researchers. Res Involv Engagem. 2016;2:16. doi: 10.1186/s40900-016-0030-2.
- 10. Yu R, Hanley B, Denegri S, et al. Evaluation of a patient and public involvement training programme for researchers at a large biomedical research centre in the UK. BMJ Open. 2021;11:e047995. doi: 10.1136/bmjopen-2020-047995.
- 11. Bradbury-Jones C, Taylor J. Engaging with children as co-researchers: challenges, counter-challenges and solutions. Int J Soc Res Methodol. 2015;18:161–73.
- 12. Macarthur C, Walsh CM, Buchanan F, et al. Development of the Patient-Oriented Research Curriculum in Child Health (PORCCH). Res Involv Engagem. 2021;7:27. doi: 10.1186/s40900-021-00276-z.
- 13. Ruiz JG, Mintzer MJ, Leipzig RM. The impact of e-learning in medical education. Acad Med. 2006;81:207–12. doi: 10.1097/00001888-200603000-00002.
- 14. Schoeffel R. The concept of product usability. ISO Bull. 2003;34:6–7.
- 15. Public participation spectrum. Denver (CO): International Association for Public Participation (IAP2); 2004 [accessed 2022 Apr. 13]. Available: https://iap2canada.ca/Resources/Documents/0702-Foundations-Spectrum-MW-rev2(1).pdf.
- 16. Stinson J, Gupta A, Dupuis F, et al. Usability testing of an online self-management program for adolescents with cancer. J Pediatr Oncol Nurs. 2015;32:70–82. doi: 10.1177/1043454214543021.
- 17. Connan V, Marcon MA, Mahmud FH, et al. Online education for gluten-free diet teaching: development and usability testing of an e-learning module for children with concurrent celiac disease and type 1 diabetes. Pediatr Diabetes. 2019;20:293–303. doi: 10.1111/pedi.12815.
- 18. Snodgrass A, Coyne R. Models, metaphors and the hermeneutics of designing. Des Issues. 1992;9:56–74.
- 19. Staniszewska S, Brett J, Simera I, et al. GRIPP2 reporting checklists: tools to improve reporting of patient and public involvement in research. BMJ. 2017;358:j3453. doi: 10.1136/bmj.j3453.
- 20. Mayer RE. Applying the science of learning to medical education. Med Educ. 2010;44:543–9. doi: 10.1111/j.1365-2923.2010.03624.x.
- 21. The Canadian style: plain language. Ottawa: Public Works and Government Services Canada; 2015 [accessed 2022 Apr. 13]. Available: https://www.btb.termiumplus.gc.ca/tcdnstyl-chap?lang=eng&lettr=chapsect13&info0=13.
- 22. Lavelle E, Vuk J, Barber C. Twelve tips for getting started using mixed methods in medical education research. Med Teach. 2013;35:272–6. doi: 10.3109/0142159X.2013.759645.
- 23. Teddlie C, Yu F. Mixed methods sampling: a typology with examples. J Mixed Methods Res. 2007;1:77–100.
- 24. Bandura A. Guide for constructing self-efficacy scales. In: Pajares F, Urdan T, editors. Self-efficacy beliefs of adolescents. Greenwich (CT): Information Age Publishing; 2006. pp. 307–37.
- 25. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65:S63–7. doi: 10.1097/00001888-199009000-00045.
- 26. Bangor A, Kortum PT, Miller JT. An empirical evaluation of the System Usability Scale. Int J Hum Comput Interact. 2008;24:574–94.
- 27. Bangor A, Kortum P, Miller J. Determining what individual SUS scores mean: adding an adjective rating scale. J Usability Stud. 2009;4:114–23.
- 28. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3:77–101.
- 29. Nielsen J. Usability engineering. San Diego (CA): Academic Press; 1993.
- 30. Shackel B. Usability — context, framework, design and evaluation. In: Shackel B, Richardson SJ, editors. Human factors for informatics usability. Cambridge (UK): Cambridge University Press; 1991. pp. 21–38.
- 31. Koohang A, Du Plessis J. Architecting usability properties in the e-learning instructional design process. Int J E-learning. 2004;3:38–44.
- 32. ISO/IEC 9126: Software product evaluation — quality characteristics and guidelines for their use. Geneva (CH): International Organization for Standardization/International Electrotechnical Commission (ISO/IEC); 1991.
- 33. Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15:1277–88. doi: 10.1177/1049732305276687.
- 34. Kushniruk AW, Patel VL. Cognitive and usability engineering methods for the evaluation of clinical information systems. J Biomed Inform. 2004;37:56–76. doi: 10.1016/j.jbi.2004.01.003.
- 35. Saunders B, Sim J, Kingstone T, et al. Saturation in qualitative research: exploring its conceptualization and operationalization. Qual Quant. 2018;52:1893–907. doi: 10.1007/s11135-017-0574-8.
- 36. Developing INVOLVE training and support for public involvement in research. London (UK): National Institute for Health Research; 2012 [accessed 2022 Apr. 13]. Available: https://www.invo.org.uk/wp-content/uploads/2015/06/8774-INVOLVE-Training-Support-WEB2.pdf.
- 37. Bell T, Vat LE, McGavin C, et al. Co-building a patient-oriented research curriculum in Canada. Res Involv Engagem. 2019;5:7. doi: 10.1186/s40900-019-0141-7.
- 38. Foley L, Kiely B, Croke A, et al. A protocol for the evaluation of the process and impact of embedding formal and experiential public and patient involvement training in a structured PhD programme. J Comorb. 2021;11:26335565211024793. doi: 10.1177/26335565211024793.
- 39. Rosario MK, Hebert MA, Sahota BK, et al. Capacity development in patient-oriented research: programme evaluation and impact analysis. Health Res Policy Syst. 2020;18:89. doi: 10.1186/s12961-020-00606-9.
- 40. de Wit M, Beurskens A, Piškur B, et al. Preparing researchers for patient and public involvement in scientific research: development of a hands-on learning approach through action research. Health Expect. 2018;21:752–63. doi: 10.1111/hex.12671.
- 41. Rosario MK, Hebert M, Hill MD, et al. Developing a skilled clinical trials workforce in patient-oriented research: impact of an innovative training approach. Educ Res Appl. 2018;3:ERCA-155. doi: 10.29011/2575-7032/100055.
- 42. Hawke LD, Darnay K, Relihan J, et al. Enhancing researcher capacity to engage youth in research: researchers’ engagement experiences, barriers and capacity development priorities. Health Expect. 2020;23:584–92. doi: 10.1111/hex.13032.
- 43. Moore DE, Chappell K, Sherman L, et al. A conceptual framework for planning and assessing learning in continuing education activities designed for clinicians in one profession and/or clinical teams. Med Teach. 2018;40:904–13. doi: 10.1080/0142159X.2018.1483578.
- 44. Minogue V, Donskoy AL. Developing a training package: lessons in partnership-working between health professionals, service users and carers. Int J Health Care Qual Assur. 2017;30:458–66. doi: 10.1108/IJHCQA-06-2016-0084.