Abstract
Autistic transition-age youth experience high rates of unemployment and underemployment, in part due to the social challenges they may face when having conversations in the workplace. To help enhance conversational abilities in the workplace, our collaborative team partnered to develop WorkChat: A Virtual Workday. Specifically, our team of scientists, community partners, and diversity and inclusion experts participated in a community-engaged process to develop WorkChat using iterative feedback from autistic transition-age youth and their teachers. With initial development complete, this study reports on the protocol that our collaborative team developed, reviewed, and approved to conduct a randomized controlled trial (RCT) to evaluate the real-world effectiveness and initial implementation process outcomes of WorkChat when integrated into post-secondary pre-employment transition services (Pre-ETS). Our aims are to: 1) evaluate whether services-as-usual in combination with WorkChat, compared to services-as-usual with an attention control, enhances social cognition and work-based social ability (between pre- and post-test), reduces anxiety about work-based social encounters (between pre- and post-test), and increases sustained employment by 9-month follow-up; 2) evaluate whether social cognitive ability and work-based social ability mediate the effect of WorkChat on sustained employment; and 3) conduct a multilevel, mixed-method process evaluation of WorkChat implementation.
Keywords: Autism, Employment services, Social ability, Intervention, Implementation science
1. Introduction
Nearly 50,000 autistic youth transition from high school to adult life each year [1,2], with only 25% of these transition-age youth obtaining jobs within two years of graduation [3]. Sustaining employment is even more challenging when their communication skills do not meet the expectations of an allistic (i.e., non-autistic) world [[4], [5], [6]]. For autistic people, additional challenges include the limited availability of and access to evidence-based practices to facilitate conversational skills, and the lack of training provided by employers to contextualize how coworkers can communicate more effectively with an autistic person [[7], [8], [9]]. Thus, autistic transition-age youth can be at a disadvantage when communicating with customers, coworkers, and supervisors in a work setting, which research has identified as a critical barrier to successfully sustaining employment [10].
In turn, unemployment and job loss have a damaging ripple effect on the mental health of autistic transition-age youth [11,12]. Notably, a critical gap in federally mandated services [13] to support autistic youth as they transition from school to adult life is the lack of evidence-based practices to enhance their work-based social ability (e.g., conversational skills with customers, coworkers, and supervisors). Although some interventions are emerging with preliminary effectiveness at enhancing work-based social ability [14,15], whether these tools translate to sustained employment is still unknown. Given that autistic transition-age youth report computerized training tools (developed by both our team [[16], [17], [18], [19], [20]] and others [[21], [22], [23]]) are highly acceptable and improve their real-world outcomes, we propose to address this critical barrier to sustained employment by developing and evaluating a novel and scalable computerized simulation focused on enhancing core work-based social skills via practicing conversations with customers, coworkers, and supervisors.
Specifically, our team used evidence-based, community-engaged methods [16] to design “WorkChat: A Virtual Workday”. Our intervention development team (MJS, KS, CS, EW, SE) partnered with SIMmersion LLC (DO, JE, LH, CS) to develop a prototype design of WorkChat that was then iteratively reviewed by a community advisory board (CB, JB, DK, SM, DT, ST), diversity advisory board (SD, SKK, CL, ER, TL, SM, TW), and scientific advisory board (MB, TD, KH, JS) for feedback and recommendations. Notably, our use of a diversity advisory board is in response to the historical underrepresentation of racial, ethnic, sexual and gender minorities in autism research and intervention development [24,25]. Thus, this board focused on reviewing the diversity and inclusiveness of the WorkChat components using their expert lens [26], also discussing diversity related to size, age, presentation, and disability status in addition to race and gender. The intervention development team and SIMmersion prioritized and implemented the advisory boards' initial design recommendations and then recruited 18 autistic transition-age youth and 12 pre-employment transition teachers to review the prototype. These youth and teachers then completed a brief survey and interview to share their recommendations for enhancing WorkChat's design, with an emphasis on dissemination. The intervention development team and SIMmersion then reviewed and prioritized these recommendations, with SIMmersion implementing the recommended design enhancements. Once the beta version of WorkChat was completed, each advisory board reviewed this version and provided final recommendations that were prioritized and implemented. A detailed review of the WorkChat design is presented in the Methods section.
The proposed study will take several important steps to test whether WorkChat is effective at improving social and employment outcomes for autistic transition-age youth engaged in post-secondary pre-employment transition services (Pre-ETS). The first step will evaluate how enhancing Pre-ETS with WorkChat affects individual-level outcomes (e.g., social cognitive ability, work-based social ability, sustained employment) and system-level outcomes (e.g., labor costs to prepare for implementation, return on investment). The second step will evaluate the initial implementation process outcomes focused on acceptability, usability, appropriateness, feasibility, and fidelity of delivery. The final step will explore social anxiety, social cognitive ability, and work-based social ability as potential mechanisms for sustained employment between completing Pre-ETS and a nine-month follow-up.
2. Methods
2.1. Study design
This two-arm, parallel, intent-to-treat randomized controlled trial (RCT) includes autistic participants aged 18–26 years who are enrolled at the Michigan Career and Technical Institute (MCTI) and randomized to receive either post-secondary pre-employment transition services along with an attention control (Pre-ETS + AC) or Pre-ETS + WorkChat. The Institutional Review Board at the University of Michigan reviewed and approved all study procedures and materials. This study will use a Hybrid Type I (HTI) effectiveness-implementation design to evaluate the effectiveness of WorkChat while collecting data on the initial implementation processes involved in the delivery of WorkChat [27]. The trial is registered at ClinicalTrials.gov (NCT05565482): https://clinicaltrials.gov/ct2/show/NCT05565482.
The scientific team from the University of Michigan collaborated with SIMmersion on designing the initial protocol using the first author's prior HTI trials in mental health agencies and prisons [28,29]. After funding was obtained, the protocol was reviewed and refined using feedback from a series of community, diversity, and scientific advisory boards, which included autistic stakeholders (see 3.3). Study specific aims are:
Aim 1. Evaluate whether MCTI post-secondary pre-employment transition services-as-usual in combination with WorkChat (Pre-ETS + WorkChat), compared to MCTI post-secondary pre-employment transition services-as-usual in combination with an attention control (Pre-ETS + AC), enhances sustained employment for autistic transition-age youth.
Aim 1 Hypotheses. At the individual level, we hypothesize (H) that Pre-ETS + WorkChat trainees, compared to Pre-ETS + AC trainees, will have improved (H1) social cognitive ability and (H2) work-based social ability; as well as (H3) reduced anxiety about social encounters at work, and (H4) greater sustained employment by 9-month follow-up. Our system-level analyses will be descriptive so there are no hypotheses.
Aim 2. Evaluate mechanisms of sustained employment.
Aim 2 Hypotheses. We hypothesize that social cognitive ability (H5a) and work-based social ability (H5b) will mediate the effect of treatment (Pre-ETS + WorkChat vs. Pre-ETS + AC) on sustained employment.
Aim 3. Conduct a multilevel, mixed-method process evaluation of WorkChat's acceptability, usability, appropriateness, feasibility, fidelity, and initial determinants of implementation (e.g., barriers and facilitators). We will use surveys and semi-structured interviews (among autistic youth, career readiness counselors, and administrators) to identify facilitators and barriers to WorkChat implementation in post-secondary Pre-ETS and describe the process of implementation in this context.
2.2. Eligibility criteria
Participants will be recruited from all available autistic transition-age youth enrolled in the residential Pre-ETS program at MCTI. Inclusion criteria are: 1) age 18–26 years; 2) a positive screen for autism via the MCTI educational record or a parent report on the Social Responsiveness Scale, 2nd Edition [30,31] (a score of 65T or higher will be used to reduce false positives); and 3) at least a 4th grade reading level as measured by the Wide Range Achievement Test, 5th Edition [32]. Participants will be excluded for a medical illness that may significantly compromise cognition (e.g., moderate or greater traumatic brain injury) or for uncorrected vision or hearing problems that prevent using the software.
2.3. Recruitment and screening
A targeted sample of n = 338 autistic transition-age youth will be enrolled from MCTI and randomized during an 18-month recruitment window. Recruitment and screening will involve four steps. In step 1, MCTI staff will identify all incoming students with an educational record of autism and mail them study recruitment materials as part of their MCTI onboarding packet. In step 2, MCTI staff will organize a recruitment meeting of all eligible participants where research team members will introduce the study, discuss the value of being research participants, and answer questions about study participation. In step 3, potential participants who wish to volunteer will approach the research team after the recruitment meeting. In step 4, the research team will review the consent form with potential participants and then obtain their informed consent. The PI will discuss the recruitment procedures and enrollment milestones at weekly meetings with MCTI staff during the recruitment window. The PI will also discuss unforeseen barriers to recruitment and collaborate with MCTI staff on developing and implementing strategies to overcome these barriers and enhance recruitment and enrollment.
Regarding the potential sampling frame, prior to the COVID-19 pandemic MCTI enrolled approximately n = 1000 students every 12 months across all of its programming, and approximately 25% of these students had autism spectrum disorder. Thus, we anticipate nearly n = 375 autistic youth will be enrolled at MCTI over the 18-month study enrollment window, and we will attempt to enroll all 375 in the study. Assuming 10% of these students will decline participation, we would randomize n = 338 autistic youth over the course of six MCTI semesters (approximately n = 56 per semester). Assuming a 10% attrition rate, we will complete follow-up data collection on n = 304 students. Moreover, approximately 70% of autistic MCTI students from 2021 to 2022 obtained employment within 3 months of graduation, and 87% of autistic MCTI graduates obtained employment within 12 months. Given the nine-month follow-up time frame, we powered our sample using the 70% employment rate. Thus, we anticipate n = 212 participants will obtain employment (n = 106 per group), which we used to power our analyses.
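This sampling-frame arithmetic can be summarized in a short calculation. The sketch below simply restates the planning assumptions above (enrollment volume, autism prevalence at MCTI, and the expected decline, attrition, and employment rates); it is illustrative and not part of the analysis plan.

```python
# Sketch of the sampling-frame arithmetic described above; all rates are
# protocol planning assumptions rather than observed study data.
annual_enrollment = 1000   # approximate MCTI enrollment per 12 months
autism_rate = 0.25         # proportion of MCTI students with autism spectrum disorder
window_years = 1.5         # 18-month study enrollment window
decline_rate = 0.10        # expected refusal rate
attrition_rate = 0.10      # expected loss to follow-up
employment_rate = 0.70     # expected employment by 9-month follow-up

eligible = annual_enrollment * autism_rate * window_years   # ~375 autistic youth
randomized = eligible * (1 - decline_rate)                  # ~338 randomized (~56 per semester)
completers = randomized * (1 - attrition_rate)              # ~304 with follow-up data
employed = completers * employment_rate                     # ~212 employed (~106 per arm)

print(eligible, randomized, completers, employed)           # 375.0 337.5 303.75 212.625
```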
2.4. Randomization and masking
The project manager, supervised by the study statistician, will conduct the randomization using a Web-based system with a 1:1 randomization ratio. Participants, MCTI administrators, and MCTI staff (known as career readiness counselors) will be notified of group assignments, but research assessors (i.e., University of Michigan research assistants and MCTI trade instructors) and research team members conducting statistical analyses will be masked to treatment assignment.
2.5. Study interventions
2.5.1. Michigan Career and Technical Institute
MCTI is a Michigan-based, residential, post-secondary technical and vocational training center for young adults with disabilities. MCTI offers a comprehensive suite of services that include medical and counseling, occupational, interpreters for Deaf students, classroom and job accommodations, and job placement services. These services support students engaged in a 10-week technical training semester encompassing trades: Automotive Technology; Cabinetmaking/Millwork; Certified Nursing Assistant; Commercial Printing Program; Construction Program; Culinary Arts Program; Custodial Program; Grounds Maintenance/Landscaping Program; Industrial Electronics Program; Information Technology Program; Machine Technology Program; Pharmacy Technician Program; and Retail Marketing Program. Each trade program combines classroom-based pre-employment, technical, and independent living skills training with a hands-on learning-by-doing approach. This method is facilitated by Trade Instructors during simulated work settings. Currently, MCTI provides minimal instruction to facilitate social cognitive ability or work-based social ability. See supplementary material for a list of assessments and training areas for each MCTI trade.
2.5.2. Attention control
In addition to MCTI services, the Pre-ETS + AC group will receive 18 h of free online educational programming over six weeks featuring: 1) keyboarding (Sense-Lang.org); 2) interactive quizzes, puzzles, and trivia using www.sporcle.com; and 3) current events (www.n2y.com/software). These programs have been previously used as attention control interventions [33,34].
2.5.3. WorkChat: A virtual workday
WorkChat is a multi-tiered learning system that focuses on learning core social skills for work (Part 1), practicing the application of these core social skills in single, stand-alone conversations (Part 2), and practicing the application of these core social skills across connected conversations while performing job duties in a virtual workday (Part 3). A central figure throughout the WorkChat experience is Sage, the job coach (Fig. 1). Sage serves as the social story-teller who provides guidance and gives feedback through each part of the WorkChat experience. The WorkChat development team consisted of SIMmersion, LLC (www.simmersion.com), the University of Michigan (MJS, KS, EW, BR, AS, MH), Michigan State University (CS, SS), and the University of Pittsburgh (SE), who collaborated with a community advisory board (CB, JB, DK, SM, DT, ST), diversity advisory board (SD, SKK, CL, TL, SM, TW, ER), and scientific advisory board (MB, TD, KH, JS) to iteratively design and develop WorkChat, with additional feedback and recommendations provided by 18 autistic transition-age youth and 12 teachers.
Fig. 1. Sage, the social story job coach.
2.5.3.1. Part 1: core social skills for work
To learn about the core social skills for work, participants will watch a series of brief videos (ranging from 1 to 8 min each) where Sage teaches them the skills and challenges their knowledge with interactive quizzes. There are seven core skills for social encounters at work: 1) listening to others, 2) understanding others, 3) understanding yourself, 4) responding professionally, 5) understanding complicated language, 6) receiving feedback, and 7) dealing with difficult situations. Following the review of videos associated with each skill, participants will complete a brief, interactive, four-item multiple choice quiz to evaluate their retention of learned content. The Part 1: Core Social Skills for Work interface includes a ‘track your progress’ button where participants can click and see whether they have watched all the videos associated with each skill and completed their quizzes.
The content for the “Core Social Skills for Work” was adapted from Cognitive Enhancement Therapy (CET) [35,36] and Assistive Soft Skills and Employment Training (ASSET) [14,15]. CET was recently validated for autistic adults by SE from our team [35,36] and is an 18-month comprehensive cognitive rehabilitation intervention that integrates 60 h of computer-based training in attention, memory, and problem-solving with 45 social-cognitive group sessions. While the full CET program includes content beyond the scope of WorkChat, the content will draw from the demonstrably effective social-cognitive components of CET to improve knowledge and skill in emotion recognition (e.g., recognition of non-verbal cues and emotions in faces), theory-of-mind (e.g., knowing others’ thoughts and beliefs), and emotion regulation (e.g., emotional temperature-taking). ASSET was adapted by CS from our team from the United States Department of Labor Office of Disability Employment Policy curriculum “Skills to Pay the Bills: Mastering Soft Skills for Workplace Success” [37]. It includes twelve 90-min in-person group sessions to remediate the application of emotion recognition, theory-of-mind, and emotion regulation via didactic content review and practice of six workplace social skills: communication, enthusiasm and attitude, teamwork, networking, problem-solving and critical thinking, and professionalism [14,15]. The identification of the core skills and their content was reviewed by the three advisory boards (community, scientific, diversity) for appropriateness and relevance for both the autistic community and for inclusion in transition services, as well as for inclusiveness with respect to diversity and equity among historically marginalized groups (e.g., by race, gender).
2.5.3.2. Part 2: practice conversations
The autistic youth will practice independent workplace conversations with a fictional customer, coworker, or supervisor and will receive a numerical score (0–100) after each conversation that represents how well they applied their core skills (from Part 1). Prior to beginning these practice conversations, participants will review a series of nonverbal cues that Sage will display during the conversation to provide real-time feedback on how well the conversation is going. After learning the cues, participants will re-review a subset of their core skills with Sage via brief videos that emphasize which specific skills will be scored (and given feedback on) for each of the conversation levels (easy, medium, and ‘hard’; to avoid bias, we refer to the ‘hard’ skills as ‘virtual workday skills’).
Specifically, the brief videos will cover the skills “focusing on others,” “responding to others,” and “handling challenges” that participants are expected to perform during each conversation. To reduce cognitive load, feedback after completing the ‘easy’ conversations is only given for “focusing on others.” In turn, the feedback is scaffolded so that conversations at ‘medium’ difficulty elicit feedback on “focusing on others” and “responding to others.” Trainees are then informed that feedback on all three skills (“focusing on others,” “responding to others,” and “handling challenges”) is only provided during Part 3: Virtual Workday. Once participants review how the conversations will be scored, they will begin practicing their ‘easy’ and ‘medium’ conversations with the customers, coworker, and supervisor.
The practice conversations will be driven by the PeopleSIM conversation engine technology (see Fig. 2 for an example interface) and provide opportunities to apply the core social skills knowledge that participants will learn in Part 1. Each conversation will enable participants to apply the content that they learned from Part 1 by selecting appropriate statements and responses while talking with the customer, coworker, or supervisor. As noted above, the Part 2 conversations are only with characters at the ‘easy’ or ‘medium’ level of difficulty. Meanwhile, characters with the ‘hard’ level of difficulty will only be present during the virtual workday. See Table 1 for a matrix of characters by personality/mood and difficulty level.
Fig. 2. PeopleSIM interface and WorkChat characters.
Panel A displays the supervisor, and to the right, the conversation interface. Panel B displays the coworker (purple shirt) and the customers.
Table 1.
Character matrix.
| Conversation | Easy Level of Play | Medium Level of Play | Hard Level of Play |
|---|---|---|---|
| Customer Conversation | Calm | Irritated | Angry or Chatty |
| Coworker Conversation | Pleasant | Entitled | Manipulative or Poor Boundaries |
| Supervisor Conversation | Easygoing | Strict | Busy or Standoffish |
There will be a total of six customer scenarios in Part 2. These conversations involve a customer who: 1) can't find the toilet paper and needs help, 2) needs help returning a watch, 3) needs a refund for a shirt, 4) needs assistance carrying a television out of the store, 5) reports a spill that needs to be cleaned up, and 6) may be allergic to cleaning chemicals in a nearby bucket. In addition, there is only one customer actor (an Asian woman) to talk with in Part 2, while two additional customer actors (a Black man and a White woman) appear during the virtual workday in Part 3. Customer conversations will last 3–5 min and will include enough variation and randomness for participants to play 10–12 times at each level (easy, medium, hard) with each of the six scenarios before the conversations feel repetitive. Notably, our implementation plan (see 2.5.4.2 below) proposes that participants will only complete three conversations at each level of difficulty (with hard conversations only occurring during Part 3: Virtual Workday).
Although the coworker (a White woman) and the supervisor (a Black man) are each portrayed by a single actor, the characters in Part 2 (and Part 3) are driven by an emotional model, their personality, and the context within the virtual workday (e.g., a break with a coworker, a feedback session with the supervisor). In addition, participants may encounter three different moods for each character (neutral, frustrated, and engaged) based on their decisions during the virtual workday.
The coworker conversation opens with the participant saying hello and being expected to self-select from several conversational topics (e.g., movies, gossiping about the supervisor or customers). Coworker conversations can last anywhere from 5 to 10 min and will automatically end before 15 min because the work break only lasts 15 min. Meanwhile, the supervisor conversation during Part 2 randomly selects various issues (e.g., not completing tasks, being rude to customers) that may come up during the workday, and the supervisor will give the participant feedback on those issues. This conversation will last approximately 15 min.
The virtual characters will be portrayed in HD video by professional actors to provide a very high degree of social presence [38]. The actors vary in terms of their age, gender, physical appearance, and race to provide participants a diverse range of conversational partners. Moreover, each virtual character will feature several personalities that vary between training sessions and a dynamic emotional model driven by the conversational selections made by the study participants. This approach allows participants the opportunity to apply their new core social skill knowledge in a broad range of situations.
Both during and after the conversations with the customer, coworker, and supervisor characters, participants can review transcripts of the interaction by replaying the entire conversation or individual exchanges with the characters, including a replay of the participant's voice as captured by speech recognition. Hearing, as opposed to simply reading, the transcript allows trainees to reflect on tone and other variations in voice they may have missed or misjudged. Trainees can also click on interactive sections in the written transcript to receive specific feedback on how their responses impacted their conversation and overall score, and how the characters perceived their responses during the conversation, including how their responses shaped the way the characters subsequently responded to them. For example, when the coworker shares that she likes to watch classic movies, a participant may respond with “Old movies? Seriously? I thought you were cooler than that.” Then their feedback might read “Using judgmental and impolite language to comment on the coworker's hobbies may cause her to feel uncomfortable and will likely make her hesitate to share personal details with you in the future. Next time, try to respect her hobbies by asking her more about them.” Lastly, the feedback in the transcript is color-coded: green segments of text reflect appropriate or useful responses, red text indicates inappropriate or unconstructive responses, and black text denotes neutral responses.
After completing simulated conversations (with customers, coworkers, or supervisors) on the ‘easy’ difficulty level, trainees receive a numerical score and summary feedback on how well they performed their core skills “focusing on others” and “responding to others.” The numerical scores are based on an algorithm that tracks trainees' performance throughout the conversations. As the level of play increases, the scoring becomes more detailed. The qualitative performance feedback provided as a part of the summary that accompanies the score assists trainees in decoding the subtleties of the work-based conversations.
Notably, each core skill consists of subskills. First, “focusing on others” includes “active listening” and “being professional and polite.” Second, “responding to others” includes “being supportive” and “focusing on solutions.” Third, “handling challenges” includes “understanding complicated language” and “handling challenging personalities.” Specific to the supervisor conversation, “handling challenges” also includes “receiving feedback.”
2.5.3.3. Part 3: virtual workday
The Virtual Workday is an innovative, gamified design where trainees first watch a video in which Sage contextualizes the goals of the virtual workday (i.e., complete your tasks [e.g., helping at the café, welcoming customers], help customers, take your break to chat with a coworker, and attend a supervisor feedback session). Second, participants select the level of their virtual workday experience (Level 1: complete two tasks; Level 2: complete two tasks and talk with both a customer and coworker; Level 3: complete three tasks and talk with two customers, a coworker, and a supervisor; Level 4: complete all five tasks and talk with three customers, a coworker, and the supervisor). Third, participants begin their workday as they are transported to a first-person view set inside a fictional big box store called “Wondersmart” (Fig. 3). Here the participants will engage in a gamified experience where they navigate through the virtual store to complete their tasks, address several unexpected ‘what would you do’ events (e.g., you find a wallet on the ground; what do you do next?), support customers, chat with a coworker during break, and receive feedback from the supervisor on how well they performed during the virtual workday.
Fig. 3. Virtual workday at Wondersmart.
During the virtual workday, the conversations with customers, coworkers, and supervisors will use the same conversation interface that is present in Part 2: Practice Conversations. The primary difference is that although participants can access their transcript during the virtual workday conversations, they cannot go back to them once the conversation transitions back to the virtual workday. However, participants can access their transcripts at the end of the workday after completing their supervisor conversation. At this point, all feedback will be available, including a single score based on one's performance across all tasks and conversations.
A major innovation in the PeopleSIM technology that is embedded in the virtual workday is that the customer and coworker conversations are connected with the supervisor conversation. Thus, the supervisor will be aware of whether trainees completed all their tasks and of how well the trainee engaged in the customer and coworker conversations. As a result, the supervisor can give the trainee praise for being successful in how they talked with the customer. Alternatively, the supervisor could offer the trainee additional training if the trainee makes a number of mistakes while talking with the customers. Moreover, the supervisor will be aware if a trainee has an unprofessional (e.g., bad-mouthing the supervisor) or inappropriate (e.g., commenting on her appearance) conversation with the coworker. In these cases, the supervisor will give the trainee feedback and offer additional training.
The WorkChat design embeds behavioral learning principles such as the facilitation of repeated practice to provide an infrastructure for participants to improve their conversational abilities [39,40]. WorkChat also follows the principles for designing effective simulations [41], which supports the facilitation of sustained behavioral change [42,43]. The experience of handling upset customers or getting feedback from a supervisor can be distressing for any adult, and WorkChat simulates these experiences (and others) to provide opportunities to engage in these challenging conversations in a judgment-free environment.
2.5.3.4. Nonverbal coach
Study participants will receive real-time feedback during the conversations with customers, coworkers, and the supervisor from their job coach Sage, who displays nonverbal cues reflecting the strength and appropriateness of the participant's responses (Fig. 1). The use of this nonverbal coach was identified by autistic transition-age youth as one of the most important components of the PeopleSIM interface during our prior development of Virtual Interview Training for Transition Age Youth [16]. Along with the real-time feedback, a one-to-two-sentence coaching text is provided that helps clarify the non-verbal feedback. For example, if the coach claps for the trainee, the coaching text might say something like, “What you said is helpful and polite.” If the coach gives the trainee a thumbs down, the coaching text might say, “What you said may come across as unhelpful and rude, which will likely upset the customer.” If the trainee would like these coaching texts read to them, there is a speaker button the trainee can click which will provide an audio version of the coaching text.
2.5.4. WorkChat implementation at MCTI
2.5.4.1. WorkChat implementation training and fidelity assessment
The career readiness counselors will attend a 1-h WorkChat orientation led by the PI on how to use the tool. Next, the career readiness counselors will spend 60–90 min using WorkChat themselves to practice navigating Part 1: Core Social Skills for Work; Part 2: Practice Conversations; and Part 3: Virtual Workday. Specifically, this practice will require the career readiness counselors to click through at least one core skill in Part 1, complete at least one conversation with each character available in Part 2, and complete at least one Level 4 Virtual Workday. This approach will enable the counselors to become more intimately familiar with how participants will experience WorkChat.
The PI designed a self-monitoring fidelity checklist (Appendix A) to promote a high level of fidelity when teaching each step of WorkChat. The career readiness counselors will use the fidelity checklist during the final stage of training, where they role play teaching how to use WorkChat. The role plays will be supervised by the local MCTI administrator and/or the research team members, who will then provide feedback on performance. The career readiness counselors will use the fidelity checklists to facilitate teaching study participants how to navigate the WorkChat interfaces and Virtual Workday. They will complete a fidelity checklist to ensure that each area of WorkChat is being covered. The research team will collect and review the fidelity checklist for every cohort of participants. Orientation sessions where career readiness counselors teach the tool will also be audio recorded and randomly selected for independent review by research staff. If these reviews find that a career readiness counselor does not obtain at least 90% fidelity on the checklist during a session, then that person will be required to undergo refresher training on the implementation process. Refresher training will consist of research staff and the career readiness counselor meeting in person to review the script for delivery and participating in a peer-to-peer orientation until 100% fidelity is attained. Once this refresher is completed, the career readiness counselor will be approved to lead future orientation sessions.
2.5.4.2. WorkChat implementation plan
The research team developed a set of guidelines for implementing WorkChat with study participants and submitted it to the WorkChat development team and advisory boards. These groups recommended adjustments to the plan, which the research team reviewed and incorporated. The final recommended implementation plan can be found in Appendix B.
Participants will primarily use an existing computer lab on the MCTI campus where the devices all run the latest version of Google Chrome, as WorkChat was optimized for delivery in Google Chrome (although Firefox and Edge are also capable browsers). In addition, the WorkChat implementation plan was intentionally designed to allow participants to engage with WorkChat independently on their MCTI devices, as each student receives a Chromebook to use for completing their course work. Prior to WorkChat implementation, the career readiness counselors will introduce study participants to the tool while completing a fidelity checklist that verifies that they have instructed the trainees on how to use the intervention. This checklist will serve as written assurance that WorkChat was delivered with high fidelity. The research team (either the PI or the project manager) will review the fidelity checklists to verify that delivery procedures are regularly observed and provide bimonthly supervision to monitor ongoing fidelity.
Based on the results from multiple pragmatic effectiveness and concurrent implementation evaluations of virtual interview training interventions that use SIMmersion's PeopleSIM simulation technology in school-based employment readiness programs [[17], [18], [19],44,45], we designed the implementation plan to offer a structured and scaffolded, yet flexible, approach for participants to work their way through each part of WorkChat (see Appendix B). Specifically, Week 1 (three sessions) will be dedicated to introducing Part 1: Core Social Skills for Work, where participants will learn about the skills needed to have successful conversations in the workplace. Then, in Weeks 2–3 (three sessions per week), participants will begin Part 2: Practice Conversations to practice applying their new core skills during a series of independent conversations with customers, the coworker, and the supervisor, which will be scaffolded from easy to medium difficulty for each character type. During Weeks 4–5 (three sessions per week), participants will begin Part 3: Virtual Workday, where they will enter the Wondersmart store and develop their skills at completing a series of tasks (e.g., stocking shelves, helping in the café). These tasks are interrupted by customer conversation requests and ‘what would you do’ decision-making events (e.g., finding a wallet on the floor, seeing a mouse), followed by a coworker conversation during a break, and the workday ends with feedback from the supervisor on how the whole series of events played out. The virtual workday will also be scaffolded to include more conversations and tasks as the levels increase.
In addition to the existing structure and scaffolding of WorkChat, the implementation plan is designed so that trainees are encouraged to first try their best to earn rewarding feedback from Sage (the coach) and improve their performance (reflected by the numerical scores). The implementation plan then provides protected time for participants to have fun and intentionally say the wrong things during their conversations with customers, coworkers, and supervisors (during Part 2 and Part 3). For example, participants will have the option to ‘tell off’ a rude customer and then learn how the customer responds to this choice during Part 2. However, when they ‘tell off’ the customer during Part 3, they will experience the customer's reaction and receive feedback from their supervisor, who ‘observed’ their choice to ‘tell off’ a rude customer.
Although our collaborative team designed WorkChat to be self-guided, we anticipate that the career readiness counselor could spend 10–15 min reviewing the transcripts, scores, or feedback from the conversations with participants. The research team will assess participant adherence to WorkChat by monitoring: 1) progress on watching all the core skill videos, 2) practicing conversations at ‘easy’ and ‘medium’ with each character, and 3) progress through the four levels of the virtual workday. The research team will review and discuss WorkChat adherence (and challenges to adherence) during weekly team meetings. Lastly, given that the participants live, share meals, and take classes together, the study has a potential threat to internal validity, as the Pre-ETS + WorkChat group may share conversation strategies learned from WorkChat with the Pre-ETS + AC group. That said, the core training of WorkChat is the opportunity to repeatedly practice conversations at work, which is unavailable to the Pre-ETS + AC group.
2.6. Follow-up and retention strategies
The PI or study team will host a recruitment orientation session with each cohort to introduce them to the study and study team members. They will also discuss why conducting randomized controlled trials is essential to helping the field understand whether or not WorkChat is associated with improvements in sustaining employment. The project manager will communicate with teaching staff at MCTI regarding the enrollment targets and will maintain a database of the participants and their completion milestones throughout the study. Research staff and MCTI career readiness counselors will encourage participants to complete their research assessments while residing on the MCTI campus and will help facilitate structured times to complete study visits.
During enrollment, the consenting process will facilitate research team access to personal contact information such as phone number and email address for the participants and both of their parents (or guardians). In addition, during the 9-month follow-up period, the project manager will maintain contact with study participants through reminder calls and mailed cards sent on birthdays and holidays (beyond the winter holidays). These methods will optimize retention. In addition, MCTI staff will assist with the facilitation of follow-up data collection as that role is embedded within standard MCTI protocol for a period of 12 months after graduation.
The research team will attempt to contact study participants for a period of four weeks beginning on the follow-up visit due date. After 4 weeks, the participant will be considered lost to follow-up, and the research team will obtain follow-up employment outcomes via MCTI administrative records, which was authorized as part of the informed consent. To help facilitate retention, the research team will send intermittent (every 4–6 weeks) check-in correspondence to participants via mail, email, or text message between their post-test and 9-month follow-up data collection.
2.7. Study measures
We present the study measures as originally proposed in the grant application. Notably, some of these measures were changed based on feedback from advisory boards or the community partner (see 3.3).
2.7.1. Background measures
All participants will complete a survey regarding their demographic characteristics (e.g., age, race, ethnicity) and employment history. In addition, we will assess fluid and crystallized intelligence, cognitive flexibility, language, processing speed, attention, executive function, and working memory via participant completion of the cognitive core of the NIH Toolbox [46]. In addition, participants will complete the 8-item version of the Patient Health Questionnaire as an assessment of their current depressive symptoms [47]. Parents will complete the 118-item Adult Behavior Checklist to assess participants’ externalizing behavioral challenges [48].
2.7.2. Study outcomes
Hypothesis 1 outcomes are: 1) emotion recognition; 2) theory-of-mind; and 3) emotion regulation. To assess emotion recognition, we will use the Penn Emotion Recognition Task [49], which includes 96 items evaluating one's perception of displayed emotions on faces. To assess theory-of-mind, we will use the Hinting Task [50], which includes 10 vignettes of two characters interacting and evaluates one's ability to infer the intentions of one of the characters. Both measures have been validated for autistic transition-age youth [51]. To assess emotion regulation, MCTI trade instructors (who spend approximately 30 h per week with students) will complete the brief version of the Emotion Dysregulation Inventory (EDI). The EDI was developed specifically for autistic youth and adults and includes 7 items assessing reactivity and 6 items assessing dysphoria [52,53]. Based on our prior research [36], we will evaluate the assessments individually as well as create a composite score across measures.
Our hypothesis 2 outcome is work-based social ability. To assess this outcome, MCTI trade instructors will complete the Work Behavior Inventory (WBI). The WBI rates social skills (7 items), cooperativeness (7 items), work habits (7 items), work quality (7 items), and personal presentation (7 items) [54].
Our hypothesis 3 outcome is anxiety about work-based social encounters. We will measure this outcome using the Social Interaction Anxiety Scale (SIAS) self-report [55], which includes 20 items assessing fear of social conversations and emotions associated with this fear. The SIAS has acceptable psychometric properties when completed by autistic transition-age youth [56].
Our hypothesis 4 outcomes are focused on employment. Employment history since leaving MCTI will be assessed via a brief interview at 9-month follow-up. Specifically, we will ask participants to provide details on their type of job, wage, hours worked per week, days worked per week, and job tenure (start and end dates). We will also ask participants to provide this information for all jobs worked during this follow-up period. We will code job types using the Dictionary of Occupational Titles [57]. We will use the Substance Abuse and Mental Health Services Administration definition of competitive employment [58]. Specifically, a competitive job pays at least minimum wage, occurs in an integrated community setting, and is not set aside for persons with disabilities.
2.7.3. Potential mechanisms of employment
We will test the mechanistic hypotheses that social cognitive ability (H5a) and work-based social ability (H5b) will mediate the effect of treatment (Pre-ETS + WorkChat vs. Pre-ETS + attention control) on sustained employment. The social cognition and social ability measures have been previously described.
2.7.4. Implementation and process measures
This is a Hybrid Type 1 effectiveness-implementation study involving a multilevel (students, career readiness counselors), mixed-method evaluation of antecedent and perceptual variables (acceptability, appropriateness, and feasibility) related to initial WorkChat implementation [59,60].
MCTI administrators and career readiness counselors will self-report via surveys: 1) pre-implementation acceptability of WorkChat as an intervention; 2) acceptability of their orientation on how to deliver WorkChat; 3) appropriateness of WorkChat; 4) expected feasibility of WorkChat delivery; and 5) post-implementation evaluation of WorkChat acceptability (career readiness counselors) and potential for sustainability beyond the study period. These measures were based on the Weiner et al. (2017) intervention measures [61] and the Proctor et al. (2011) taxonomy of implementation research outcome domains [62], and the same measures have demonstrated strong reliability when used to evaluate virtual job interview training delivered in special education programs [18,19]. Additionally, career readiness counselors will complete surveys evaluating the context and adaptation to the collaboratively developed initial WorkChat implementation plan (items based on Stirman's FRAME adaptation coding taxonomy) [63,64]. MCTI career readiness counselors and administrators will also complete semi-structured interviews based on the Consolidated Framework for Implementation Research (CFIR) 2.0 guide (www.cfirguide.org) to evaluate determinants of WorkChat implementation to inform strategies used and evaluated in future implementation efforts [65].
Autistic transition-age youth randomized to practice with WorkChat will be recruited to evaluate their perceptions of WorkChat acceptability (items adapted from the Treatment Acceptability Rating Form [66]) and usability (items adapted from the Post-Study System Usability Questionnaire [67]). The acceptability survey will also include open-ended questions to assess the personal reflections of autistic transition-age youth after using WorkChat. Specifically, the questions will ask: 1) What was your favorite thing about WorkChat?, 2) What was your least favorite thing about WorkChat?, and 3) Did you use WorkChat in a different way than you were taught? (If yes, please explain). A summary of the implementation evaluation measures can be found in Table 2.
Table 2.
Summary of the proposed multi-level, mixed methodology for the initial implementation evaluation of WorkChat.
| Process Evaluation Domain | Quantitative Data (type of method, source of data, examples) | Qualitative Data (type of method, source of data, examples) |
|---|---|---|
| Acceptability (career readiness counselors [CRCs]) | Training Evaluation Questionnaire; Intervention Acceptability Measure (sample questions: How satisfied are you with the training you received to teach WorkChat? Overall, how satisfied are you with WorkChat as a service for your students?) | CRC Interviews (example questions: How can we improve the CRC training experience? What did you like the most/least about WorkChat? Can you share your thoughts on what was appealing about teaching students how to use WorkChat?) |
| Acceptability (students) | Treatment Acceptability Rating Form (sample question: WorkChat was enjoyable) | Semi-structured interview for students (sample question: What did you like the most about WorkChat?) |
| Usability (students) | System Usability Scale (sample question: I think I'm good at using the WorkChat program) | Semi-structured interview for students (sample question: What were the technical challenges you experienced when using the WorkChat software?) |
| Appropriateness (CRCs) | Intervention Appropriateness Measure (sample question: How well do you think WorkChat fits with students' goals for job training?) | Semi-structured interviews with CRCs (sample questions: How is WorkChat an appropriate fit for employment services?) |
| Feasibility (CRCs) | Intervention Expected Feasibility Survey (sample question: How well do you think WorkChat fits with students' goals for job training?) | Semi-structured interviews with CRCs (sample question: What will help you the most when teaching WorkChat to students?) |
| Feasibility (CRCs) | We will administratively monitor the time required to train CRCs to deliver WorkChat; fidelity of WorkChat delivery; fidelity checklist completion rates; CRC time allocated to WorkChat implementation | Semi-structured interviews with CRCs (sample questions: What was your experience completing the fidelity checks? What are barriers to WorkChat implementation? What are factors that help to successfully facilitate WorkChat implementation?) |
| Feasibility (student) | We will administratively monitor: adherence to training visits; reasons for missed visits; reasons for skipping different training components | Semi-structured interview for students (sample question: How can we make WorkChat easier to use for students in pre-employment transition services?) |
| Cost (MCTI administrators, University Implementation Team) | We will monitor the time CRCs, MCTI administrators, and University Team members spend on preparing the site to implement WorkChat. | Semi-structured interviews with MCTI administrators (sample question: How will cost be a major factor in continuing to use WorkChat at MCTI?) |
2.8. RCT and implementation evaluation data collection schedule
The data collection schedule for the proposed RCT can be found in Table 3. First, participants will complete all pretest assessments during visit 1 and visit 2, with each visit lasting approximately 1–1.5 h (T1 data points). Career readiness counselors will deliver WorkChat during visits 3–17. Then participants will complete all post-test assessments during visits 18–19, with each visit lasting approximately 1 h (T2 data points), followed by a 9-month check-in call (T3 data points). For the initial process evaluation, we will administer surveys to administrators and career readiness counselors immediately following training on how to deliver WorkChat, during and after the implementation of WorkChat for each cohort, and at the end of the study to capture baseline, early, and longer-term perceptions of implementation. The semi-structured interviews with career readiness counselors and administrators will occur after each cohort.
Table 3.
Schedule of assessments.
| Study Measures | Instrument | Collection Method | T1 | T2 | T3 |
|---|---|---|---|---|---|
| Background | | | | | |
| Demographics | Background survey | Self-Report | X | | |
| Employment history | Employment History Interview | Interview | X | | |
| Neuropsychological functionᵃ | NIH Toolbox - Cognition | Assessment | X | | |
| Depressive symptoms | Patient Health Questionnaire | Self-Report | X | | |
| Behavior challengesᵇ | Adult Behavior Checklist | Parent report | X | | |
| Primary Outcomes | | | | | |
| Emotion recognitionᶜ | Penn Emotion Recognition Task | Assessment | X | X | |
| Theory-of-mind | Hinting Task | Assessment | X | X | |
| Managing emotions* | MSCEIT | Assessment | X | X | |
| Emotion regulation | Emotion Dysregulation Inventory | Teacher-report | X | X | |
| Sustained employment | Recent Employment Interview | Interview | | X | X |
| Secondary Outcomes | | | | | |
| Work-based social ability | Work Behavior Inventory | Teacher-report | X | X | |
| Anxiety about work-based social encounters | Social Interaction Anxiety Scale | Self-Report | X | X | |
| Exploratory Outcomes/Mechanisms | | | | | |
| Applied social ability* | Social Skills Performance Assessment 2.0 | Assessment | X | X | |
Abbreviation. MSCEIT: Mayer-Salovey-Caruso Emotional Intelligence Test (managing emotions subtest).
*Note: Though they were not outcomes included in the original grant application, these measures were added in response to advisory board review.
ᵃ The cognitive scale of the NIH Toolbox was replaced with the Brief Assessment of Cognition.
2.9. Data analyses with power estimates for aim 1 hypotheses
2.9.1. Hypotheses 1–3: Pre-ETS + WorkChat trainees, compared to Pre-ETS + AC trainees, will have stronger (H1) social cognitive ability and (H2) work-based social ability as well as (H3) reduced anxiety about social encounters at work.
To test H1 - H3, we will create a linear mixed model (LMM) for each hypothesis using covariates of time point, treatment group, and their interaction. Linear mixed models allow for correlation among observations on the same person, and allow participants to be included in the analysis if they had data from at least one time point [68]. Power and sample size were calculated by methods recommended by Diggle and colleagues [69], with Sidak's multiple comparison adjustment for three outcomes [70]. An effect size of 0.4 can be detected with over 80% power, while an effect size of 0.3 corresponds to 74% power. From the LMM, differences between treatment groups from pre-test to post-test will be estimated by post-hoc contrasts.
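As an illustration of this analytic approach, a minimal sketch of one such linear mixed model is shown below using statsmodels in Python; the input file and column names (pid, score, time, group) are hypothetical placeholders, and the actual analysis code and software may differ.

```python
# Minimal sketch of an H1-H3 linear mixed model: random intercept per participant,
# fixed effects for time point, treatment group, and their interaction.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format file: one row per participant per time point, with columns
# pid (participant id), score (outcome), time ("pre"/"post"), group ("WorkChat"/"AC").
df = pd.read_csv("outcomes_long.csv")

model = smf.mixedlm("score ~ time * group", data=df, groups=df["pid"])
fit = model.fit(reml=True)
print(fit.summary())

# The time-by-group interaction estimates the differential pre-to-post change between
# arms; post-hoc contrasts can be constructed from fit.params and fit.cov_params().
```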
2.9.2. Hypothesis 4: Pre-ETS + WorkChat trainees, compared to Pre-ETS + AC trainees, may have greater sustained employment (average weekly hours worked) by 9-month follow-up
To test H4, the sample size was calculated to detect an effect size of 0.4 for the increase in average weekly hours worked at 80% power, based on a two-sample t-test. With 106 participants in each group, power will be at least 80%. If any key covariates differ significantly at baseline, the difference in hours worked will be tested by a linear regression model with covariates of intervention group and any key baseline variables that differ by intervention group, such as age, sex, depression, anxiety, and intelligence scores. A sample size of 212 will provide over 90% power to detect a 5% increase in R2, assuming covariates of intervention group status and up to 5 baseline characteristics.
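For transparency, the stated power for the two-sample comparison can be reproduced with a short calculation (illustrative only; a two-sided alpha of 0.05 is assumed).

```python
# Power for a two-sample t-test with Cohen's d = 0.4 and n = 106 per arm.
from statsmodels.stats.power import TTestIndPower

power = TTestIndPower().power(effect_size=0.4, nobs1=106, alpha=0.05, ratio=1.0)
print(f"Power to detect d = 0.4 with 106 per arm: {power:.2f}")  # approximately 0.83
```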
2.10. Data analyses with power estimates for aim 2 hypotheses
2.10.1. Hypothesis 5 (H5): Social cognitive ability (H5a) and work-based social ability (H5b) will mediate the effect of treatment (Pre-ETS + WorkChat vs. Pre-ETS + AC) on sustained employment (weekly hours worked)
To test H5, we will use Kraemer's mediator analytic framework for RCTs [71,72]. We will first test for a significant Pre-ETS + WorkChat effect, compared to Pre-ETS + AC, on a composite social cognition score (the mediator), then check for a treatment-by-mediator interaction [71], and then test the product of the two path coefficients [73] with bootstrapped confidence intervals [74]. We will use the same approach to evaluate work-based social ability as the mediator. We computed power using a sequence of three multiple regression models, assuming a moderate effect size (R2 of 0.40) for each of the three equations, detecting a change in R2 of 0.05, and using the Sidak [70] adjustment for multiple comparisons. Thus, power for the mediation model is above 90%.
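The product-of-coefficients step with a bootstrapped confidence interval can be sketched as follows. The column names (treatment, mediator, hours) and the simple two-regression structure are illustrative; the actual analysis follows the Kraemer framework above, including the interaction check and relevant covariates.

```python
# Simplified product-of-coefficients mediation test with a percentile bootstrap CI.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def indirect_effect(data: pd.DataFrame) -> float:
    # Path a: treatment (0 = Pre-ETS + AC, 1 = Pre-ETS + WorkChat) -> mediator.
    a = smf.ols("mediator ~ treatment", data=data).fit().params["treatment"]
    # Path b: mediator -> weekly hours worked, adjusting for treatment.
    b = smf.ols("hours ~ mediator + treatment", data=data).fit().params["mediator"]
    return a * b  # indirect (mediated) effect

df = pd.read_csv("mediation.csv")  # hypothetical analysis file
rng = np.random.default_rng(2024)
boot = np.array([indirect_effect(df.sample(frac=1.0, replace=True, random_state=rng))
                 for _ in range(5000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect_effect(df):.3f}; 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```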
2.11. Exploratory analyses
We will conduct independent t-tests to explore if Pre-ETS + WorkChat and Pre-ETS + AC differ with respect to: 1) number of days worked at their first job post-MCTI, 2) total days worked at all jobs post-MCTI, and 3) social encounters at each job post-MCTI. There is no preliminary data available (from our studies or others) to power these tests.
2.12. Data analyses for aim 3 multilevel, mixed-method process evaluation of WorkChat's acceptability, usability, appropriateness, feasibility, fidelity, and initial determinants of implementation (e.g., barriers and facilitators)
2.12.1. Mixed-methods quantitative analyses
We will report the descriptive statistics (i.e., mean, standard deviation, range) of WorkChat's pre-implementation acceptability and appropriateness for employment services (administrators, career readiness counselors); acceptability of training methods to prepare WorkChat implementers (administrators, career readiness counselors); WorkChat expected delivery feasibility (administrators, career readiness counselors); implementation context and adaptation (career readiness counselors only); post-implementation WorkChat feasibility and sustainability (administrators, career readiness counselors), acceptability (administrators, career readiness counselors, autistic transition-age youth), and usability (autistic transition-age youth only). In addition, we will evaluate whether there were differences between implementation context and adaptation at delivery midpoint and endpoint using paired samples t-tests.
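A minimal sketch of the midpoint-versus-endpoint comparison is shown below; the ratings are hypothetical placeholders for the context and adaptation scores.

```python
# Paired-samples t-test comparing implementation context/adaptation ratings
# at delivery midpoint vs. endpoint (one pair of ratings per career readiness counselor).
from scipy import stats

midpoint_scores = [3.2, 4.0, 3.5, 4.1, 3.8]  # hypothetical ratings
endpoint_scores = [3.6, 4.2, 3.9, 4.0, 4.1]

t_stat, p_value = stats.ttest_rel(midpoint_scores, endpoint_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```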
2.12.2. Mixed-methods qualitative analyses for administrators, career readiness counselors
Our research team will transcribe the open-ended semi-structured interview data from administrators and career readiness counselors verbatim to prepare for data analysis. Our team will analyze the data iteratively using thematic analysis and the constant comparative approach [75,76] to identify emergent themes regarding the acceptability, feasibility, and barriers and facilitators of implementing WorkChat within Pre-ETS. Barriers and facilitators will be coded using a codebook based on the CFIR 2.0 guide [65]. Two research staff will analyze the data using Ethnograph, a qualitative data analysis package. The initial phase of analysis will consist of the research staff independently analyzing a subset of transcripts to iteratively develop new codes as they emerge, using an inductive qualitative approach. Additionally, the team will use a deductive approach to identify codes based on topics covered in the CFIR codebook and on the definitions of implementation outcomes as defined by Proctor et al. [62]. Once we have developed a final set of codes and established inter-coder reliability using Cohen's kappa [77] in the initial phase of analyses, the codes will be used to analyze the remaining transcripts [78,79]. Framework analysis, which allows for the use of both inductively and deductively developed themes in qualitative analysis, will be used to evaluate career readiness counselors' and administrators' perceptions of WorkChat as well as the acceptability, feasibility, and barriers and facilitators of WorkChat implementation [80]. To facilitate comparison, a matrix of the identified themes will be developed that will include participant type (x-axis) vs. barriers and facilitators (y-axis). Matrices will aid in identifying themes common to all groups and features specific to particular subgroups (e.g., social identities), should they be present in the data [81]. For example, the experience of implementing WorkChat for autistic transition-age youth may be related to organizational context factors not evident among career readiness counselors or administrators. Thus, administrators might perceive the determinants of implementation (i.e., barriers and facilitators) differently due to barriers that career readiness counselors and autistic transition-age youth do not report.
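As a concrete illustration of the intercoder reliability step, Cohen's kappa can be computed from two coders' assignments on the same transcript excerpts; the code labels below are hypothetical.

```python
# Intercoder agreement for two coders applying the shared codebook to the same excerpts.
from sklearn.metrics import cohen_kappa_score

coder_1 = ["barrier", "facilitator", "barrier", "neutral", "facilitator", "barrier"]
coder_2 = ["barrier", "facilitator", "neutral", "neutral", "facilitator", "barrier"]

kappa = cohen_kappa_score(coder_1, coder_2)
print(f"Cohen's kappa = {kappa:.2f}")
```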
2.12.3. Mixed-methods qualitative analyses for autistic transition-age youth
We will transcribe the open-ended interview data verbatim and analyze the data iteratively using thematic analysis and the constant comparative approach [75,76] to identify emergent themes regarding the barriers to and facilitators of WorkChat implementation. Research staff will analyze the data using Ethnograph, a qualitative data analysis package. Following the same analysis plan previously outlined for administrators' and counselors' data, research staff will independently analyze a subset of transcripts to iteratively develop codes inductively as they emerge, as well as deductively based on initial topics (e.g., barriers, available resources). After the team agrees on a final codebook and inter-coder reliability is established using Cohen's kappa, the codes will be used to analyze all remaining transcripts [78,79]. We will use framework analysis to compare autistic transition-age youth participants' perceptions of barriers to and facilitators of WorkChat implementation [80]. To facilitate comparison, we will develop a matrix of themes with participant type on the x-axis and barriers and facilitators on the y-axis. Matrices will help identify themes common to all groups, as well as themes and features specific to particular subgroups [81]. Overall, this evaluation will inform WorkChat feasibility, acceptability, and determinants of implementation.
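As a small, hypothetical illustration of the framework matrix described above (participant type by barrier/facilitator theme), the sketch below tabulates coded excerpts into a cross-tabulation; the theme names and counts are invented for the example and do not reflect study findings.

```python
# Hypothetical sketch of the framework-analysis matrix: barrier/facilitator themes (rows)
# by participant type (columns), with cell counts of coded excerpts.
import pandas as pd

coded_excerpts = pd.DataFrame({
    "participant_type": ["youth", "counselor", "administrator", "youth", "counselor", "youth"],
    "theme": ["computer lab access", "scheduling", "staffing", "scheduling",
              "computer lab access", "staffing"],
})

matrix = pd.crosstab(coded_excerpts["theme"], coded_excerpts["participant_type"])
print(matrix)  # rows = themes, columns = participant types; shared vs. subgroup-specific themes
```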
2.13. Handling missing data
Prior to conducting our data analyses, we will use standard double data entry for data cleaning and quality control. We will also use standard techniques to create and psychometrically assess validated scales, test for multicollinearity, and conduct a missing data review. If indicated, missing data may be addressed by use of conditional variables, maximum likelihood estimates, sensitivity analyses, multiple imputation, or variable exclusion [82].
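For illustration only, the sketch below shows one way multiple imputation could be approached in Python using scikit-learn's IterativeImputer; the variables, values, and number of imputations are assumptions, and the study's actual handling of missing data will follow the plan above [82].

```python
# Hypothetical sketch of multiple imputation for a few continuous scale scores.
# The study will choose among the strategies listed above; this only illustrates the idea.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401 (enables IterativeImputer)
from sklearn.impute import IterativeImputer

# Rows = participants, columns = hypothetical scale scores; np.nan marks missing values.
scores = np.array([
    [52.0, 61.0, np.nan],
    [48.0, np.nan, 55.0],
    [60.0, 65.0, 63.0],
    [np.nan, 58.0, 57.0],
])

# Generate several imputed datasets by varying the random seed; each would be analyzed
# separately and the results pooled as part of a sensitivity analysis.
imputed_datasets = [
    IterativeImputer(random_state=seed, sample_posterior=True).fit_transform(scores)
    for seed in range(5)
]
print(imputed_datasets[0])
```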
2.14. Quality assurance and quality control
2.14.1. Data management at the University of Michigan
The University of Michigan Institutional Review Board (IRB) will review and approve all recruitment strategies, documents, and scripts. These recruitment methods will be reviewed after each MCTI recruitment session to assess whether the approved strategies are successfully enrolling participants. If enrollment numbers are too low, new strategies will be developed and submitted for review. The study coordinator will evaluate all enrolling participants to confirm they meet the specified eligibility criteria and will also continuously check all incoming study data and files to confirm accurate completion of data collection measures. If the study coordinator identifies missing data, a plan to attempt to obtain the missing data will be created. Study data will be entered into Research Electronic Data Capture (REDCap), a password-protected online database accessed through a virtual private network and behind firewalls [75]. To protect participant confidentiality, enrolled participants will be assigned a personal identification number (PIN). Only certain members of the research study team will have access to the REDCap form that links participant names to their PIN. If paper copies are used, they will be stored in a locked filing cabinet in a locked office at the University of Michigan School of Social Work. Physical video and audio recordings will also be stored in the same place. Virtual video and audio recordings will be stored on a password-protected and encrypted University of Michigan device, along with other computer data files. The study coordinator and statistician will review data collection procedures and methods on a regular basis to monitor the reliability and validity of the research data.
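As a purely illustrative sketch of the PIN-based confidentiality approach described above (not the study's REDCap configuration), the code below keeps the name-to-PIN linkage in a separate, access-restricted table and stores only PINs alongside study data; all names and values are fabricated placeholders.

```python
# Hypothetical sketch: separating identifiers from study data using a PIN linkage table.
# In the study, this linkage lives in an access-restricted REDCap form; here it is a
# stand-alone table with fabricated names purely for illustration.
import pandas as pd

# Access-restricted linkage table (only designated team members would see this).
linkage = pd.DataFrame({
    "participant_name": ["Participant A", "Participant B"],
    "pin": ["P-1001", "P-1002"],
})

# De-identified study dataset: only the PIN travels with assessment data.
assessments = pd.DataFrame({
    "pin": ["P-1001", "P-1002"],
    "visit": ["baseline", "baseline"],
    "scale_score": [68, 72],
})

# Analysts work only with the de-identified table; re-identification requires the linkage.
print(assessments)
```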
2.14.2. Data management at MCTI
All study items that research team members will bring into MCTI (e.g., video cameras, tripods, headsets, audio recorders, research data lockboxes, and all paper-based data collection materials) will be reviewed and approved by MCTI officials prior to initiating the study.
Regarding arrival procedures, research team members will check in at the MCTI front desk and obtain a volunteer pass that will allow them to enter the school. To obtain the volunteer pass, research team members will need to present picture identification. After research team members obtain their volunteer passes, they will proceed into the school and be escorted by school staff.
2.14.3. Data collection fidelity
Conducting randomized controlled trials in school settings requires the research team to maintain a high level of fidelity with regard to data collection procedures. We will implement the following best practices for training research staff. Research staff will receive training from the study coordinator, who has expertise conducting research in school settings. The training will begin with an orientation that reviews all study procedures. After the orientation, the research team member will observe the study coordinator conduct a complete mock study visit, in which the coordinator consents a mock participant and administers the assessments. Next, the study team member will conduct at least two mock study visits independently, with the study coordinator observing and providing feedback. Once the study coordinator determines the research team member is ready to conduct a study visit on their own, the research team member will conduct a final visit that will be recorded or observed live by the project manager or PI. The project manager or PI will review the session, assess the research team member's performance, and, if it is sufficient, authorize them to conduct research visits. If the research team member requires additional training, they will conduct subsequent mock visits with the PI or project manager until their performance is deemed satisfactory, at which point the PI will authorize the research team member to work with participants.
Role-play training will be monitored by the PI or project manager, who has expertise in training and supervising study team members to successfully carry out role-play assessments [[83], [84], [85], [86], [87]]. Study team members will read through the role-play training script, have the opportunity to ask questions, and will then practice administering the role play. During these practice sessions, the PI or project manager will provide feedback, and then assign the study team members to conduct the role play with an additional study team member three times. The PI or project manager will again provide feedback after observing these performances, and then give the final authorization for the research staff to conduct role-plays with participants.
3. Discussion
Given that the social abilities of autistic transition-age youth are stigmatized by an allistic world [[4], [5], [6]], autism services research needs to support autistic transition-age youth both by developing programs that educate and train employers and staff to be more accepting and affirming of autism and by helping enhance the social abilities of autistic transition-age youth. We acknowledge that WorkChat targets the latter. For the past two years, we partnered with the autism community and a series of advisory boards to develop WorkChat: A Virtual Workday, which we will evaluate within the context of post-secondary pre-employment transition services.
WorkChat was developed to enhance the role-play training that teachers already provide to support the social development of their autistic students [88,89]. However, this role-play training is not standardized, as the field does not currently have evidence-based methods to help autistic transition-age youth have more effective conversations with customers, coworkers, and supervisors [90]. Role plays are also resource-intensive, so autistic transition-age youth seldom engage in the repeated practice that is necessary to facilitate behavioral change [88]. In addition, WorkChat is delivered via a virtual learning environment that creates a safe and non-threatening space for youth to take risks as they practice their conversations at work without fear of being judged [88,[91], [92], [93]]. Moreover, a recent meta-analysis concluded that computerized trainings effectively teach general social skills to autistic people [94], and virtual learning environments are more acceptable to and preferred by autistic youth compared to live role-plays with teachers [95].
Overall, this study aims to fill a gap in autism services research by evaluating both the effectiveness and the implementation of WorkChat to enhance pre-employment transition services delivered in post-secondary educational settings. This study will also be among the first to use a Hybrid Type I effectiveness-implementation design that simultaneously tests an intervention's effectiveness while conducting an in-depth process evaluation of WorkChat in a post-secondary Pre-ETS setting. Thus, this design will guide our initial understanding of whether a comprehensive technology-based intervention such as WorkChat: 1) can be feasibly delivered with fidelity in, and is appropriate for, post-secondary Pre-ETS; 2) is acceptable to the administrators and career readiness counselors, and usable for the autistic youth, in this setting; and 3) has identifiable determinants of implementation (i.e., barriers and facilitators). Notably, this Hybrid Type I design has the potential to lessen the time needed to transform this innovative research into practice [27].
3.1. Potential enhancements for individuals and systems
If WorkChat enhances social ability in the work setting for autistic transition-age youth, the study results could have implications for both individuals and systems. Specifically, WorkChat could help autistic transition-age youth engage in repeated skills practice, while teachers spend less time facilitating role plays and more time helping students digest WorkChat feedback to improve future performance. Also, WorkChat implementation within MCTI could translate into an expansion of services for other programs like MCTI that provide post-secondary Pre-ETS. In addition, a demonstration of WorkChat effectiveness could translate into adoption by corporations that have committed to hiring an autistic workforce and could implement WorkChat as part of their onboarding procedures.
3.2. Pre-funding modifications to the trial
After scientific review by the National Institute of Mental Health, the scientific review officer recommended two changes to the proposed exploration of potential mechanisms. In response, we will evaluate whether WorkChat reduces social anxiety, which in turn may facilitate a greater duration of employment. In addition, we will explore whether baseline social challenges moderate the relationship between participating in WorkChat and sustaining employment.
3.3. Post-funding modifications to the trial
After the study was funded, we convened community advisory board, diversity advisory board, and scientific advisory board meetings (including 1:1 check-in meetings or feedback solicited via email) to review the proposed study procedures and provide recommended modifications.
3.3.1. Modifications requested by the advisory boards
After a full review of the RCT design and measures (for autistic youth and career readiness counselors), the community advisory board and diversity advisory board did not recommend any changes to the study procedures (including the study measures, data collection plan, or WorkChat implementation plan). The scientific advisory board made three notable recommendations. First, the scientific advisory board recommended that the study collect individual-level data reflecting engagement in the five domains of Pre-ETS: job exploration counseling, work-based learning experiences, workplace readiness training, self-advocacy, and post-secondary exploration [13]. Given that the field does not yet have an empirically validated measure of Pre-ETS fidelity at the individual level, we added items asking students to report whether or not they engaged in 20 common activities that occur across the aforementioned five Pre-ETS domains. Second, they recommended that we use budget impact analysis to evaluate the implementation preparation costs of readying MCTI to deliver WorkChat as well as the costs of actually delivering WorkChat. We will generate a spreadsheet template that other schools can use to input the site-specific parameters needed to estimate their costs of preparing to deliver WorkChat. We will evaluate costs from the perspective of the implementing organization (MCTI), which is consistent with best practices for budget impact analyses [96]. We will use sensitivity analysis to estimate the likely range of implementation preparation costs for future sites. Such budget impact analyses have recently emerged as helpful in community-based research settings [[97], [98], [99]], and this economic evaluation can provide transparency around the administrator and staff workload required to prepare new sites to implement WorkChat. Third, they recommended capturing performance-based measures of social ability and emotion regulation. Thus, we added a work-based version of the social skills performance assessment, which uses two 3-min role-plays to evaluate how someone interacts with a customer and a supervisor [100]. Next, we added the 'managing emotions' subscale of the Mayer-Salovey-Caruso Emotional Intelligence Test [101]. We also replaced the Penn Emotion Recognition Task with an assessment that uses more dynamic stimuli, a short multichannel version of the Profile of Nonverbal Sensitivity [102].
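To illustrate the kind of budget impact calculation described above (not the study's actual cost model or spreadsheet template), the following sketch totals hypothetical implementation-preparation cost parameters and varies staff time over a plausible range as a simple one-way sensitivity analysis; all line items and values are assumptions.

```python
# Hypothetical sketch of a budget impact calculation for implementation preparation,
# with a simple one-way sensitivity analysis over staff-time assumptions.
# All line items, hours, and rates are illustrative placeholders.

base_parameters = {
    "administrator_hours": 20.0,        # time spent planning the WorkChat rollout
    "counselor_training_hours": 12.0,   # time counselors spend in training
    "hourly_rate": 35.0,                # blended staff hourly cost (USD)
    "software_setup": 500.0,            # one-time technology setup cost (USD)
}

def preparation_cost(p: dict) -> float:
    """Total implementation preparation cost from the implementing organization's perspective."""
    staff_cost = (p["administrator_hours"] + p["counselor_training_hours"]) * p["hourly_rate"]
    return staff_cost + p["software_setup"]

print(f"Base-case cost: ${preparation_cost(base_parameters):,.2f}")

# One-way sensitivity analysis: vary staff time +/- 25% to bound likely costs for future sites.
for multiplier in (0.75, 1.0, 1.25):
    scenario = dict(base_parameters)
    scenario["administrator_hours"] *= multiplier
    scenario["counselor_training_hours"] *= multiplier
    print(f"Staff time x{multiplier:.2f}: ${preparation_cost(scenario):,.2f}")
```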
3.3.2. Modifications requested by MCTI
During our RCT preparation, the MCTI administrative team reviewed the study design as proposed in the initial grant application. They requested five revisions to the study design. First, they indicated that they already maintain current records of student reading levels. Thus, to reduce participant burden given the measures added to the study in section 3.3.1, they requested that we remove the research-led reading assessment. Second, they requested a further reduction in the number of assessments or the time required to collect them. As a compromise, the research team changed the neuropsychological assessment from the NIH Toolbox cognitive battery (requiring approximately 45 min to complete) to the Brief Assessment of Cognition (requiring approximately 20 min). Third, due to ongoing staffing shortages related to the COVID-19 pandemic and limited computer lab availability, MCTI requested that we remove the attention control condition. Given that MCTI serves as a services-as-usual setting and that an attention control had no impact when evaluating the effects of cognitive remediation in RCTs [103,104], we removed the attention control at the school's request. Fourth, MCTI advised that the parent response rate may be lower than anticipated given the strains of recovery from the COVID-19 pandemic. Given that participants are adults, they advised that we limit our requests to the autistic transition-age youth. Thus, we replaced the parent-report adult behavior checklist [48] with the self-report version of the self-directedness subscale from the Adaptive Behavior Assessment System (3rd Edition) [105]. Lastly, autistic students at MCTI could have an SRS-2 score of 60–64, whereas the original eligibility criteria required an SRS-2 score of at least 65 to reduce false positives. MCTI indicated that their autistic students have already received a clinical diagnosis of autism or have autism identified on their individualized education program. Thus, excluding potential autistic participants with an SRS-2 score of 60–64 would be inequitable and inappropriate given that WorkChat was designed for autistic youth in an educational setting. To be responsive, we will not exclude potential autistic participants who are enrolled at MCTI and have 'autism' identified on their educational record according to the standards set forth by the Individuals with Disabilities Education Act [106].
3.4. Conclusion
This is the first RCT evaluating WorkChat in a post-secondary Pre-ETS program. This innovative design is a critical step in the translational research pipeline, as WorkChat will be integrated into the typical MCTI workflow with a degree of unpredictability regarding how career readiness counselors, students, and administrators will respond. This study is a unique, high-risk effort, as there have been no prior efficacy trials of this particular intervention. However, the methods used to design and evaluate this intervention follow community-engaged approaches that have recently demonstrated real-world effectiveness for similar simulation-type interventions in which virtual characters are the focus of training conversational skills. Thus, the Hybrid Type I effectiveness-implementation evaluation design [27] was an intentional effort to expedite the translational process and scale-up by more than five years by skipping small-scale efficacy trials in favor of a fully powered effectiveness trial aiming to evaluate whether WorkChat has a positive impact on sustaining employment for autistic transition-age youth engaged in post-secondary Pre-ETS. The success of this high-risk, rapid translation will depend on whether WorkChat is effective and whether the implementation evaluation reveals salient factors to support future adoption.
Declaration of competing interest
The authors declare the following financial interests/personal relationships which may be considered as potential competing interests: SIMmersion LLC will receive revenue from future sales of WorkChat: A Virtual Workday. These royalties will be shared with Dr. Matthew J. Smith. The University of Michigan's Conflict Management Office developed a conflict management plan for Dr. Smith that was reviewed and approved by a University of Michigan Conflict of Interest Committee. This manuscript includes authorship by Dr. Olsen, Mr. Elkins, Ms. Humm, and Dr. Steacy, who are paid employees of SIMmersion and own stock in the company. However, Dr. Olsen, Mr. Elkins, Ms. Humm, and Dr. Steacy were not involved in the study's design and will not be involved in the administration of the randomized controlled trial. Their authorship contributions were focused on the description of the intervention.
Acknowledgements
This study was supported by a grant to Dr. Dale Olsen, Ph.D. (R44 MH123359) from the National Institute of Mental Health.
Appendix A. Supplementary data
Supplementary data to this article can be found online at https://doi.org/10.1016/j.conctc.2023.101153.
References
- 1.Shattuck P.T., Narendorf S.C., Cooper B., Sterzing P.R., Wagner M., Taylor J.L. Postsecondary education and employment among youth with an autism spectrum disorder. Pediatrics. 2012;129(6):1042–1049. doi: 10.1542/peds.2011-2864. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 2.Shattuck P.T., Roux A.M., Hudson L.E., Taylor J.L., Maenner M.J., Trani J.F. Services for adults with an autism spectrum disorder. Can. J. Psychiatr./Rev. Canad. Psychiatr. 2012;57(5):284–291. doi: 10.1177/070674371205700503. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 3.Taylor J.L., Seltzer M.M. Employment and post-secondary educational activities for young adults with autism spectrum disorders during the transition to adulthood. J. Autism Dev. Disord. 2011;41(5):566–574. doi: 10.1007/s10803-010-1070-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 4.Mazefsky C.A., Borue X., Day T.N., Minshew N.J. Emotion regulation patterns in adolescents with high-functioning autism spectrum disorder: comparison to typically developing adolescents and association with psychiatric symptoms. Autism Res. 2014;7(3):344–354. doi: 10.1002/aur.1366. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 5.Rutherford M.D., Baron-Cohen S., Wheelwright S. Reading the mind in the voice: a study with normal adults and adults with Asperger syndrome and high functioning autism. J. Autism Dev. Disord. 2002;32(3):189–194. doi: 10.1023/a:1015497629971. [DOI] [PubMed] [Google Scholar]
- 6.Samson A.C., Huber O., Gross J.J. Emotion regulation in Asperger's syndrome and high-functioning autism. Emotion. 2012;12(4):659–665. doi: 10.1037/a0027975. [DOI] [PubMed] [Google Scholar]
- 7.Dew D.W., Alan G.M. The George Washington University; Washington D.C.: 2007. Rehabilitation of Individuals with Autism Spectrum Disorders (Institute on Rehabilitation: Issue Monograph No. 32) [Google Scholar]
- 8.Hurlbutt K.K. Employment and adults with Asperger syndrome. Focus. Autism. Other.Dev. Disabil. 2004;19(4):215–222. [Google Scholar]
- 9.Unger D. Workplace supports: a view from employers who have hired supported employees. Focus. Autism. Other.Dev. Disabil. 1999;14(3):167–179. [Google Scholar]
- 10.Chen J.L., Leader G., Sung C., Leahy M. Trends in employment for individuals with autism spectrum disorder: a review of the research literature. Rev. J.Autism Dev. Disord. 2015;2(2):115–127. [Google Scholar]
- 11.Taylor J.L., Adams R.E., Pezzimenti F., Zheng S., Bishop S.L. Job loss predicts worsening depressive symptoms for young adults with autism: a COVID-19 natural experiment. Autism Res. 2022;15(1):93–102. doi: 10.1002/aur.2621. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 12.Hedley D., Uljarevic M., Hedley D.F.E. In: Inclusion, Disability, and Culture: an Ethnographic Perspective Traversing Abilities and Challenges. Halder S., Assaf L.C., editors. Springer; 2017. Employment and living with autism: personal, social, and economic impact; pp. 295–311. [Google Scholar]
- 13.Workforce Innovation and Opportunity Act WIOA. United States of America; 2014. [Google Scholar]
- 14.Sung C., Connor A., Chen J., Lin C.C., Kuo H.J., Chun J. Development, feasibility, and preliminary efficacy of an employment-related social skills intervention for young adults with high-functioning autism. Autism: Int. J.Res. Pract. 2018 doi: 10.1177/1362361318801345. [DOI] [PubMed] [Google Scholar]
- 15.Connor A., Sung C., Strain A., Zeng S., Fabrizi S. Building skills, confidence, and wellness: psychosocial effects of Soft skills training for young adults with autism. J. Autism Dev. Disord. 2020;50(6):2064–2076. doi: 10.1007/s10803-019-03962-w. [DOI] [PubMed] [Google Scholar]
- 16.Smith M.J., Pinto R.M., Dawalt L., Smith J.D., Sherwood K., Miles R., Taylor J.L., Hume K.A., Dawkins T., Baker-Ericzén M., Frazier T.W., Humm L., Steacy C. Using community-engaged methods to adapt virtual reality job-interview training for transition-age youth on the autism spectrum. Res. Autism.Spectr.Disord. 2020;71 doi: 10.1016/j.rasd.2019.101498. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 17.Smith M.J., Sherwood K., Ross B., Smith J.D., DaWalt L., Bishop L., Humm L., Elkins J., Steacy C. Virtual interview training for autistic transition age youth: a randomized controlled feasibility and effectiveness trial. Autism: Int. J.Res. Pract. 2021;25(6):1536–1552. doi: 10.1177/1362361321989928. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 18.Smith M.J., Smith J.D., Jordan N., Sherwood K., McRobert E., Ross B., Oulvey E.A., Atkins M.S. Virtual reality job interview training in transition services: results of a single-arm, noncontrolled effectiveness-implementation hybrid trial. J. Spec. Educ. Technol. 2021;36(1):3–17. doi: 10.1177/0162643420960093. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 19.Smith M.J., Sherwood K., Ross B., Oulvey E.A., Atkins M.S., Danielson E.A., Jordan N., Smith J.D. Scaling out virtual interview training for transition age youth: a quasi-experimental hybrid effectiveness-implementation study. Career Dev.Transit.Exceptional Individ. 2022;45(4):213–227. doi: 10.1177/21651434221081273. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 20.Sherwood K., Smith M.J., Johnson J., Harrington M., Blajeski S., Ross B., Dawalt L., Bishop L., Smith J.D. Mixed-methods Implementation Evaluation of Virtual Interview Training for Transition-Age Autistic Youth in Pre-employment Transition Services. J. Vocat. Rehabil. 2023;58:139–154. doi: 10.3233/jvr-230004. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 21.Kandalaft M.R., Didehbani N., Krawczyk D.C., Allen T.T., Chapman S.B. Virtual reality social cognition training for young adults with high-functioning autism. J. Autism Dev. Disord. 2013;43(1):34–44. doi: 10.1007/s10803-012-1544-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 22.Stichter J.P., Laffey J., Galyen K., Herzog M. iSocial: delivering the Social Competence Intervention for Adolescents (SCI-A) in a 3D virtual learning environment for youth with high functioning autism. J. Autism Dev. Disord. 2014;44(2):417–430. doi: 10.1007/s10803-013-1881-0. [DOI] [PubMed] [Google Scholar]
- 23.Strickland D.C., Coles C.D., Southern L.B. JobTIPS: a transition to employment program for individuals with autism spectrum disorders. J. Autism Dev. Disord. 2013;43(10):2472–2483. doi: 10.1007/s10803-013-1800-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 24.Jones D.R., Mandell D.S. To address racial disparities in autism research, we must think globally, act locally. Autism: Int. J.Res. Pract. 2020;24(7):1587–1589. doi: 10.1177/1362361320948313. [DOI] [PubMed] [Google Scholar]
- 25.Steinbrenner J.R., McIntyre N., Rentschler L.F., Pearson J.N., Luelmo P., Jaramillo M.E., Boyd B.A., Wong C., Nowell S.W., Odom S.L., Hume K.A. Patterns in reporting and participant inclusion related to race and ethnicity in autism intervention literature: data from a large-scale systematic review of evidence-based practices. Autism: Int. J.Res. Pract. 2022 doi: 10.1177/13623613211072593. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 26.Williams E.G., Smith M.J., Boyd B. Perspective: the role of diversity advisory boards in autism research. Autism: Int. J.Res. Pract. 2022 doi: 10.1177/13623613221133633. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 27.Curran G.M., Smith J.D., Landsverk J., Brown C.H., Vermeer W., Miech E.J., B K., Cruden G., Fernandez M. In: Dissemination and Implementation Research in Health: Translating Research to Practice. Brownson R.C., Colditz G.A., Proctor E.K., editors. Oxford University Press; New York, NY: 2022. Design and analysis in dissemination and implementation research; pp. 201–228. [Google Scholar]
- 28.Smith M.J., Smith J.D., Fleming M.F., Jordan N., Oulvey E.A., Bell M.D., Mueser K.T., McGurk S.R., Spencer E.S., Mailey K., Razzano L.A. Enhancing individual placement and support (IPS) - supported employment: a Type 1 hybrid design randomized controlled trial to evaluate virtual reality job interview training among adults with severe mental illness. Contemp. Clin. Trials. 2019;77:86–97. doi: 10.1016/j.cct.2018.12.008. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 29.Smith M.J., Mitchell J.A., Blajeski S., Parham B., Harrington M.M., Ross B., Sinco B., Brydon D.M., Johnson J.E., Cuddeback G.S., Smith J.D., Jordan N., Bell M.D., McGeorge R., Kaminski K., Suganuma A., Kubiak S.P. Enhancing vocational training in corrections: a type 1 hybrid randomized controlled trial protocol for evaluating virtual reality job interview training among returning citizens preparing for community re-entry. Contemp. Clin.Trials. Commun. 2020;19 doi: 10.1016/j.conctc.2020.100604. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 30.Aldridge F.J., Gibbs V.M., Schmidhofer K., Williams M. Investigating the clinical usefulness of the Social Responsiveness Scale (SRS) in a tertiary level, autism spectrum disorder specific assessment clinic. J. Autism Dev. Disord. 2012;42(2):294–300. doi: 10.1007/s10803-011-1242-9. [DOI] [PubMed] [Google Scholar]
- 31.Constantino J.N., Gruber C.P. Western Psychological Services; Los Angeles, CA: 2012. Social Responsiveness Scale, second ed. (SRS-2). [Google Scholar]
- 32.Wilkinson G.S., Robertson G.J. fifth ed. (WRAT-5) Pearson, Inc.; Bloomington, MN: 2017. Wide Range Achievement Test. [Google Scholar]
- 33.Fisher M., Holland C., Merzenich M.M., Vinogradov S. Using neuroplasticity-based auditory training to improve verbal memory in schizophrenia. Am. J. Psychiatr. 2009;166(7):805–811. doi: 10.1176/appi.ajp.2009.08050757. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 34.Lindenmayer J.P., McGurk S.R., Mueser K.T., Khan A., Wance D., Hoffman L., Wolfe R., Xie H. A randomized controlled trial of cognitive remediation among inpatients with persistent mental illness. Psychiatr. Serv. 2008;59(3):241–247. doi: 10.1176/ps.2008.59.3.241. [DOI] [PubMed] [Google Scholar]
- 35.Eack S.M., Greenwald D.P., Hogarty S.S., Bahorik A.L., Litschge M.Y., Mazefsky C.A., Minshew N.J. Cognitive enhancement therapy for adults with autism spectrum disorder: results of an 18-month feasibility study. J. Autism Dev. Disord. 2013;43(12):2866–2877. doi: 10.1007/s10803-013-1834-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 36.Eack S.M., Hogarty S.S., Greenwald D.P., Litschge M.Y., Porton S.A., Mazefsky C.A., Minshew N.J. Cognitive enhancement therapy for adult autism spectrum disorder: results of an 18-month randomized clinical trial. Autism Res. 2018;11(3):519–530. doi: 10.1002/aur.1913. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 37.Office of Disability Employment Policy . Skills to Pay the Bills: Mastering Soft Skills for Workplace Success. United States Department of Labor; Washington, D.C.: 2012. [Google Scholar]
- 38.Wallace S., Parsons S., Westbury A., White K., White K., Bailey A. Sense of presence and atypical social judgments in immersive virtual environments. Responses of adolescents with Autism Spectrum Disorders, Autism: Int. J.Res. Pract. 2010;14(3):199–213. doi: 10.1177/1362361310363283. [DOI] [PubMed] [Google Scholar]
- 39.Cooper J.O. Applied behavior analysis in education. Theory Into Pract. 1982;21(2):114–118. [Google Scholar]
- 40.Cooper J.O., Heron T.E., Heward W.L. Pearson; London: 2007. Applied Behavioral Analysis. [Google Scholar]
- 41.Issenberg S.B. The scope of simulation-based healthcare education. Simulat. Healthc.: J. Soc. Simulat. Healthc. 2006;1(4):203–208. doi: 10.1097/01.SIH.0000246607.36504.5a. [DOI] [PubMed] [Google Scholar]
- 42.Roelfsema P.R., van Ooyen A., Watanabe T. Perceptual learning rules based on reinforcers and attention. Trends Cognit. Sci. 2010;14(2):64–71. doi: 10.1016/j.tics.2009.11.005. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 43.Vinogradov S., Fisher M., de Villers-Sidani E. Cognitive training for impaired neural systems in neuropsychiatric illness. Neuropsychopharmacology: Off. Publ. Am. Coll.Neuropsychopharmacol. 2012;37(1):43–76. doi: 10.1038/npp.2011.251. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 44.Smith M.J., Parham B., Mitchelll J., Blajeski S., Harrington M., Ross B., Brydon D.M., Johnson J.E., Cuddeback G., Smith J.D., Jordan N., Bell M.D., McGeorge R., Kaminsky K., Suganuma A., Kubiak S.P. Virtual reality job interview training for adults receiving prison-based employment services: a randomized controlled feasibility and initial effectiveness trial. Crim. Justice Behav. 2022;50(2):272–293. doi: 10.1177/00938548221081447. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 45.Smith M.J., Smith J.D., Blajeski S., Ross B., Jordan N., Bell M.D., McGurk S.R., Mueser K.T., Burke-Miller J.K., Oulvey E.A., Fleming M.F., Nelson K., Brown A., Prestipino J., Pashka N.J., Razzano L.A. An RCT of virtual reality job interview training for individuals with serious mental illness in IPS supported employment. Psychiatr. Serv. 2022;73(9):1027–1038. doi: 10.1176/appi.ps.202100516. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 46.Akshoomoff N., Beaumont J.L., Bauer P.J., Dikmen S.S., Gershon R.C., Mungas D., Slotkin J., Tulsky D., Weintraub S., Zelazo P.D., Heaton R.K. VIII. NIH Toolbox Cognition Battery (CB): composite scores of crystallized, fluid, and overall cognition. Monogr. Soc. Res. Child Dev. 2013;78(4):119–132. doi: 10.1111/mono.12038. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 47.Kroenke K., Strine T.W., Spitzer R.L., Williams J.B., Berry J.T., Mokdad A.H. The PHQ-8 as a measure of current depression in the general population. J. Affect. Disord. 2009;114(1–3):163–173. doi: 10.1016/j.jad.2008.06.026. [DOI] [PubMed] [Google Scholar]
- 48.Achenbach T.M., Rescorla L.A. University of Vermont; Burlington, VT: 2001. Manual for the ASEBA School-Age Forms and Profiles. [Google Scholar]
- 49.Kohler C.G., Turner T.H., Gur R.E., Gur R.C. Recognition of facial emotions in neuropsychiatric disorders. CNS Spectr. 2004;9(4):267–274. doi: 10.1017/s1092852900009202. [DOI] [PubMed] [Google Scholar]
- 50.Corcoran R., Mercer G., Frith C.D. Schizophrenia, symptomatology and social inference: investigating "theory of mind" in people with schizophrenia. Schizophr. Res. 1995;17(1):5–13. doi: 10.1016/0920-9964(95)00024-g. [DOI] [PubMed] [Google Scholar]
- 51.Morrison K.E., Pinkham A.E., Kelsven S., Ludwig K., Penn D.L., Sasson N.J. Psychometric evaluation of social cognitive measures for adults with autism. Autism Res. 2019;12(5):766–778. doi: 10.1002/aur.2084. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 52.Mazefsky C.A., Day T.N., Siegel M., White S.W., Yu L., Pilkonis P.A., the Autism and Developmental Disabilities Inpatient Research Collaborative. Development of the emotion dysregulation inventory: a PROMIS(R)ing method for creating sensitive and unbiased questionnaires for autism spectrum disorder. J. Autism Dev. Disord. 2018;48(11):3736–3746. doi: 10.1007/s10803-016-2907-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 53.Mazefsky C.A., Yu L., White S.W., Siegel M., Pilkonis P.A. The emotion dysregulation inventory: psychometric properties and item response theory calibration in an autism spectrum disorder sample. Autism Res. 2018;11(6):928–941. doi: 10.1002/aur.1947. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 54.Bryson G., Bell M.D., Lysaker P., Zito W. The Work Behavior Inventory: a scale for the assessment of work behavior for people with severe mental illness. Psychiatr. Rehabil. J. 1997;20(4):47–55. [Google Scholar]
- 55.Mattick R.P., Clarke J.C. Development and validation of measures of social phobia scrutiny fear and social interaction anxiety. Behav. Res. Ther. 1998;36(4):455–470. doi: 10.1016/s0005-7967(97)10031-6. [DOI] [PubMed] [Google Scholar]
- 56.Maddox B.B., White S.W. Comorbid social anxiety disorder in adults with autism spectrum disorder. J. Autism Dev. Disord. 2015;45(12):3949–3960. doi: 10.1007/s10803-015-2531-5. [DOI] [PubMed] [Google Scholar]
- 57.United States Department of Labor, Employment and Training Administration. U.S. Employment Service; Washington, D.C.: 1991. Dictionary of Occupational Titles. [Google Scholar]
- 58.Cook J.A., Leff H.S., Blyler C.R., Gold P.B., Goldberg R.W., Mueser K.T., Toprac M.G., McFarlane W.R., Shafer M.S., Blankertz L.E., Dudek K., Razzano L.A., Grey D.D., Burke-Miller J. Results of a multisite randomized trial of supported employment interventions for individuals with severe mental illness. Arch. Gen. Psychiatr. 2005;62(5):505–512. doi: 10.1001/archpsyc.62.5.505. [DOI] [PubMed] [Google Scholar]
- 59.Curran G.M., Landes S.J., McBain S.A., Pyne J.M., Smith J.D., Fernandez M.E., Chambers D.A., Mittman B.S. Reflections on 10 years of effectiveness-implementation hybrid studies. Frontiers in Health Services. 2022;2 doi: 10.3389/frhs.2022.1053496. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 60.Reilly K.L., Kennedy S., Porter G., Estabrooks P. Comparing, contrasting, and integrating dissemination and implementation outcomes included in the RE-AIM and implementation outcomes frameworks. Front. Public Health. 2020;8 doi: 10.3389/fpubh.2020.00430. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 61.Weiner B.J., Lewis C.C., Stanick C., Powell B.J., Dorsey C.N., Clary A.S., Boynton M.H., Halko H. Psychometric assessment of three newly developed implementation outcome measures. Implement. Sci. 2017;12(1) doi: 10.1186/s13012-017-0635-3. 108. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 62.Proctor E., Silmere H., Raghavan R., Hovmand P., Aarons G., Bunger A., Griffey R., Hensley M. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm. Pol. Ment. Health. 2011;38(2):65–76. doi: 10.1007/s10488-010-0319-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 63.Stirman S.W., Miller C.J., Toder K., Calloway A. Development of a framework and coding system for modifications and adaptations of evidence-based interventions. Implement. Sci. 2013;8:65. doi: 10.1186/1748-5908-8-65. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 64.Stirman S.W., Gamarra J., Bartlett B., Calloway A., Gutner C. Empirical examinations of modifications and adaptations to evidence-based psychotherapies: methodologies, impact, and future directions. Clin. Psychol. 2017;24(4):396–420. doi: 10.1111/cpsp.12218. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 65.Damschroder L., Reardon C., Widerquist M.O., Lowery J. The updated Consolidated Framework for Implementation Research based on user feedback. Implement. Sci. 2022;17(1):75. doi: 10.1186/s13012-022-01245-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 66.Reimers T.M., Wacker D.P. Parents' Ratings of the acceptability of behavioral treatment recommendations made in an outpatient clinic: a preliminary analysis of the influence of treatment effectiveness. Behav. Disord. 1988;14(1):7–15. [Google Scholar]
- 67.Lewis J.R. IBM computer usability satisfaction questionnaires: psychometric evaluation and instructions for use. Int. J. Hum. Comput. Interact. 1995;7(1):57–78. [Google Scholar]
- 68.Molenberghs G., Verbeke G. Springer; New York: 2000. Linear Mixed Models for Longitudinal Data. [Google Scholar]
- 69.Diggle P.J., Heagerty P., Liang L., Zeger S.L. Oxford University Press; New York: 2002. Analysis of Longitudinal Data. [Google Scholar]
- 70.Šidák Z. Rectangular confidence regions for the means of multivariate normal distributions. J. Am. Stat. Assoc. 1967;62:626–633. [Google Scholar]
- 71.Kraemer H.C., Gibbons R.D. Why does the randomized clinical trial methodology so often mislead clinical decision making? Focus on moderators and mediators of treatment. Psychiatr. Ann. 2009;39(7):736–745. [Google Scholar]
- 72.Kraemer H.C., Wilson G.T., Fairburn C.G., Agras W.S. Mediators and moderators of treatment effects in randomized clinical trials. Arch. Gen. Psychiatr. 2002;59(10):877–883. doi: 10.1001/archpsyc.59.10.877. [DOI] [PubMed] [Google Scholar]
- 73.MacKinnon D.P., Fairchild A.J., Fritz M.S. Mediation analysis. Annu. Rev. Psychol. 2007;58:593–614. doi: 10.1146/annurev.psych.58.110405.085542. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 74.Preacher K.J., Hayes A.F. SPSS and SAS procedures for estimating indirect effects in simple mediation models. Behav. Res. Methods Instrum. Comput. 2004;36(4):717–731. doi: 10.3758/bf03206553. [DOI] [PubMed] [Google Scholar]
- 75.Guest G., MacQueen K.M., Namey E.E. Sage Publications, Inc.; Thousand Oaks, CA: 2012. Applied Thematic Analysis. [Google Scholar]
- 76.Lincoln Y., Guba E. Sage Publications; London: 1985. Processing Naturally Obtained Data, Naturalistic Inquiry; pp. 332–356. [Google Scholar]
- 77.McHugh M.L. Interrater reliability: the kappa statistic. Biochem. Med. 2012;22(3):276–282. [PMC free article] [PubMed] [Google Scholar]
- 78.Bernard H.R., Ryan G.W. In: Handbook of Methods in Cultural Anthropology. Bernard H.R., editor. Altamira Press; Walnut Creek, CA: 1998. Text analysis: qualitative and quantitative methods. [Google Scholar]
- 79.Kurasaki K. Intercoder reliability for validating conclusions drawn from open-ended interview data. Field Methods. 2000;12:179–194. [Google Scholar]
- 80.Green J., Thorogood N. third ed. Sage Publications; Los Angeles, CA: 2014. Qualitative Methods for Health Research. [Google Scholar]
- 81.Crabtree B.F., Miller W.L. In: Doing Qualitative Research. Crabtree B.F., Miller M.L., editors. Sage Publications; Thousand Oaks, CA: 1992. A template approach to text analysis: developing and using codebooks; pp. 93–109. [Google Scholar]
- 82.Jakobsen J.C., Gluud C., Wetterslev J., Winkel P. When and how should multiple imputation be used for handling missing data in randomised clinical trials – a practical guide with flowcharts. BMC Med. Res. Methodol. 2017;17(1):162. doi: 10.1186/s12874-017-0442-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 83.Smith M.J., Bell M.D., Wright M.A., Humm L.B., Olsen D., Fleming M.F. Virtual reality job interview training and 6-month employment outcomes for individuals with substance use disorders seeking employment. J. Vocat. Rehabil. 2016;44:323–332. doi: 10.3233/JVR-160802. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 84.Smith M.J., Boteler Humm L., Fleming M.F., Jordan N., Wright M.A., Ginger E.J., Wright K., Olsen D., Bell M.D. Virtual reality job interview training for veterans with posttraumatic stress disorder. J. Vocat. Rehabil. 2015;42:271–279. doi: 10.3233/JVR-150748. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 85.Smith M.J., Fleming M.F., Wright M.A., Roberts A.G., Humm L.B., Olsen D., Bell M.D. Virtual reality job interview training and 6-month employment outcomes for individuals with schizophrenia seeking employment. Schizophr. Res. 2015;166(1–3):86–91. doi: 10.1016/j.schres.2015.05.022. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 86.Smith M.J., Ginger E.J., Wright K., Wright M.A., Taylor J.L., Humm L.B., Olsen D.E., Bell M.D., Fleming M.F. Virtual reality job interview training in adults with autism spectrum disorder. J. Autism Dev. Disord. 2014;44(10):2450–2463. doi: 10.1007/s10803-014-2113-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 87.Smith M.J., Ginger E.J., Wright M., Wright K., Boteler Humm L., Olsen D., Bell M.D., Fleming M.F. Virtual reality job interview training for individuals with psychiatric disabilities. J. Nerv. Ment. Dis. 2014;202(9):659–667. doi: 10.1097/NMD.0000000000000187. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 88.Parsons S., Mitchell P. The potential of virtual reality in social skills training for people with autistic spectrum disorders. J. Intellect. Disabil. Res. 2002;46(Pt 5):430–443. doi: 10.1046/j.1365-2788.2002.00425.x. [DOI] [PubMed] [Google Scholar]
- 89.Tse J., Strulovitch J., Tagalakis V., Meng L., Fombonne E. Social skills training for adolescents with Asperger syndrome and high-functioning autism. J. Autism Dev. Disord. 2007;37(10):1960–1968. doi: 10.1007/s10803-006-0343-3. [DOI] [PubMed] [Google Scholar]
- 90.Rowe D.A., Mazzotti V.L., Fowler C.H., Test D.W., Mitchell V.J., Clark K.A., Holzberg D., Owens T.L., Rusher D., Seaman-Tullis R.L., Gushanas C.M., Castle H., Chang W.-H., Voggt A., Kwiatek S., Dean C. Updating the secondary transition research base: evidence- and research-based practices in functional skills. Career Dev.Transit.Exceptional Individ. 2021;44(1):28–46. [Google Scholar]
- 91.Modugumudi Y., Santhosh J., Anand S. Efficacy of collaborative virtual environment intervention programs in emotion expression of children with autism. J. Med. Imaging Health Inform. 2013;3(2):321–325. [Google Scholar]
- 92.Moore D., Cheng Y., McGrath P., Powell N.J. Collaborative virtual environment technology for people with autism. Focus. Autism. Other.Dev. Disabil. 2005;20(4):231–243. [Google Scholar]
- 93.Moore D., McGrath P., Thorpe J. Computer-aided learning for people with autism—a framework for research and development. Innovat. Educ. Train. Int. 2000;37(3):218–228. [Google Scholar]
- 94.Grynszpan O., Weiss P.L., Perez-Diaz F., Gal E. Innovative technology-based interventions for autism spectrum disorders: a meta-analysis. Autism: Int. J.Res. Pract. 2014;18(4):346–361. doi: 10.1177/1362361313476767. [DOI] [PubMed] [Google Scholar]
- 95.Spencer S., Drescher T., Sears J., Scruggs A.F., Schreffler J. Comparing the efficacy of virtual simulation to traditional classroom role-play. J. Educ. Comput. Res. 2019;57(7):1772–1785. [Google Scholar]
- 96.Sullivan S.D., Mauskopf J.A., Augustovski F., Jaime Caro J., Lee K.M., Minchin M., Orlewska E., Penna P., Rodriguez Barrios J.M., Shau W.Y. Budget impact analysis-principles of good practice: report of the ISPOR 2012 budget impact analysis good practice II task force. Value Health: J. Int. Soc.Pharmacoeconomics.Outcomes Res. 2014;17(1):5–14. doi: 10.1016/j.jval.2013.08.2291. [DOI] [PubMed] [Google Scholar]
- 97.Jordan N., Graham A.K., Berkel C., Smith J.D. Costs of preparing to implement a family-based intervention to prevent pediatric obesity in primary care: a budget impact analysis. Prev. Sci. 2019;20(5):655–664. doi: 10.1007/s11121-018-0970-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 98.Smith M.J., Graham A.K., Sax R., Spencer E.S., Razzano L.A., Smith J.D., Jordan N. Costs of preparing to implement a virtual reality job interview training programme in a community mental health agency: a budget impact analysis. J. Eval. Clin. Pract. 2020;26(4):1188–1195. doi: 10.1111/jep.13292. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 99.Danielson E., Smith M.J., Sherwood K., Atkins M., Smith J.D., Jordan N. Implementation preparation costs of virtual interview training for transition-age youth with disabilities: a budget impact analysis. J. Spec. Educ. Technol. 2023 doi: 10.1177/01626434231175372. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 100.Baker-Ericzen M., Fitch M., Jenkins M., Twamley E.W., Brookman-Frazee L., Patterson T.L. San Diego State University; 2018. Social Skills Performance Assessment for Autism Spectrum and Related Social Conditions: Education & Employment Versions (Training Manual). [Google Scholar]
- 101.Mayer J.D., Salovey P., Caruso D.R. MHS Publishers; Toronto: 2002. Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT) User's Manual. [Google Scholar]
- 102.Bänziger T., Scherer K.R., Hall J.A., Rosenthal R. Introducing the MiniPONS: a short multichannel version of the profile of nonverbal sensitivity (PONS) J. Nonverbal Behav. 2011;35(3):189–204. [Google Scholar]
- 103.Wykes T., Huddy V., Cellard C., McGurk S.R., Czobor P. A meta-analysis of cognitive remediation for schizophrenia: methodology and effect sizes. Am. J. Psychiatr. 2011;168(5):472–485. doi: 10.1176/appi.ajp.2010.10060855. [DOI] [PubMed] [Google Scholar]
- 104.Freedland K.E. Demanding attention: reconsidering the role of attention control groups in behavioral intervention research. Psychosom. Med. 2013;75(2):100–102. doi: 10.1097/PSY.0b013e3182851b75. [DOI] [PubMed] [Google Scholar]
- 105.Harrison P.L., Oakland T. third ed. Western Psychological Services; 2015. Adaptive Behavior Assessment System. [Google Scholar]
- 106.Individuals with Disabilities Education Act. United States of America; 2004. [Google Scholar]