Abstract
This study reports the results of a single-arm, non-controlled, Type 3 hybrid effectiveness-implementation trial evaluating virtual reality job-interview training (VR-JIT) delivered in five pre-employment transition programs comprising 15 schools, 10 administrators, 23 teachers, and 279 youth ages 16–21 years receiving special education pre-employment transition services. Fidelity, expected implementation feasibility, and teacher and student acceptance of VR-JIT were high. Youth completed an average of 10.8 virtual interviews (SD = 7.4) over six to eight weeks. At the six-month follow-up, teachers reported that youth using VR-JIT had employment rates higher than current national employment rates for youth with disabilities. A multinomial logistic regression revealed that VR-JIT engagement was associated with a greater likelihood of employment by the six-month follow-up (OR = 1.63, p = 0.002). This study provides promising evidence that VR-JIT may be feasibly implemented with high fidelity in special education pre-employment transition services and can potentially enhance employment outcomes among transition-age youth receiving special education services.
Approximately 400,000 transition-age youth (ages 16–21 years) with disabilities who receive federally mandated transition services leave high school each year (U.S. Department of Education, 2019). As these youth enter the workforce, they face major disparities in employment compared to their same-age peers without disabilities. Specifically, 18.4% of 16–19-year-olds and 40.2% of 20–24-year-olds with disabilities are employed, compared with 31.4% and 68.5%, respectively, of non-disabled same-age peers (Bureau of Labor Statistics, 2020). To target this employment disparity, the Individuals with Disabilities Education Act (2004) mandated the inclusion of pre-employment transition services offered via secondary and post-secondary educational programs (Workforce Innovation and Opportunity Act, 2014).
Thus, the implementation of evidence-based practices is critical to supporting pre-employment transition services for youth with disabilities entering the workforce. The U.S. Department of Education houses two resources where school administrators and teachers typically access evidence-based (and other) practices that support pathways to graduation, including pre-employment transition services. The first resource is the What Works Clearinghouse (WWC) database, which includes practices that enhance employment outcomes via community-based work experiences (Cobb et al., 2013). The second resource is the National Technical Assistance Center on Transition (NTACT, 2020) website of recommended practices, which ranks practices along an ordered range from evidence-based to research-based to promising to unestablished for training in pre-employment skills, job-specific tasks, social skills at work, and community-based work experiences (e.g., Project SEARCH; Persch et al., 2015; Wehman et al., 2019).
Job interviewing is one of many pre-employment skills highlighted as a treatment target by the U.S. Department of Education (2017). However, the only practice identified by either the WWC or NTACT that targets job interview skills is a ‘promising practice’ that evaluated the efficacy of video modeling (i.e., watching a pre-recorded video of someone performing a skill and then practicing or modeling the observed behavior) to train job interview skills in 15 youth with disabilities (Hayes et al., 2015). Thus, there is a major gap in available effective practices targeting job interview skills.
Despite this gap, job interview training is a commonly offered component of transition services (Carter, Trainor, et al., 2010). Most pre-employment transition services that include a job interview component rely on mock interview role-playing to help youth receiving special education practice for job interviews (Lorenz et al., 2016; Wilczynski et al., 2013). However, as noted above, the field lacks a rigorously evaluated practice to support teachers in training students in job interviewing. Specifically, early evidence on in-person mock job interview training has been mixed regarding whether this method enhances interview skills or access to jobs (Campion & Campion, 1987; Tross & Maurer, 2008), and stakeholders with disabilities have voiced concerns about the difficulty of interviewing and the need for training (Jans et al., 2012; Sarrett, 2017). In response to this gap in evidence-based practices, novel job interview training tools have recently emerged, with small studies supporting their initial feasibility and efficacy among youth and young adults with disabilities. Most groups refined the traditional in-person role-play methods (Hutchinson et al., 2019; Lindsay et al., 2015; Morgan et al., 2014; Rosales & Whitlow, 2019), while others developed job interview training tools using the latest technologies (Burke et al., 2018; Smith et al., 2014; Strickland et al., 2013).
As technology has become more affordable and accessible, schools have begun embracing virtual learning environments (VLEs; i.e., virtual reality, computerized simulations) to facilitate education (e.g., STEM, physical education) for youth receiving special education services (Gregg et al., 2017; McMahon et al., 2019). Specifically, infusing VLEs into education takes advantage of the attraction students hold toward them, transferring that enthusiasm from entertainment into ‘infotainment’ or ‘edutainment’ (Gadelha, 2018). Using VLEs as a form of instruction increases engagement, interactivity, and motivation in students and may free up teachers to facilitate more individualized instruction or spend more time planning lessons (Politis et al., 2017; Thorsteinsson & Shavinina, 2013). Moreover, VLEs have emerged as one of the most widely affordable, accessible, and accepted options for the delivery of virtual content in schools (Bellani et al., 2011; Mikropoulos & Natsis, 2011).
Thus, several groups recently developed VLE-based job interview training tools, which have notable strengths over traditional in-person role-play methods and video modeling methods (Burke et al., 2018; Smith et al., 2014; Strickland et al., 2013). First, in-person role-playing and video modeling are resource intensive (German et al., 2018). Teachers have limited capacity, and conducting multiple mock interviews or giving feedback on video modeling is not standard practice (Gresham et al., 2001). Thus, youth receiving special education services are less likely to engage in the repeated practice needed to develop skills. Second, VLEs may be more scalable (e.g., there are more computers than teachers; Gray et al., 2010). Third, VLEs create safe and non-threatening environments for youth with disabilities to make mistakes and learn without feeling embarrassed in front of role-players (Modugumudi et al., 2013). Fourth, VLEs provide clinician-led (Burke et al., 2018; Strickland et al., 2013) or automated feedback (Smith et al., 2014). Lastly, VLEs are more acceptable to and preferred by students, compared with role-plays with teachers (Spencer et al., 2019).
One recent VLE is Virtual Reality Job Interview Training (VR-JIT, www.simmersion.com), a comprehensive internet-delivered job interview simulator. VR-JIT trainees review an e-learning curriculum to learn about several job interview skills adapted from a theoretical job interview framework (Huffcutt, 2011). Then trainees complete a job application; their responses inform an algorithm that generates interview questions tailored to an open position chosen by the trainee. From there, trainees repeatedly practice job interviews with a virtual hiring manager and receive feedback during and after their performance. Notably, VR-JIT has several strengths compared to the other VLEs. Specifically, VR-JIT feedback includes real-time non-verbal cues, automated performance assessments, and a large library of job interview questions (approximately 1000; Smith et al., 2014). In contrast, the other VLEs have greater instructor-led training requirements and equipment needs (e.g., cameras), rely on clinician-led feedback, and have limited job interview questions (approximately 15; Burke et al., 2018; Strickland et al., 2013).
Thus far, the efficacy of VR-JIT was evaluated in a series of five randomized controlled trials (RCTs) among adults with various disabilities (e.g., autism, depression, schizophrenia) in research laboratory settings. The results suggested that VR-JIT trainees improved their job interview skills, self-confidence, and employment rate as compared to control groups (e.g., Smith, Fleming, Wright, Jordan, et al., 2015; Smith, Fleming, Wright, Losh, et al., 2015; Smith et al., 2014). Moreover, the effects of VR-JIT were independently replicated in two non-controlled small feasibility trials in post-secondary transition service and university settings (Arter et al., 2018; Ward & Esposito, 2018). Thus, there are sufficient data to suggest the effects of VR-JIT may be generalizable to youth receiving special education services. In comparison, the other VLEs and job interview interventions were evaluated in single pilot trials and their generalizability is not yet known (Burke et al., 2018; Hutchinson et al., 2019; Lindsay et al., 2015; Morgan et al., 2014; Rosales & Whitlow, 2019; Strickland et al., 2013).
Given VR-JIT’s established efficacy at enhancing job interview skills and access to employment in controlled settings, a critical next step towards determining whether VR-JIT can be considered an evidence-based practice (based on the essential and desirable quality indicators [Gersten et al., 2005]) is to conduct an RCT to evaluate VR-JIT within special education pre-employment transition services. However, prior to conducting an RCT, we felt it would be prudent to first enhance our understanding of potential barriers and facilitators of VR-JIT implementation within a special education setting.
Thus, the purpose of the current study was to evaluate the feasibility and acceptability of VR-JIT implementation and its effectiveness for employment within special education pre-employment transition services using a single-arm, non-controlled Type 3 hybrid effectiveness-implementation trial design. The Type 3 hybrid design has the primary aim of determining the utility of a strategy for the implementation of an intervention, and the secondary aim of assessing real-world outcomes associated with implementation (Curran et al., 2012). Accordingly, our primary aim was to gather empirical data reflecting the efforts and challenges that arise prior to and during VR-JIT implementation within existing transition services. Our secondary aim was to evaluate the preliminary effectiveness of VR-JIT in this setting. Hybrid designs allow for exploration of the relationship between implementation factors (e.g., acceptability of VR-JIT, fidelity to the protocol) and effectiveness outcomes for trainees, which is often overlooked in a typical effectiveness study. To our knowledge, this is the first hybrid design study to evaluate VR-JIT in multiple schools delivering pre-employment transition services.
Our hypotheses and results related to VR-JIT implementation were organized using a well-established taxonomy in the field of implementation science salient to the early stages of implementation evaluation (Proctor et al., 2011). Specifically, we evaluated VR-JIT’s acceptability, usability, expected implementation feasibility, and fidelity, as well as the implementation context and delivery adaptations. Given that training providers to deliver interventions with high fidelity can be challenging (McHugh & Barlow, 2010), we carefully monitored whether fidelity to VR-JIT in this study met or exceeded minimal standards established through prior research.
We also hypothesized that youth receiving special education pre-employment transition services and VR-JIT would have an increased likelihood of obtaining employment by the six-month follow-up. Although single-arm, non-controlled designs lack a control group, we compared the observed employment rates in our study to current national rates. We also explored whether trainee engagement in VR-JIT was associated with perceived acceptability of the tool.
Methods
This study was reviewed by the University of Michigan Institutional Review Board and designated as exempt human subjects research.
Recruitment
School-level recruitment.
Our team approached the Illinois Division of Rehabilitation Services (DRS) and Michigan Rehabilitation Services (MRS) administrations with an opportunity to evaluate VR-JIT effectiveness and implementation in transition-age youth receiving federally mandated special education pre-employment transition services in secondary and postsecondary educational settings. The Illinois DRS administrators recommended partnering with the Illinois Secondary Transitional Experience Program (STEP; https://www.dhs.state.il.us/page.aspx?item=35174). Individual schools, school districts, regional educational cooperatives, and special-education therapy and learning centers are funded through contracts administered by DRS, which is part of the Illinois Department of Human Services.
The STEP network serves approximately 12,000 youth receiving secondary or postsecondary pre-employment transition services at approximately 700 schools. STEP provides classroom-based training on work readiness, self-advocacy, and disability awareness. STEP students receive DRS counseling during high school that includes help finding employment opportunities and on-the-job support and coaching. STEP students remain connected to DRS for vocational support after graduation. Non-STEP students receiving special education services also attend the classroom portion of STEP but do not have access to the additional DRS counseling supports. Following a recruitment meeting hosted by DRS, STEP coordinators representing 67 high schools (or educational cooperatives) agreed to learn more about the study before committing. After an in-depth review of the project, 44 STEP schools declined to participate, with STEP coordinators citing time commitment, competing priorities, and teachers declining participation as the reasons. The remaining 23 Illinois schools agreed to implement VR-JIT. After the school partners were trained, nine additional schools withdrew, citing the time commitment required to complete the research measures. Thus, 14 Illinois schools (n=9 public schools, n=5 public separate settings) completed the study.
The MRS administrators recommended partnering with Michigan Career and Technical Institute (MCTI; https://www.michigan.gov/mcti). MCTI is an MRS-sponsored public separate post-secondary transition program that delivers a standardized transition curriculum in which students receive vocational and technical training in 13 trades (e.g., electronics, retail). MCTI combines classroom-based pre-employment, technical, and independent living skill training with a hands-on, learning-by-doing approach. The 14 Illinois schools and MCTI completed VR-JIT implementation serving n=279 students during the 2017–2018 academic year.
Staff-level recruitment.
Administrative leaders (regional-level coordinators supervising the STEP curriculum; local special-education directors or chairs) invited all teachers (classroom teachers, teaching assistants, paraprofessionals) supporting transition students at each school to participate in the study, emphasizing that participation was voluntary and that teachers could decline to have their classes participate.
The average age of the administrative leaders (n=10) was 46.1 years (SD=11.3); 83.3% were female. They had spent an average of 10.6 years (SD=6.0) teaching and an average of 7.8 years (SD=9.8) working in transition services. All administrative leaders identified as non-Latinx Caucasian. The administrative leaders had master’s degrees (70%), bachelor’s degrees (20%), or some college (10%). Notably, five administrative leaders were also teachers who implemented VR-JIT.
The average age of the participating teachers (n=23) was 39.7 years (SD=13.0); 68.4% were female. They had spent an average of 12.6 years (SD=8.3) teaching and an average of 5.2 years (SD=3.2) working in transition services. The teachers were primarily Caucasian (95.7%), with 4.3% identifying as Latinx. The teachers had doctoral degrees (4.5%), master’s degrees (63.6%), bachelor’s degrees (27.3%), or some college (4.5%).
Student-level recruitment.
Each participating teacher in the partner school offered VR-JIT to all eligible students in pre-employment transition classes. Eligibility was defined as being age 16 to 21, receiving special education services, being enrolled in STEP, MCTI, or attending the STEP classes (i.e., non-STEP transition students), and being designated with one of the 13 disability categories according to IDEA (2004): autism; deaf-blindness; deafness; emotional disturbance; hearing impairment; intellectual disability; multiple disabilities; orthopedic impairment; other health impairment; specific learning disability; speech or language impairment; traumatic brain injury; and visual impairment (Individuals with Disabilities Education Act, 2004).
Students were eligible to participate whether they were currently employed or were not seeking jobs at the time of enrollment, as that status could change over the course of their reception of transition services, and the goal was to evaluate the real-world implementation of VR-JIT within transition services. We report the demographic, educational, and vocational characteristics for all student participants (n=279) in Table 1.
Table 1.
Participant characteristics
| Characteristics | Mean (SD) or percentage |
|---|---|
| Age | 18.6 (1.5) |
| Overall IQ^a | 77.6 (12.3) |
| Sex | |
| Male | 64.2% |
| Race | |
| Caucasian | 68.1% |
| Latinx | 15.8% |
| African American | 11.8% |
| More than one race | 4.3% |
| Grade Level | |
| Sophomore | 6.9% |
| Junior | 9.7% |
| Senior | 66.4% |
| Adult transition | 17.0% |
| Reading Level | |
| Less than 4th grade | 23.0% |
| 4th grade | 11.9% |
| 5th grade | 10.8% |
| 6th grade or higher | 54.3% |
| IDEA Category^b | |
| Specific learning disability | 36.2% |
| Other health impairment | 28.7% |
| Autism | 25.1% |
| Emotional disturbance | 19.4% |
| Intellectual disability | 14.7% |
| Speech and language disability | 7.5% |
| Educational Locale | |
| City (small) | 4.7% |
| Suburb (large) | 64.2% |
| Suburb (small) | 5.4% |
| Town (fringe) | 9.3% |
| Town (remote) | 13.3% |
| Rural (fringe) | 3.2% |
^a n=166 students had available IQ data.
^b Percentages do not add up to 100% as students may fit into more than one category.
Virtual Reality Job Interview Training
Expanding on the description in the introduction, VR-JIT is grounded in behavioral learning principles (Cooper et al., 2007) and strategies for implementing high-fidelity simulations (Motola et al., 2013). VR-JIT includes three tiers of learning: a) Tier 1 is an e-learning curriculum reviewing appropriate social behaviors to engage in before, during, and after interviews. The e-learning also includes a review of eight job interview skills adapted from the Huffcutt (2011) job interview framework (e.g., coming across as a hard worker; highlighting that you work well on a team); b) Tier 2 is an online job application that trainees practice completing for a fictional company, Wondersmart; application responses inform the interviewing algorithm; and c) Tier 3 is the virtual interview, where trainees repeatedly practice interviewing with “Molly Porter,” a virtual hiring manager (played by an actress) with easy, medium, and hard difficulty levels. Trainees select a response from a series of scripted options and speak it to Molly (see Figure 1). Trainees receive three levels of feedback: real-time nonverbal cues in response to their interview answers; a real-time transcript providing feedback on their answers to Molly’s questions; and summary feedback on the eight job interview skills reviewed in the e-learning. Each virtual interview lasts approximately 25 minutes, and trainees receive a non-normed 0–100 score after completion. Additional VR-JIT details are in Smith et al. (2014).
Figure 1.

VR-JIT Interface and Molly Porter
Procedures
VR-JIT was designed to be a scalable, easy-to-use, Internet-delivered, individualized learning experience that teachers could help students learn to use independently, with minimal supervision. The following strategies were used to support teachers to implement VR-JIT with fidelity.
Teacher training.
The research team led a mandatory 60-minute orientation session with teachers (via videoconference) on how to use VR-JIT. The research team provided each teacher with a copy of a fidelity checklist, adapted from prior VR-JIT studies (Smith et al., 2014). The research team oriented the teachers on how to use the checklist to navigate the e-learning curriculum with their students (e.g., reviewing the eight learning goals), complete an online job application, and interact with and obtain feedback from the virtual interviewer. The teachers then spent 1.5 hours interacting with VR-JIT to obtain practical knowledge of the tool, which was monitored and validated by the research team using the VR-JIT administrative portal (i.e., a website allowing administrators to monitor teacher and student engagement with VR-JIT, such as virtual interview scores and minutes spent with the virtual interviewer). After the videoconference, teachers completed two role-plays with a peer to practice using the fidelity checklist (i.e., one role-play as the teacher and one as the student). After the role-plays, teachers were asked to confirm their readiness to implement VR-JIT by informing their school-level supervisor. Teachers were encouraged to complete additional role-plays if they reported a lack of readiness; however, all teachers reported feeling prepared to deliver VR-JIT. School-level supervisors reported role-play completions and teacher readiness to the research team.
VR-JIT implementation fidelity.
Teachers (n=26; including n=4 leaders who also served as teachers) self-reported their fidelity at implementing VR-JIT. First, teachers consulted the checklist to identify the order in which to teach each aspect of VR-JIT. Second, the teachers taught students to use VR-JIT in the order presented on the checklist. Third, the teachers checked the box on the checklist reflecting each aspect of VR-JIT that they taught (of note, n=1 teacher and n=1 leader also serving as a teacher did not return a completed fidelity checklist). Teachers either completed an online version of the checklist during the orientation with students or scanned the printed version of their checklist and emailed it to the research team. The research team reviewed the 36 checklists submitted: 31 of 36 (86.1%) were completed with high fidelity (at least 90% of boxes checked), two with moderate fidelity (80–89% of boxes checked), and three with poor fidelity (less than 75% of boxes checked). Two teachers with poor-fidelity results were retrained, and their subsequent checklists achieved greater than 90% fidelity. One teacher with poor fidelity chose not to be retrained and did not orient additional students to VR-JIT (but continued to support students who had been trained).
Recommended VR-JIT curriculum.
In the efficacy trials of VR-JIT, trainees completed an average of 15 virtual interviews, which was associated with improved interview performance and increased access to jobs (Smith et al., 2014). Thus, we recommended that our school partners encourage students to complete this same number of virtual interviews while the teachers monitored the students’ progress through the easy, medium, and hard levels of difficulty. The teachers monitored whether students improved their scores and mastered the easy interviews before progressing to medium (and then to hard) interviews. Based on a real-world effectiveness study of VR-JIT among adults with a range of disabilities (Smith et al., 2019), we recommended that students complete approximately three 45- to 60-minute VR-JIT sessions per week over four to six weeks to complete the targeted 15 interviews. The efficacy studies of VR-JIT did not provide insight into the optimal amount of time trainees should spend on the e-learning and job-application completion. Thus, we advised teachers to focus at least one session on reviewing the e-learning curriculum, and then we naturalistically observed the degree to which students engaged with this material via minutes spent on the e-learning web pages.
We encouraged teachers to develop their own strategies if they found that the recommended delivery plan was not feasible. Specifically, we encouraged them to adapt the recommended delivery plan to fit the context of their everyday teaching duties and asked them to note and then formally report these adaptations.
Study Measures
All teacher- and leader-focused research data were captured via electronic surveys. The surveys for leaders and teachers were sent via REDCap (Harris et al., 2009), an online data capture tool compliant with the Family Educational Rights and Privacy Act of 1974. For surveys requiring student completion (i.e., VR-JIT acceptability and usability), the research team sent electronic survey links to teachers via an online data-capture system (Qualtrics, 2005), who then forwarded the links to students. No personal identifying information about students was collected. We used the National Center for Education Statistics (NCES) locale framework to group schools based on population size and U.S. Census Bureau definitions. We verified school names and addresses and then entered the addresses into the NCES Search for Public Schools database to determine the locale subtype for each school using 2016–2018 school-year data. The NCES locale subtypes include City (Large, Midsize, Small); Suburb (Large, Midsize, Small); Town (Fringe, Distant, Remote); and Rural (Fringe, Distant, Remote).
Process measures.
VR-JIT automatically captures the total number of completed virtual interviews (at each difficulty level), the overall highest score attained (range: 0 to 100), the amount of time (in minutes) trainees spend talking with the virtual interviewer, and the amount of time (in minutes) engaged in the e-learning curriculum. Teachers could review the students’ progress through their administrative access on the SIMmersion website, and the research team sent progress reports to the teachers. To reduce the number of contrasts and reduce measurement error, we used principal-components factor analysis with these four variables to create a composite variable of “engagement with VR-JIT.” Results of the principal-components factor analysis (no rotation) indicated a good fit to a one-factor solution (all component loadings ranged between 0.457 and 0.924). Intercorrelations among these four variables supported the creation of a composite, with Pearson’s r ranging from 0.23 to 0.86 (all p < 0.001, except the total score and e-learning minutes, r = 0.10, p = 0.089). A table of intercorrelations between all variables used for this article is available from the first author by request. Factor scores computed by SPSS 26.0 reflect “VR-JIT engagement.”
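For readers who wish to reproduce a composite like this outside SPSS, the procedure can be sketched as follows. This is a minimal illustration, not the authors' analysis code: the data here are simulated, and scikit-learn's principal-component scores are computed slightly differently from SPSS regression-method factor scores, though both yield a single standardized engagement dimension.

```python
# Sketch of a one-component "engagement" composite from four process
# variables (interviews completed, highest score, interviewer minutes,
# e-learning minutes). Data are simulated for illustration only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(279, 4))  # placeholder for the four process variables

# Standardize, then extract a single unrotated principal component,
# mirroring the one-factor solution described in the text.
Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=1)
engagement = pca.fit_transform(Z).ravel()  # one composite score per student

# Loadings (correlations of each variable with the component)
loadings = pca.components_[0] * np.sqrt(pca.explained_variance_[0])
```

With real data, `loadings` would correspond to the component values reported above, and `engagement` would serve as the predictor in the subsequent regression models.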
Implementation evaluation measures.
Our implementation evaluation involved measuring core determinants, processes, and outcomes germane to hybrid Type 3 effectiveness-implementation trials within the Proctor et al. (2011) implementation research outcome domains taxonomy. Specifically, we evaluated the implementer perspective of VR-JIT orientation and initial training acceptability, VR-JIT appropriateness and expected implementation feasibility, and post-implementation VR-JIT acceptability and sustainability. We also assessed prospective delivery and teacher context and adaptation based on Stirman and colleagues’ adaptation coding taxonomy (Stirman et al., 2017; Stirman et al., 2013). Lastly, we adapted the treatment acceptability rating form (Reimers & Wacker, 1988) to evaluate student-level acceptability of VR-JIT, and adapted the system usability scale (Brooke, 1986) to evaluate student-level usability of VR-JIT.
Prospective delivery.
Although we recommended that students complete approximately three 45- to 60-minute VR-JIT sessions per week over the course of four to six weeks, we supported each transition program in adapting the a priori plan to achieve this target. We developed a prospective delivery survey for leaders to report on where, when, how, and how often they planned to deliver VR-JIT. The survey items were developed using the Stirman adaptation coding taxonomy (Stirman et al., 2017; Stirman et al., 2013). The survey included seven items (e.g., “Where will teachers deliver VR-JIT?” “When will teachers deliver VR-JIT?” “How many interviews do you expect students to complete each week?”).
Teacher context and adaptation.
Our teacher context and adaptation survey used the Stirman adaptation coding taxonomy (Stirman et al., 2017; Stirman et al., 2013) for teachers to report on the context of delivery and the strategies actually used during the delivery. The teachers completed this survey after the first two weeks of VR-JIT implementation and after completing VR-JIT implementation. The survey included six items evaluating the delivery context (e.g., “What level of assistance did students need to complete the training?” “What other transition services did your students receive?”). The survey included four items evaluating adaptation (e.g., “Where was VR-JIT delivered?” “When was VR-JIT delivered?” “How was VR-JIT delivered?”).
VR-JIT orientation acceptability.
This survey evaluated the acceptability of the orientation we provided to train teachers and leaders to orient students on how to use VR-JIT. The survey included seven items rated on a scale from 0 (not at all) to 4 (very). Sample items reflected satisfaction with the orientation; satisfaction with the opportunity to practice VR-JIT; feeling prepared to teach VR-JIT to students; and acceptability of the orientation material. Internal consistency was high (α=0.94).
VR-JIT appropriateness and expected implementation feasibility.
We evaluated the extent to which teachers and leaders perceived VR-JIT as an appropriate tool for inclusion in transition services prior to implementation. The survey included five items rated on a scale from 0 (not at all) to 4 (very) that were summed for a total score. Sample items asked “How well do you think VR-JIT fits with students’ goals for job training?” “How likely do you think it is that students will be engaged in VR-JIT?”. Internal consistency was high (α=0.83). We evaluated the confidence teachers and leaders had in the expected feasibility of implementing VR-JIT, using the total score from nine items rated on a scale from 0 (not at all) to 4 (very). Sample items asked “How confident are you that you will deliver VR-JIT with fidelity and effectiveness?” “How confident are you that you will be able to support students to use VR-JIT after training them?”. Internal consistency was high (α=0.84).
VR-JIT acceptability and sustainability (post-implementation).
We evaluated teacher-level acceptability of VR-JIT as an intervention to be delivered within pre-employment transition services. This survey included 10 items on a scale from 0 (not at all) to 3 (very) to assess acceptability (e.g., satisfaction with VR-JIT as a service, satisfaction with technology support, acceptability of VR-JIT content, how VR-JIT fit with transition services). Internal consistency was high (α=0.83). Administrators and teachers evaluated the sustainability of delivering VR-JIT with three items on a scale from 0 (not at all) to 3 (very). Sample items reflected motivation to continue VR-JIT and the school being equipped to continue VR-JIT. Internal consistency was acceptable (α=0.68). We computed total scores for each scale.
We evaluated student-level acceptability using the total score from a five-item self-report scale from 1 (e.g., very unenjoyable) to 5 (e.g., very enjoyable). Sample items asked “How enjoyable was the VR-JIT training?” and “How helpful was the VR-JIT tool in preparing you for an interview?” Internal consistency was acceptable (α=0.73). We evaluated student-level usability using the total score from a seven-item self-report scale from 1 (not at all) to 4 (very much). Sample items included “Was it easy for you to pay attention when learning VR-JIT?” and “Do you think you are good at using VR-JIT?” Internal consistency was acceptable (α=0.74).
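The internal-consistency values reported for these measures are Cronbach's alpha. As a minimal sketch of how alpha is computed from an item-by-respondent score matrix (the ratings below are illustrative, not study data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)    # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative ratings on a 0-4 scale (5 respondents x 3 items).
scores = np.array([[3, 3, 2],
                   [4, 4, 4],
                   [1, 2, 1],
                   [2, 2, 3],
                   [4, 3, 4]])
print(round(cronbach_alpha(scores), 2))  # consistent items yield alpha near 0.9
```

Scales whose items covary strongly relative to their individual variances yield alpha values near 1, which is the pattern behind the α values of 0.68–0.94 reported here.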
Effectiveness measures.
We designed a survey in which teachers used all available student records to capture data related to the students’ demographics, cognitive ability (IQ, reading level), and employment history (at baseline and at three- and six-month follow-ups). The IQs reported by the teachers were generated via the Wechsler Intelligence Scale for Children V (Wechsler, 2014) or the Woodcock Johnson IV Tests of Cognitive Abilities (Schrank et al., 2014). The teachers reported whether students were currently employed (at baseline) or whether they sustained unemployment, became unemployed, obtained new jobs, maintained their jobs, or lost their jobs since the prior assessment period (at the three- and six-month follow-ups).
“Employment” or “a job” reflected a paid position in the community that was not set aside for someone with a disability (i.e., competitive, integrated employment). We used these data to code the primary outcome variable as: “0” for youth who either remained unemployed between baseline and follow-up or were employed at baseline, then became unemployed and remained unemployed at follow-up; “1” for youth (who were either unemployed or employed at baseline) who obtained new jobs between baseline and follow-up; and “2” for youth who were employed at baseline and maintained that employment through follow-up. Teachers completed three- and six-month follow-ups for 250 youth; 23 youth had a three-month follow-up only; one youth had a six-month follow-up only; and four youth did not have follow-up data. Thus, we obtained follow-up data on 275 of 279 youth (98.6%). For youth whose teachers reported only the three-month follow-up, we carried their outcomes forward as their final outcomes at the six-month follow-up.
Data Analysis
Given the variability in transition program type (STEP vs. non-STEP vs. MCTI), we conducted design-effect analyses on our employment-outcome variable to determine the amount of variance attributable to nesting within school locale and transition programming type. When significant, this variation can affect standard-error estimates and require multilevel analytic approaches. Muthen and Satorra (1995) specify that design-effect statistics at or below 2.0 indicate nonsignificant variation attributable to the nested data structure and therefore do not require multilevel analyses. Both school locale and program type had design-effect estimates below 2.0. Thus, we did not model the multilevel nature of the study design in our analyses.
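The design-effect heuristic can be written out directly. In the sketch below, the average cluster size and intraclass correlation (ICC) are hypothetical illustration values, not quantities reported in this study:

```python
def design_effect(avg_cluster_size: float, icc: float) -> float:
    """Approximate design effect: DEFF = 1 + (avg cluster size - 1) * ICC
    (Muthen & Satorra, 1995). Values <= 2.0 suggest single-level models suffice."""
    return 1 + (avg_cluster_size - 1) * icc

# Hypothetical example: 279 youth nested in 15 schools, ICC assumed to be 0.05.
deff = design_effect(279 / 15, icc=0.05)
print(round(deff, 2))  # 1.88 < 2.0, so no multilevel adjustment would be needed
```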
We used descriptive analyses to characterize the process outcomes (i.e., VR-JIT performance), demographic and cognitive characteristics, and youths’ employment history. To evaluate our implementation outcomes, we report descriptive statistics (i.e., means, standard deviations, ranges) for VR-JIT implementation (prospective delivery; teacher context and adaptation), acceptability, expected implementation feasibility, and sustainability. We also conducted paired-sample t-tests to evaluate whether implementation strategies differed between the delivery midpoint and endpoint.
To analyze VR-JIT effectiveness, we conducted a multinomial logistic regression evaluating students’ new employment (compared to unemployment) and sustained employment (compared to unemployment). The model focused on VR-JIT engagement as the primary independent variable and included biological sex, overall IQ, and grade level as covariates, as they are known contributors to employment (Power et al., 2008; Southward & Kyzar, 2017; Wehmeyer & Palmer, 2003). We evaluated the regression model for multicollinearity; all variance-inflation factors were below 2.0. Exploratory associations were examined using Pearson’s correlations with two-tailed tests.
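The multicollinearity check can be sketched by computing variance-inflation factors directly: each predictor is regressed on the remaining predictors, and VIF = 1/(1 − R²). The predictor values below are simulated for illustration, not study data:

```python
import numpy as np

def vif(X: np.ndarray) -> np.ndarray:
    """Variance-inflation factors for the columns of X:
    VIF_j = 1 / (1 - R^2_j), where R^2_j comes from regressing
    predictor j on the remaining predictors (with an intercept)."""
    n, p = X.shape
    out = np.empty(p)
    for j in range(p):
        y = X[:, j]
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1 - resid.var() / y.var()   # R^2 of the auxiliary regression
        out[j] = 1 / (1 - r2)
    return out

# Simulated predictors standing in for engagement, IQ, and sex (not study data).
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([rng.normal(size=n),
                     rng.normal(100, 15, size=n),
                     rng.integers(0, 2, size=n).astype(float)])
print(vif(X))  # near 1.0 for independent predictors; values below 2.0 raise no concern
```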
We observed that 39.1% of the sample was missing overall IQ data. To control for overall IQ (a known significant predictor of employment), we imputed data using the expectation-maximization (EM) algorithm (Dempster et al., 1977), a maximum likelihood estimation method that generates unbiased estimates when data are missing completely at random (MCAR). Using Little’s (1988) MCAR test, we observed that the IQ data were MCAR, χ2(7)=8.75, p=0.27, so the imputation of the missing data did not introduce bias into the analyses.
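The E-step of the EM algorithm replaces each missing value with its conditional mean given the observed variables. A deliberately simplified, single-pass sketch of that idea (the variable names and values are hypothetical toy data; the full EM algorithm also updates variance estimates and iterates to convergence):

```python
import numpy as np

# Toy data: one auxiliary predictor fully observed, one outcome with a missing value.
grade = np.array([1.0, 2.0, 3.0, 4.0])     # hypothetical auxiliary variable
iq    = np.array([2.0, 4.0, 6.0, np.nan])  # toy scores with one missing entry

obs = ~np.isnan(iq)
# Fit a line on the complete cases (M-step analogue), then fill each
# missing value with its conditional mean (E-step analogue).
slope, intercept = np.polyfit(grade[obs], iq[obs], deg=1)
iq_imputed = np.where(obs, iq, intercept + slope * grade)
print(iq_imputed)  # the missing entry becomes 8.0 on this collinear toy data
```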
Results
VR-JIT Process Outcomes
We observed heterogeneity in students’ engagement with VR-JIT. Although all 279 students completed at least one virtual interview, 26.2% (n=73) completed 1 to 5 virtual interviews; 43.0% (n=120) completed 6 to 14 virtual interviews; and 30.8% (n=86) completed the recommended 15 virtual interviews. The mean number of completed interviews was 10.8 (SD=7.4; range: 1 to 37), with a mean completion score of 77.5 (SD=14.1 out of 100 points) and a mean high score of 90.6 (SD=11.8). Students spent a mean of 22.1 (SD=25.3) minutes using the e-learning curriculum (range: 0 to 194 minutes) and a mean of 198.7 (SD=130.6) minutes completing virtual interviews (range: 14 to 676 minutes).
VR-JIT Implementation Outcomes
VR-JIT orientation acceptability, appropriateness, and expected implementation feasibility (pre-implementation).
Administrative leaders (n=9) and teachers (n=41, including n=12 teachers or support staff who completed orientation but did not lead VR-JIT implementation) reported that the VR-JIT orientation was acceptable (M=21.56, SD=4.82; range: 0–28) and that VR-JIT was appropriate for transition services (M=16.02, SD=2.60; range: 0–20). In addition, teachers and administrative leaders expected that implementation of VR-JIT would be feasible in their programs (M=24.55, SD=4.62; range: 0–36). One teacher who served 36 students had an outlying expected-feasibility total score of 2 and was removed from this analysis. Ranges reflect possible scores.
Prospective implementation.
Administrative leaders (n=10) reported that the planned primary implementation location was the school, with 40% of leaders reporting that there would be no secondary implementation location, 30% reporting that a secondary implementation location would be an external job-training site or public place (such as a library), and 30% reporting that a secondary implementation location would be the student’s home. Within the school setting, 70% of leaders planned to implement VR-JIT within the transition classroom, 20% during study hall, and 10% during homeroom. Within the classroom setting, 90% of leaders planned to implement VR-JIT in a group setting where students had their own computing devices (e.g., tablet, laptop, desktop), while 10% planned to do so individually in a private or semiprivate setting. Leaders noted that a planned secondary strategy was to implement VR-JIT with students individually in a private or semiprivate room (50%) or with students in a group setting with a single device (30%), while 20% had no secondary delivery strategy. Finally, 90% of leaders: 1) expected teachers to adhere to this strategy to a large or very large degree; 2) supported teachers having occasional-to-moderate freedom to adapt this strategy to ease delivery; and 3) expected students to complete 1 to 4 virtual interviews per week, as suggested by the research team.
Teacher context and adaptation.
Overall, 26 teachers (including 4 administrative leaders who served as teachers) supported a mean of 13.56 (SD=8.11) students using VR-JIT. Teachers reported on the delivery context and implemented adaptations at the midpoint of training and again at the endpoint of training. In addition, one teacher taught two cohorts of students with differing implementation strategies and completed the surveys for each cohort (of note, n=1 teacher and n=1 administrative leader serving as a teacher did not complete these surveys). Paired-sample t-tests revealed no statistical differences between delivery strategies at midpoint and endpoint (all p>0.10). Thus, we present means averaged across midpoint and endpoint; as a result, percentages within categories may not sum to 100%. Teachers reported that 24.1% of students needed no guidance when using VR-JIT, 50.8% of students needed a little or some guidance, and 20.7% of students needed a lot of guidance (e.g., discussing the feedback on the transcripts or why a response received a negative reaction from the virtual coach). Teachers also reported that 86.8% of students used VR-JIT at school, while 11.3% used VR-JIT at home, at a job placement, or in another setting.
Within schools, teachers reported that 80.5% of students used VR-JIT during transition class, and 16.7% of students used VR-JIT during homeroom, study hall, after-school programming, or free periods. Further, teachers reported that 72.5% of students used VR-JIT in group settings with their own devices, 22.3% used VR-JIT in private or semiprivate rooms with their own devices, and 3.7% used VR-JIT in group settings with a single device. Teachers reported that 52% of students completed one to four virtual interviews per week, 39.6% of students completed five or more virtual interviews per week, and 4.9% of students completed fewer than one interview per week. Teachers reported that most students were receiving some level of typical transition services concurrently with VR-JIT. Specifically, 71.2% of students were working on job-skill development, 58.6% were working on resumes, 40.2% were mock-interviewing with teachers, and 12.3% were mock-interviewing with community employers.
VR-JIT acceptability and sustainability (post-implementation).
Post-implementation, administrative leaders (n=10) and teachers (n=21) reported that VR-JIT was highly acceptable (M=25.64, SD=3.86; range: 0–30). In addition, administrative leaders (n=9) and teachers (n=15) reported that VR-JIT implementation would be sustainable (M=7.79, SD=1.35; range: 0–9). Students (n=115) reported that VR-JIT was acceptable (M=19.00, SD=3.26; range: 5–25) and usable (M=19.48, SD=3.77; range: 7–28).
VR-JIT Effectiveness Outcomes
We observed that 133 students (48.4%) remained unemployed between baseline and follow-up, 90 students (32.7%) obtained new jobs between baseline and follow-up, and 52 students (18.9%) sustained jobs between baseline and follow-up. Among unemployed students, 23 (17.3%) obtained either a paid or unpaid internship by the six-month follow-up.
For the multinomial logistic regression (see Table 2), the likelihood ratio test for model fit was significant (χ2 (12)=47.1, p<0.001; Nagelkerke R2 =0.180). The results suggested that compared to students who were unemployed at follow-up, those who obtained competitive, integrated jobs were more engaged in VR-JIT (OR=1.63, p=0.002) and more likely to have higher overall IQs (OR=1.06, p<0.001). Sophomores (OR=0.07, p=0.001), juniors (OR=0.16, p=0.007), and seniors (OR=0.42, p=0.035) were less likely to obtain jobs than ‘super seniors’ who had completed their senior year and were engaged in pre-diploma transition services.
Table 2.
Multinomial Logistic Regression Results
| Outcome variable | Predictor variable | B | SE | Odds ratio | 95% CI (lower) | 95% CI (upper) |
|---|---|---|---|---|---|---|
| Obtained new employment by six-month follow-up | VR-JIT engagement | 0.49 | 0.16 | 1.63** | 1.19 | 2.22 |
| | Overall IQ | 0.05 | 0.01 | 1.06*** | 1.03 | 1.08 |
| | Sex^a | 0.22 | 0.31 | 1.24 | 0.68 | 2.27 |
| | Sophomore^b | −2.74 | 0.86 | 0.07*** | 0.01 | 0.35 |
| | Junior^b | −1.82 | 0.68 | 0.16** | 0.04 | 0.61 |
| | Senior^b | −0.87 | 0.41 | 0.42* | 0.19 | 0.94 |
| Sustained baseline employment through six-month follow-up | VR-JIT engagement | −0.01 | 0.19 | 0.99 | 0.69 | 1.44 |
| | Overall IQ | 0.05 | 0.02 | 1.05** | 1.02 | 1.08 |
| | Sex^a | 0.30 | 0.36 | 1.35 | 0.67 | 2.73 |
| | Sophomore^b | −2.04 | 0.89 | 0.13** | 0.02 | 0.74 |
| | Junior^b | −0.76 | 0.64 | 0.47 | 0.13 | 1.63 |
| | Senior^b | −0.54 | 0.48 | 0.58 | 0.23 | 1.48 |
Note. VR-JIT = virtual reality job interview training.
^a Females as reference group.
^b Students receiving pre-diploma transition services after completing senior year as reference group.
Significance: *p < 0.05; **p < 0.01; ***p < 0.001.
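The odds ratios and Wald 95% confidence intervals in Table 2 can be reconstructed from the reported B and SE values, shown here for VR-JIT engagement in the new-employment outcome:

```python
import math

B, SE = 0.49, 0.16            # coefficient and standard error from Table 2
odds_ratio = math.exp(B)      # OR = exp(B)
ci_lower = math.exp(B - 1.96 * SE)
ci_upper = math.exp(B + 1.96 * SE)
print(round(odds_ratio, 2), round(ci_lower, 2), round(ci_upper, 2))
# 1.63, 1.19, 2.23 -- matching the table up to rounding of the reported B and SE
```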
We observed that students who sustained jobs from baseline were more likely to have higher overall IQs (OR=1.05, p=0.002) and less likely to be sophomores (OR=0.13, p=0.021). Engaging in VR-JIT, biological sex, and being a junior or senior were not related to the likelihood of sustaining jobs (all p >0.10).
Post Hoc Analyses and Results
Although school partners implemented VR-JIT for all their transition students, n=82 youth were not actively seeking jobs. We used chi-square analyses to compare employment rates between youth who were unemployed at baseline and seeking jobs and youth who were unemployed at baseline and not seeking jobs. We observed that 45.8% of youth who were unemployed and seeking jobs at baseline had obtained jobs by the six-month follow-up, compared to 22.0% of youth who were unemployed and not seeking jobs at baseline (χ2(1)=11.53, p<0.001).
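The chi-square comparison can be sketched as follows. The cell counts are hypothetical values chosen only to reproduce the reported proportions (45.8% and 22.0%); the study's exact counts are not reported here:

```python
import numpy as np

# Rows: seeking vs. not seeking jobs at baseline (all unemployed at baseline).
# Columns: obtained a job vs. still unemployed by six-month follow-up.
table = np.array([[55, 65],    # 55/120 = 45.8% obtained jobs
                  [18, 64]])   # 18/82  = 22.0% obtained jobs

row_totals = table.sum(axis=1, keepdims=True)
col_totals = table.sum(axis=0, keepdims=True)
expected = row_totals @ col_totals / table.sum()   # expected counts under independence
chi2 = ((table - expected) ** 2 / expected).sum()  # Pearson chi-square, df = 1
print(round(chi2, 2))
# ~12.0 with these hypothetical counts; the published chi2(1)=11.53
# implies slightly different cell counts (or a continuity correction)
```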
We then evaluated the relationship between students’ VR-JIT performance and their post-implementation assessments of VR-JIT acceptability and usability. Pearson correlations revealed that the latent variable representing VR-JIT engagement (total number of virtual interviews completed, high score, number of minutes with e-learning, and number of minutes talking to Molly) was significantly correlated with the total scores representing VR-JIT acceptability (r=0.23, p=0.014) and usability (r=0.19, p=0.041).
Discussion
This Type 3 hybrid effectiveness-implementation trial aimed to evaluate the effects of VR-JIT on employment among students in STEP programs. The primary focus on VR-JIT implementation reflects the multitude of challenges faced when translating research evidence to more “real-world” contexts. We observed that teachers and administrators found the implementation of VR-JIT feasible and appropriate; that training for teachers was acceptable; that teachers implemented VR-JIT with fidelity and minimal adaptation; and that teachers reported that VR-JIT was acceptable and sustainable. Most program administrators planned to implement VR-JIT in schools during transition class time, with students using their own devices to engage with the tool. Thus, our guided yet flexible approach to implementation is consistent with current perspectives from the field of implementation science: adaptation to the way interventions are implemented is not only inevitable but a necessary process for sustainability (Chambers et al., 2013). This does not, however, suggest that implementation occurs without guidance or parameters, but that flexibility of implementation is afforded prospectively to aspects of delivery that are not expected to impact the core functions of the intervention responsible for its observed effects in prior research (Perez Jolles et al., 2019). The methods by which delivery occurs are allowed to vary, but the functions that must be achieved to ensure fidelity are maintained. Thus, the observed relationship between VR-JIT engagement and employment in this study attests to VR-JIT having maintained its core functions.
Based on the fidelity checklists they completed and submitted to the research team for review, teachers primarily used the tool with fidelity as specified a priori. The teachers implemented VR-JIT with minimal adaptations: approximately 10% more students than anticipated used VR-JIT during transition class time, and approximately 12% fewer students than anticipated used VR-JIT in group settings with their own devices, instead using VR-JIT by themselves in private settings. Teachers and leaders reported that they found VR-JIT to be an acceptable and appropriate tool for transition services, and that the use of VR-JIT would likely be sustainable within their programs. Notably, teachers and administrators reported that the ongoing financial costs of VR-JIT (55.6%) and training to deliver VR-JIT (44.4%) might be barriers to sustainability. However, 77.8% of teachers and administrators reported that future implementation of VR-JIT would be a priority. Despite these noted challenges to sustaining VR-JIT, teachers and administrators wanted to continue providing this intervention to transition-age youth receiving special education services and expected they could sustain its use over time. Presumably teachers and administrators perceived that VR-JIT benefited both students and the school. However, additional evaluation of the precise nature of these perceived benefits is needed to understand the multilevel considerations in the decision to sustain, which is a complex process for delivery systems such as schools. To date, VR-JIT sustainability has been understudied, but efforts are ongoing to improve measurement and planning tools (Calhoun et al., 2014; Palinkas et al., 2019).
We observed that 32.7% of transition-age youth receiving special education pre-employment transition services who engaged with VR-JIT were employed by the six-month follow-up (compared to a national rate of 18.4% among youth with disabilities; Bureau of Labor Statistics, 2020). We also observed that both youth seeking jobs at study entry and youth not seeking jobs at study entry had higher employment rates after using VR-JIT (45.8% and 22.0%, respectively) than the national average. Moreover, transition-age youth who engaged with VR-JIT were 1.63 times more likely to be employed by the six-month follow-up after controlling for factors related to employment such as biological sex, IQ, and grade level (Carter, Ditchman, et al., 2010; Newman et al., 2011; Power et al., 2008; Southward & Kyzar, 2017; Wehmeyer & Palmer, 2003). These initial results suggest that VR-JIT engagement explains significant variation in employment outcomes for youth receiving special education pre-employment transition services. However, we emphasize caution when interpreting the results, as we could not compare employment rates between youth in our study and youth who did not use VR-JIT in similar settings.
Additionally, students reported VR-JIT to be acceptable and usable and those with stronger metrics of VR-JIT engagement reported higher levels of acceptability and usability for the tool. Importantly, the association between VR-JIT engagement and acceptability observed in our sample will be critical for successfully implementing VR-JIT in future settings as intervention acceptability by end users significantly predicts intervention effectiveness (Elliott, 2017).
Implications and Future Directions
This study is forward-thinking in its use of a Type 3 hybrid effectiveness-implementation design and focus on the implementation processes and outcomes of delivering VR-JIT in schools for transition age youth in need of employment. Specifically, we observed VR-JIT implementation feasibility, and teacher and student acceptability, which suggest there is strong potential for uptake of VR-JIT as a means of enhancing transition services. Of particular importance is the potential instructional relief that VR-JIT could provide teachers who transition from one-on-one role-play models to using VR-JIT. The prevalence of technology in schools in the U.S. suggests VR-JIT could be a potentially cost-effective solution for schools (Gray et al., 2010). Also, our findings suggest that providing prospective guidance on acceptable means of implementing VR-JIT while also allowing for adaptations to the delivery approach may have helped enhance employment outcomes for students and was highly acceptable, feasible, and sustainable. Moreover, we observed these facilitating factors of future implementation despite the presence of the considerations of costs and time needed to train school staff that are challenges for future scale up and sustainability.
Although this initial study suggests that VR-JIT may be an effective tool that is feasible to implement with strong support for sustainability, there are critical areas for future research. First, the current study was intentionally designed as a non-controlled, non-randomized study to meet the needs of our school partners, who suggested that a controlled trial may not be feasible to conduct and that withholding VR-JIT from their students could be unethical given its established efficacy. That said, the results suggest VR-JIT may have incidence validity (i.e., the potential to impact large numbers of people) given its potential for scalability, and impact validity (i.e., the potential for serious and enduring consequences) given the potential effects on employment. Both validity types are critical requirements of evidence-based practice in special education and support further evaluation of VR-JIT in a future randomized controlled trial to validate its effectiveness (Gersten et al., 2005). Second, although we observed that VR-JIT engagement predicted greater employment by the six-month follow-up, the potential of a VR-JIT dose response must be evaluated to help clarify whether there is an optimal dose associated with improved employment. Third, future research is needed to evaluate the differential effects that VR-JIT may have on employment across the individual IDEA categories. Fourth, although VR-JIT may be effective and have strong implications for implementation, future research is needed to evaluate whether schools can deliver it in a cost-effective manner. Lastly, the potential for expanding the use of VR-JIT to the district, regional, or state level requires additional research, likely with a primary focus on the economic model needed to sustain delivery.
Limitations
Although there is evidence that VR-JIT may help enhance pre-employment transition services, study limitations must first be discussed. First, we evaluated the presence of transition services delivered concurrently with VR-JIT at the teacher level. Although our design-effect analysis suggests that school and student type (e.g., STEP vs. non-STEP) do not account for the observed differences in delivery of transition services, future studies could explore the relative contribution of these specific transition-service components. Second, although the race and ethnicity of our youth represented the demography of Illinois and Michigan, future studies of VR-JIT could be strengthened by recruiting a larger sample of youth from underrepresented communities. Third, participating youth were primarily from rural, town, and suburban schools, so our findings have limited generalizability to schools and youth in large city locales. Fourth, approximately 70% of the schools we approached declined to participate (or did not respond to recruitment solicitations). The schools that declined to participate cited competing priorities and potential teacher burden (completing research documentation) as reasons for declining. Thus, our sample has limited generalizability to schools that may be under-resourced to participate in a large-scale evaluation. Fifth, teachers self-reported their own fidelity checklists, although ideally this checklist would be completed by an independent observer. Sixth, the 18.4% employment rate for youth with disabilities reported by the Bureau of Labor Statistics (2020) relies on parent- or self-identification of a physical, mental, or emotional condition and may not represent all IDEA categories. Lastly, we did not evaluate the quantity of in-person interview role-plays youth completed with teachers and cannot evaluate this as a covariate in our statistical models.
Anecdotally, teachers reported that students completed 0 to 2 job interview role-plays during their pre-employment training. Also, teachers did not receive standardized training on how to conduct job interview role-plays.
Conclusions
This study provides promising evidence that VR-JIT may help enhance the effects of transition services at increasing employment and can be feasibly implemented in school settings with minimal adaptations. Several results from this study suggest that VR-JIT is emerging as a potentially effective, readily scalable, feasibly delivered, and sustainable tool that is highly acceptable to administrators, teachers, and students. However, we temper our enthusiasm by recognizing that this trial was not controlled; a randomized, controlled trial will be needed to validate the effectiveness of VR-JIT. Future studies will need to consider the costs of VR-JIT implementation and long-term sustainability.
Acknowledgements:
This study was funded by the Kessler Foundation (1003-1958-SEG-FY2016, PI: Matthew Smith). Marc Atkins was supported by the National Center for Advancing Translational Sciences, National Institutes of Health, through Grant UL1TR002003. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
Footnotes
Disclosures:
Dr. Matthew Smith will receive royalties on sales of an adapted, unpublished (at the time of this submission) version of virtual reality job interview training that will focus on meeting the needs of transition-age youth with autism spectrum disorders. Dr. Smith’s research on the effectiveness of the adapted version of VR-JIT is independent of the data reported in this manuscript that reports on the original version of VR-JIT. No other authors report any conflicts of interest.
References
- Arter P, Brown T, Law M, Barna J, Fruehan A, & Fidiam R (2018). Virtual Reality: Improving Interviewing Skills in Individuals with Autism Spectrum Disorder. In Langran E & Borup J (Eds.), Proceedings of Society for Information Technology & Teacher Education International Conference (pp. 1086–1088). Washington, D.C., United States: Association for the Advancement of Computing in Education (AACE). https://www.learntechlib.org/primary/p/182660/
- Baker-Ericzen MJ, Brookman-Frazee L, & Brodkin ES (2018). Accelerating research on treatment and services for transition age youth and adults on the autism spectrum. Autism, 22(1), 2–5. 10.1177/1362361317738646
- Bellani M, Fornasari L, Chittaro L, & Brambilla P (2011). Virtual reality in autism: state of the art. Epidemiol Psychiatr Sci, 20(3), 235–238. 10.1017/s2045796011000448
- Brooke J (1986). System Usability Scale (SUS): a “quick and dirty” usability scale. In Jordan PW, Thomas BA, & Weerdmeester AL (Eds.), Usability Evaluation in Industry. Taylor and Francis.
- Bureau of Labor Statistics (2020). Persons with a disability: Labor force characteristics 2019 (USDL-20–0339). Washington, D.C. Retrieved from https://www.bls.gov/news.release/pdf/disabl.pdf
- Burke SL, Bresnahan T, Li T, Epnere K, Rizzo A, Partin M, Ahlness RM, & Trimmer M (2018). Using Virtual Interactive Training Agents (ViTA) with Adults with Autism and Other Developmental Disabilities. J Autism Dev Disord, 48(3), 905–912. 10.1007/s10803-017-3374-z
- Calhoun A, Mainor A, Moreland-Russell S, Maier RC, Brossart L, & Luke DA (2014). Using the Program Sustainability Assessment Tool to assess and plan for sustainability. Prev Chronic Dis, 11, 130185. 10.5888/pcd11.130185
- Campion MA, & Campion JE (1987). Evaluation of an interviewee skills training program in a natural field experiment. Personnel Psychology, 40, 675–691. 10.1111/j.1744-6570.1987.tb00619.x
- Carter EW, Ditchman N, Sun Y, Trainor AA, Swedeen B, & Owens L (2010). Summer employment and community experiences of transition-age youth with severe disabilities. Exceptional Children, 76(2), 194–212. 10.1177/001440291007600204
- Carter EW, Trainor AA, Cakiroglu O, Swedeen B, & Owens LA (2010). Availability of and access to career development activities for transition-age youth with disabilities. Career Development for Exceptional Individuals, 33(1), 13–24. 10.1177/0885728809344332
- Chambers DA, Glasgow RE, & Stange KC (2013). The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci, 8, 117. 10.1186/1748-5908-8-117
- Cobb RB, Lipscomb S, Wolgemuth J, Schulte T, Veliquette A, Alwell M, Batchelder K, Bernard R, Hernandez P, Holmquist-Johnson H, Orsi R, Sample McMeeking L, Wang J, & Weinberg A (2013). Improving Post-High School Outcomes for Transition-Age Students with Disabilities: An Evidence Review (NCEE 2013–4011). Washington, DC: Institute of Education Sciences. Retrieved from https://ies.ed.gov/ncee/pubs/20134011/pdf/20134011.pdf
- Connor A, Sung C, Strain A, Zeng S, & Fabrizi S (2019). Building Skills, Confidence, and Wellness: Psychosocial Effects of Soft Skills Training for Young Adults with Autism. J Autism Dev Disord. 10.1007/s10803-019-03962-w
- Cooper JO, Heron TE, & Heward WL (2007). Applied Behavior Analysis. Pearson.
- Curran GM, Bauer M, Mittman B, Pyne JM, & Stetler C (2012). Effectiveness-implementation hybrid designs: Combining elements of clinical effectiveness and implementation research to enhance public health impact. Medical Care, 50(3), 217–226. 10.1097/MLR.0b013e3182408812
- Dempster A, Laird N, & Rubin DB (1977). Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society: Series B, 39(1), 1–38. 10.1111/j.2517-6161.1977.tb01600.x
- Elksnin N, & Elksnin LK (2001). Adolescents with disabilities: the need for occupational social skills training. Exceptionality, 9, 91–105. 10.1080/09362835.2001.9666993
- Elliott SN (2017). The social validity of “Acceptability of behavioral interventions used in classrooms”: inferences from longitudinal evidence. Behavioral Disorders, 43(1), 269–273. 10.1177/0198742917739021
- Gadelha R (2018). Revolutionizing Education: The Promise of Virtual Reality. Childhood Education, 94(1), 40–43. 10.1080/00094056.2018.1420362
- German RE, Adler A, Frankel SA, Stirman SW, Pinedo P, Evans AC, Beck AT, & Creed TA (2018). Testing a Web-Based, Trained-Peer Model to Build Capacity for Evidence-Based Practices in Community Mental Health Systems. Psychiatr Serv, 69(3), 286–292. 10.1176/appi.ps.201700029
- Gersten R, Fuchs LS, Compton D, Coyne M, Greenwood C, & Innocenti MS (2005). Quality Indicators for Group Experimental and Quasi-Experimental Research in Special Education. Exceptional Children, 71(2), 149–164. 10.1177/001440290507100202
- Gray L, Thomas N, & Lewis L (2010). Educational Technology in U.S. Public Schools: Fall 2008 (NCES 2010–034). U.S. Department of Education, National Center for Education Statistics. Washington, DC: U.S. Government Printing Office.
- Gregg N, Galyardt A, Wolfe G, Moon N, & Todd R (2017). Virtual Mentoring and Persistence in STEM for Students with Disabilities. Career Development and Transition for Exceptional Individuals, 40(4), 205–214. 10.1177/2165143416651717
- Gresham FM, Sugai G, & Horner RH (2001). Interpreting Outcomes of Social Skills Training for Students with High-Incidence Disabilities. Exceptional Children, 67(3), 331–344. 10.1177/001440290106700303
- Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, & Conde JG (2009). Research electronic data capture (REDCap) - A metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform, 42(2), 377–381. 10.1016/j.jbi.2008.08.010
- Hayes GR, Custodio VE, Haimson OL, Nguyen K, Ringland KE, Ulgado RR, Waterhouse A, & Weiner R (2015). Mobile video modeling for employment interviews for individuals with autism. Journal of Vocational Rehabilitation, 43, 275–287. 10.3233/JVR-150775
- Huffcutt AI (2011). An Empirical Review of the Employment Interview Construct Literature. International Journal of Selection and Assessment, 19(1), 62–81. 10.1111/j.1468-2389.2010.00535.x
- Hutchinson VD, Rehfeldt RA, Hertel I, & Root WB (2019). Exploring the efficacy of acceptance and commitment therapy and behavioral skills training to teach interview skills to adults with autism spectrum disorders. Advances in Neurodevelopmental Disorders, 3, 450–456. 10.1007/s41252-019-00136-8
- Individuals with Disabilities Education Act, Pub. L. No. 101–476, 20 U.S.C. § 1400 (2004). https://sites.ed.gov/idea/
- Interagency Autism Coordinating Committee (IACC). (2017). IACC Strategic Plan for Autism Spectrum Disorders Research. Washington, D.C.: U.S. Department of Health and Human Services.
- Jans LH, Kaye HS, & Jones EC (2012). Getting hired: successfully employed people with disabilities offer advice on disclosure, interviewing, and job search. J Occup Rehabil, 22(2), 155–165. 10.1007/s10926-011-9336-y
- Lindsay S, McDougall C, Sanford R, Menna-Dack D, Kingsnorth S, & Adams T (2015). Exploring employment readiness through mock job interview and workplace role-play exercises: comparing youth with physical disabilities to their typically developing peers. Disabil Rehabil, 37(18), 1651–1663. 10.3109/09638288.2014.973968 [DOI] [PubMed] [Google Scholar]
- Little RJA (1988). A test of missing completely at random for multivariate data with missing values. Journal of American Statistical Association, 83, 1198–1202. DOI: 10.1080/01621459.1988.10478722 [DOI] [Google Scholar]
- Lorenz T, Frischling C, Cuadros R, & Heinitz K (2016). Autism and overcoming job barriers: comparing job-related barriers and possible solutions in and outside of autism-specific employment. PLOS One, 11(1), e0147040. 10.1371/journal.pone.0147040 [DOI] [PMC free article] [PubMed] [Google Scholar]
- McHugh RK, & Barlow DH (2010). The dissemination and implementation of evidence-based psychological treatments. A review of current efforts. Am Psychol, 65(2), 73–84. 10.1037/a0018121 [DOI] [PubMed] [Google Scholar]
- McMahon DD, Barrio B, McMahon AK, Tutt K, & Firestone J (2019). Virtual Reality Exercise Games for High School Students With Intellectual and Developmental Disabilities. Journal of Special Education Technology. 10.1177/0162643419836416 [DOI] [Google Scholar]
- Mikropoulos TA, & Natsis A (2011). Educational Virtual Environments: A Ten-Year Review of Empirical Research (1999–2009) [Academic Journal Report]. Computers & Education, 56(3), 769–780. 10.1016/j.compedu.2010.10.020 [DOI] [Google Scholar]
- Modugumudi Y, Santhosh J, & Anand S (2013). Efficacy of collaborative virtual environment intervention programs in emotion expression of children with autism. Journal of Medical Imaging and Health Informatics, 3(2), 321–325. 10.1166/jmihi.2013.1167 [DOI] [Google Scholar]
- Morgan L, Leatzow A, Clark S, & Siller M (2014). Interview skills for adults with autism spectrum disorder: a pilot randomized controlled trial. J Autism Dev Disord, 44(9), 2290–2300. 10.1007/s10803-014-2100-3 [DOI] [PubMed] [Google Scholar]
- Motola I, Devine LA, Chung HS, Sullivan JE, & Issenberg SB (2013). Simulation in healthcare education: a best evidence practical guide. AMEE Guide No. 82. Med Teach, 35(10), e1511–1530. 10.3109/0142159X.2013.818632 [DOI] [PubMed] [Google Scholar]
- Muthen BO, & Satorra A (1995). Complex sample data in structural equation modeling. Sociological Methodology, 25, 267–316. DOI: 10.2307/271070 [DOI] [Google Scholar]
- Newman L, Wagner M, Knokey AM, Marder C, Nagle K, Shaver D, & Wei X (2011). The post-high school outcomes of young adults with disabilities up to eight years after high school: A report from the National Longitudinal Transition Study-2 (NLTS2). (NCSER 2011–3005). National Center for Special Education Research. [Google Scholar]
- National Technical Assistance Center on Transition (NTACT). (2020). National Technical Assistance Center on Transition: Effective Practices. Retrieved 03/10/2020 from https://transitionta.org/
- Palinkas LA, Spear SE, Mendon SJ, Villamar J, Reynolds C, Green CD, Olson C, Adade A, & Brown CH (2019). Conceptualizing and measuring sustainability of prevention programs, policies, and practices. Translational Behavioral Medicine, 10(1), 136–145. 10.1093/tbm/ibz170 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Perez Jolles M, Lengnick-Hall R, & Mittman BS (2019). Core Functions and Forms of Complex Health Interventions: a Patient-Centered Medical Home Illustration. J Gen Intern Med, 34(6), 1032–1038. 10.1007/s11606-018-4818-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Persch AC, Cleary DS, Rutkowski S, Malone HI, Darragh AR, & Case-Smith JD (2015). Current practices in job matching: A Project SEARCH perspective on transition. Journal of Vocational Rehabilitation, 43(3), 259–273. 10.3233/JVR-150774
- Politis Y, Robb N, Yakkundi A, Dillenburger K, Herbertson N, Charlesworth B, & Goodman L (2017). People with disabilities leading the design of serious games and virtual worlds. International Journal of Serious Games, 4(2), 63–73. 10.17083/ijsg.v4i2.160
- Power K, Hogansen J, Geenen S, & Powers LE (2008). Gender matters in transition to adulthood: A survey study of adolescents with disabilities and their families. Psychology in the Schools, 45, 349–364. 10.1002/pits.20297
- Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Griffey R, & Hensley M (2011). Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health, 38(2), 65–76. 10.1007/s10488-010-0319-7
- Qualtrics. (2005). Qualtrics. Retrieved July 2017 from https://www.qualtrics.com
- Reimers TM, & Wacker DP (1988). Parents’ Ratings of the Acceptability of Behavioral Treatment Recommendations Made in an Outpatient Clinic: A Preliminary Analysis of the Influence of Treatment Effectiveness. Behavioral Disorders, 14(1), 7–15. 10.1177/019874298801400104
- Rosales R, & Whitlow H (2019). A component analysis of job interview training for young adults with autism spectrum disorder. Behavioral Interventions, 34(2), 147–162. 10.1002/bin.1658
- Sarrett JC (2017). Interviews, disclosures, and misperceptions: autistic adults’ perspectives on employment related challenges. Disability Studies Quarterly, 37(2). 10.18061/dsq.v37i2.5524
- Schrank FA, McGrew KS, & Mather N (2014). Woodcock-Johnson IV. Rolling Meadows, IL: Riverside.
- Smith MJ, Fleming MF, Wright MA, Jordan N, Humm LB, Olsen D, & Bell MD (2015). Job Offers to Individuals With Severe Mental Illness After Participation in Virtual Reality Job Interview Training. Psychiatric Services, 66(11), 1173–1179. 10.1176/appi.ps.201400504
- Smith MJ, Fleming MF, Wright MA, Losh M, Humm LB, Olsen D, & Bell MD (2015). Brief report: vocational outcomes for young adults with autism spectrum disorders at six months after virtual reality job interview training. Journal of Autism and Developmental Disorders, 45(10), 3364–3369. 10.1007/s10803-015-2470-1
- Smith MJ, Ginger EJ, Wright K, Wright MA, Taylor JL, Humm LB, Olsen DE, Bell MD, & Fleming MF (2014). Virtual reality job interview training in adults with autism spectrum disorder. Journal of Autism and Developmental Disorders, 44(10), 2450–2463. 10.1007/s10803-014-2113-y
- Smith MJ, Smith JD, Fleming MF, Jordan N, Oulvey EA, Bell MD, Mueser KT, McGurk SR, Spencer ES, Mailey K, & Razzano LA (2019). Enhancing individual placement and support (IPS) - Supported employment: A Type 1 hybrid design randomized controlled trial to evaluate virtual reality job interview training among adults with severe mental illness. Contemporary Clinical Trials, 77, 86–97. 10.1016/j.cct.2018.12.008
- Southward JD, & Kyzar K (2017). Predictors of competitive employment for students with intellectual and/or developmental disabilities. Education and Training in Developmental Disabilities, 52(1), 26–37. 10.2307/26420373
- Spencer S, Drescher T, Sears J, Scruggs AF, & Schreffler J (2019). Comparing the efficacy of virtual simulation to traditional classroom role-play. Journal of Educational Computing Research. 10.1177/0735633119855613
- Stirman SW, Gamarra J, Bartlett B, Calloway A, & Gutner C (2017). Empirical Examinations of Modifications and Adaptations to Evidence-Based Psychotherapies: Methodologies, Impact, and Future Directions. Clinical Psychology: Science and Practice, 24(4), 396–420. 10.1111/cpsp.12218
- Stirman SW, Miller CJ, Toder K, & Calloway A (2013). Development of a framework and coding system for modifications and adaptations of evidence-based interventions. Implementation Science, 8, 65. 10.1186/1748-5908-8-65
- Strickland DC, Coles CD, & Southern LB (2013). JobTIPS: A Transition to Employment Program for Individuals with Autism Spectrum Disorders. Journal of Autism and Developmental Disorders. 10.1007/s10803-013-1800-4
- Sung C, Connor A, Chen J, Lin CC, Kuo HJ, & Chun J (2018). Development, feasibility, and preliminary efficacy of an employment-related social skills intervention for young adults with high-functioning autism. Autism. Advance online publication. 10.1177/1362361318801345
- Taylor JL, Henninger NA, & Mailick MR (2015). Longitudinal patterns of employment and postsecondary education for adults with autism and average-range IQ. Autism, 19(7), 785–793. 10.1177/1362361315585643
- Thorsteinsson G, & Shavinina LV (2013). Developing an understanding of the pedagogy of using a virtual reality learning environment (VRLE) to support innovation education. New York, NY: Routledge.
- Tross SA, & Maurer TJ (2008). The effect of coaching interviewees on subsequent performance in structured experience-based interviews. Journal of Occupational and Organizational Psychology, 81, 589–605. 10.1348/096317907X248653
- U.S. Department of Education. (2017). A transition guide to postsecondary education and employment for students and youth with disabilities. Washington, D.C.: U.S. Department of Education. Retrieved from https://www2.ed.gov/about/offices/list/osers/transition/products/postsecondary-transition-guide-may-2017.pdf.
- U.S. Department of Education. (2018). What Works Clearinghouse. Washington, D.C.: National Center for Education Evaluation and Regional Assistance. Retrieved from https://ies.ed.gov/ncee/wwc/2018yearinreview.
- U.S. Department of Education. (2019). Table 219.90. Number and percentage distribution of 14- through 21-year-old students served under Individuals with Disabilities Education Act (IDEA), Part B, who exited school, by exit reason, sex, race/ethnicity, age, and type of disability: 2015–16 and 2016–17. National Center for Education Statistics. Retrieved from https://nces.ed.gov/programs/digest/d18/tables/dt18_219.90.asp.
- Ward DM, & Esposito MCK (2018). Virtual reality in transition program for adults with autism: self-efficacy, confidence, and interview skills. Contemporary School Psychology, 1–9. 10.1007/s40688-018-0195-9
- Wechsler D (2014). Wechsler Intelligence Scale for Children. Bloomington, MN: Pearson.
- Wehman P, Schall C, McDonough J, Sima A, Brooke A, Ham W, Whittenburg H, Brooke V, Avellone L, & Riehle E (2019). Competitive Employment for Transition-Aged Youth with Significant Impact from Autism: A Multi-site Randomized Clinical Trial. Journal of Autism and Developmental Disorders. 10.1007/s10803-019-03940-2
- Wehmeyer ML, & Palmer SB (2003). Adult outcomes for students with cognitive disabilities three-years after high school: the impact of self-determination. Education and Training in Developmental Disabilities, 38(2), 131–144. www.jstor.org/stable/23879591
- Workforce Innovation and Opportunity Act, Pub. L. No. 113–128, 29 U.S.C. § 3164 (2014). https://www.doleta.gov/wioa/
