Promoting evidence-based practices (EBP) is a national priority (Chambers, Ringeisen, & Hickman, 2005; Chambers, Glasgow, & Stange, 2013). However, while great progress has been made in training practitioners to deliver mental health EBP effectively, the investment in training has not translated into institutional capacity to sustain high-quality services in community settings (see Novins, Green, Legha, & Aarons, 2013; Stirman et al., 2012). More specifically, the investment of financial and human capital in training and initial consultation of front-line workers has not led to an understanding of the mechanisms and behaviors that might contribute to lasting change (Powell et al., 2015). Therefore, a key challenge facing the mental health field is to create an infrastructure to support knowledge transfer and sustain effective practices over time. This situation is not unique to mental health (see Salas, Tannenbaum, Kraiger, & Smith-Jentsch, 2012). In the U.S., over $125 billion is spent every year in business, education, and health to improve job performance and organizational productivity (Paradise, 2007), yet it is estimated that only 10 to 15 percent of training efforts transfer to improved job performance (Georgenson, 1982). In fact, one report estimated that up to 40 percent of employees fail to apply what they learned immediately after training, and this figure rises to 70 percent one year post-training (Saks, 2002).
Elements of an effective infrastructure to support sustainability
Multiple elements are required to support sustainment of effective practices in community settings that ultimately impact client outcomes (see Aarons, Hurlburt, & Horwitz, 2011; Damschroder et al., 2009). Two mechanisms identified as critical to long-term sustainability are ongoing training and supervision/support (Massatti, Sweeney, Panzano, & Roth, 2008; Peterson et al., 2014) and effective, efficient mechanisms for fidelity and outcome monitoring (Bond et al., 2014; Peterson et al., 2014). Theoretically, embedding these mechanisms in a context that prioritizes ongoing learning, problem solving to adapt interventions to the setting, and high expectations for quality improvement over time can contribute to an infrastructure that supports sustainability (Hamilton & Bickman, 2008; Chambers et al., 2013; Schoenwald, Garland, Southam-Gerow, Chorpita, & Chapman, 2011). Furthermore, if the context is supportive, staff turnover and negative staff attitudes toward monitoring may be reduced (Aarons, Sommerfeld, Hecht, Silovsky, & Chaffin, 2009; Nelson, Teutsch, Chung, & Herman, 2014). Unfortunately, these types of infrastructures are rare within the current organizational structure of the mental health system (Nadeem, Olin, Hill, Hoagwood, & Horwitz, 2013).
State mental health departments respond to the challenge
State mental health departments are in a unique, natural position to contribute to the creation of a structure to support and sustain evidence-based practices in community mental health settings. States have critical influence on agencies and providers via policies, funding, and the opportunity to provide technical support (Aarons et al., 2011; Peterson et al., 2014). Initial training, consultation, and technical support for individual agencies are key elements of many state initiatives supporting the implementation of evidence-based practices in community mental health settings (Aarons et al., 2011; removed for blind review). However, ongoing technical support, including the ongoing monitoring, consultation, and supervision needed to sustain effective practices, is challenging and expensive (Beidas & Kendall, 2010).
The task of supporting providers to continue using EBP post-training and consultation was a challenge facing a state Evidence Based Practice Initiative (State EBPI) (reference deleted for blind review). The state developed an empirically informed 12-month training and consultation model in collaboration with the second author; the first author served as a consultant and trainer. The training and consultation model included 8 days of in-person training, every-other-week consultation over 12 months, and access to the Modular Approach to Therapy for Children with Anxiety, Depression, Trauma, or Conduct Problems (MATCH-ADTC, Chorpita & Weisz, 2009; see [reference removed for blind review] for details). Data collected 90 days post-treatment initiation showed that, during the training and consultation period, children treated by trained clinicians improved relative to children treated by non-trained clinicians (reference removed for blind review).
In a series of discussions between the first author and the leadership of the State EBPI, our team was asked to explore the implementation of professional learning communities (PLC) to support providers post-training and consultation. A mechanism that promoted peer support was appealing to the state because it leveraged existing resources. Because the EBPI included agencies from across the state, a virtual professional learning community (VPLC) was developed to decrease travel costs and to take advantage of technology to bring together providers who might otherwise be isolated by geography (Mitchell, Robinson, Seiboth, & Koszegi, 2000; Swan & Shea, 2005).
Professional Learning Communities (PLC)
A professional learning community (PLC) is a model of collaborative learning that is widely used within health and education to support implementation and dissemination of effective practices (Bryk, Camburn, & Louis, 1999; Lomos, Hofman, & Bosker, 2011; Vescio, Ross & Adams, 2008; Wenger, 1998). Ultimately, a PLC is developed with the goal of becoming a long-standing group that establishes a collaborative work environment through which participants communicate and learn from each other over time, engaging in a continuous learning process to improve practice (Vescio et al., 2008; Wenger, 1998). Theoretically, collaboration between PLC members strengthens the learning and transfer of tacit (practical) knowledge within organizations, as well as explicit (codified) knowledge, which is key when attempting to sustain effective practices that are difficult to codify (Lam, 2000), such as complex mental health interventions.
PLCs can promote ongoing, collaborative learning and can influence member behavior via shared norms and values, leading to increased productivity (Kruse, Louis, & Bryk, 1994; Wenger, 1998). Participation in a PLC encourages members to broaden their knowledge base, see their own practices from a different perspective, and solicit advice from others, resulting in improved practices and student productivity (Bryk et al., 1999). Furthermore, these processes strengthen social connections between peers and expand networks that expose practitioners to more and varied expertise (Bunger et al., 2016; removed for blind review). PLCs also foster shared norms for specific practices and provide a mechanism for training new staff (Bryk et al., 1999; Wenger, 1998). Barwick, Peters, and Boydell (2009) created and studied a PLC in a community mental health setting to support the implementation of a standardized outcome measurement tool, the Child and Adolescent Functional Assessment Scale (Hodges, 2003). Results indicated that providers assigned to the PLC condition used the tool more often and demonstrated greater knowledge of the tool than providers assigned to a support-as-usual condition (Barwick et al., 2009). Additionally, reviews of PLC use in educational settings indicated that PLC participation improved teacher practices and increased student achievement (Dogan, Pringle, & Mesa, 2016; Vescio et al., 2008).
Online strategies, including virtual communities of practice or virtual professional learning communities (VPLC), are increasingly being used to disseminate knowledge and support implementation of effective practices (Mairs, McNeil, McLeod, Prorok, & Stolee, 2013). However, research exploring the use of VPLCs and the mechanisms underlying their success is lacking (see Barnett, Jones, Bennett, Iverson, & Bonney, 2012). In this paper, we propose a VPLC to decrease travel costs and take advantage of technology to bring together providers who might otherwise be isolated by geography (Mitchell et al., 2000).
Social Network as a Mechanism for Effective Professional Learning Communities
Bryk and Schneider (2002) proposed that the positive effects of PLC participation result, in part, from cultivating social connections among colleagues, which foster a willingness to share ideas, learn from each other, and implement innovative practices. These social connections promoted and supported by a PLC can be examined as a social network (see Valente, Palinkas, Czaja, Chu, & Brown, 2015). In the abstract, a social network is a set of individuals “connected by one or more relations” within a particular context (Marin & Wellman, 2011, p. 11). The role of social networks in disseminating and implementing new practices has been demonstrated in several fields, including education (removed for blind review; Valente et al., 2015). In particular, two features of social networks that may support the use of innovative practices include high levels of network density (i.e., the proportion of existing ties between members of a setting) (see Moolenaar, Daly, & Sleegers, 2011) and ties to colleagues outside one’s regular setting (see Palinkas, Holloway, Rice, Fuentes, Wu, & Chamberlain, 2011). Thus, PLCs may be successful when they enhance these two structural features of a social network among members.
High levels of network density have been linked to a range of positive outcomes with respect to dissemination and implementation. For example, Moolenaar and colleagues examined the impact of existing teacher social networks on school reform efforts, collective decision-making, and school climate (Daly, Moolenaar, Bolivar, & Burke, 2010; Moolenaar, 2012; Moolenaar, Sleegers, & Daly, 2012). They demonstrated that denser teacher social networks were associated with increased shared decision-making, which has been associated with positive student outcomes (Daly et al., 2010; Moolenaar, 2012; Moolenaar et al., 2012). Dense teacher networks were also associated with a school climate supportive of innovation (Moolenaar, Daly, & Sleegers, 2011) and a higher likelihood of seeking advice from colleagues (Daly et al., 2010).
While network density is linked to positive outcomes and encourages advice seeking, a valuable component of supporting innovations, establishing ties to individuals outside one’s regular setting is important because it can provide access to new information (Palinkas, et al., 2011). For example, Bunger and colleagues (2014) demonstrated that participating in a learning collaborative shifted participant social networks such that new relationships were formed with training experts. Yousefi-Nooraie, Dobbins, Marin, Hanneman, and Lohfeld (2015) demonstrated that engaging in an organizational intervention to promote evidence-informed decision-making shifted the social network of each of three large public health units to center around an expert within the organization. Participating in a PLC comprised of individuals from a variety of settings can maintain connections and encourage advice seeking, in addition to providing opportunities for providers to expand their network, increasing access to a broad range of information. That is, the activation and expansion of advice networks fostered in a PLC have the potential to support continued use of practices. Based on this research it is possible that the mechanisms underlying a successful PLC are two structural aspects of a social network: network density and ties to colleagues outside the network. We have found no studies that have explored the impact of PLC participation on mental health providers’ social networks or communication, or the implementation of a virtual PLC post-training in mental health settings.
Present Study
Two Virtual Professional Learning Communities (VPLCs) were established for providers who had participated in the State EBPI training and consultation. In this paper we describe the development and implementation of the VPLC, explore its feasibility for community-based providers, and use social network analyses to examine whether the VPLC social network contained elements important to implementation of innovative practices: connection among providers and potential access to information. In particular, we used social network analyses to examine two features of the social network of the VPLCs that represent connections among providers and access to information: (1) network density (i.e., the extent of ties between VPLC members) and (2) boundary density (i.e., the extent of cross-agency ties among VPLC members).
Method
This study was conducted with approval from the Institutional Review Board for recruitment, informed consent, and data collection procedures.
Developing the VPLC
During the time that the first author was a consultant and trainer for the state EBPI, she engaged in multiple conversations with leadership focused on the goals and challenges facing the state EBPI. A primary challenge, providing ongoing support to providers across the state in order to sustain effective practices, was often discussed, along with potential solutions, including the development and implementation of a VPLC. After much discussion, state leadership decided that exploring the implementation of a VPLC post-training and consultation was feasible and acceptable. The first author developed a specific structure and process for the VPLC based on the empirical literature on PLCs, social networks, and effective supervision.
VPLC structure.
The structure of the VPLC drew on the empirical literature on social networks (Burt, 1999) and PLCs (see Bryk & Schneider, 2002; Wenger, 1998). Multiple modes of communication foster advice networks and increase the probability that social ties will be activated (Ardichvili, 2008). Therefore, each VPLC allowed for synchronous and asynchronous forms of communication, providing flexibility in how participants interacted with each other (Swan & Shea, 2005). Synchronous communication was supported by an in-person meeting followed by five consecutive monthly conference calls. Asynchronous communication was supported by an online group (Yahoo group) in which participants could post questions and resources between meetings. A Yahoo group was utilized because the state had previously used one and was familiar with the technology.
VPLC process.
The process each VPLC employed to support effective practices was based on problem-based learning (PBL, Vernon & Blake, 1993) and behavioral consultation (Kratochwill, Elliott, & Busse, 1995). PBL is an empirically informed teaching method that encourages collaborative problem solving and application of knowledge in practice settings (Dochy, Segers, Van den Bossche, & Gijbels, 2003; Gijbels, Dochy, Van den Bossche, & Segers, 2005; Smits, Verbeek, & de Buisonjé, 2002). Behavioral consultation is an effective service delivery model based on four general components: problem identification, problem analysis, treatment implementation, and treatment evaluation (Kratochwill et al., 1995; Sheridan, Salmon, Kratochwill, & Carrington Rotto, 1992). These components have been used effectively in the supervision of Multisystemic Therapy (Schoenwald, Brown, & Henggeler, 2000; Schoenwald, Sheidow, & Chapman, 2009) and the implementation of EBPs by psychiatric nurses (Bradshaw, Butterworth, & Mairs, 2007). To support provider use of the problem-solving focus and the behavioral consultation framework, a case note was introduced to participants. PBL and behavioral consultation provide a framework for providers to share their work, collaborate on solutions, reflect on their own work, and engage in reflective dialogue, all key elements of a successful VPLC (see Bryk et al., 1999) and supportive of social network structures that encourage implementation of innovative practices (Palinkas et al., 2011; Valente et al., 2015).
Social Network Structures.
Two structural features of social networks were examined: network density and boundary density. Density assesses how connected the individuals in a network are to one another (Hanneman & Riddle, 2005). Network density is defined as the proportion of all possible ties between individuals in a given network that are present (Hanneman & Riddle, 2005; Wasserman & Faust, 1994). Low density reflects few connections; high density reflects many. The density of each VPLC was examined at six time points. For example, in the network of VPLC1, which consisted of 10 providers, there are 90 possible directed ties (each of 10 individuals has a potential tie with 9 other individuals; 10×9=90). If providers report 30 ties, then the density of the network is 30/90=.33.
Boundary density indicates the extent to which individuals’ relations extend across group boundaries (see Birkel & Reppucci, 1983), which indicates whether access to information includes channels outside of a particular group. Given two groups, A and B, the boundary density is the proportion of all possible connections that occur between group A and B (see Hirsch, 1980). In this study, boundary density is defined as the proportion of all possible cross-agency ties within each VPLC. For example, 10 consented providers from four agencies comprised VPLC1: 3 from Agency A, 3 from Agency B, 3 from Agency C, and 1 from Agency D. The possible number of cross-agency connections is (3×7) + (3×7) + (3×7) + (1×9)=72. If providers report 25 cross-agency connections, the boundary density is 25/72=0.34. The boundary density of each VPLC was examined at six time points.
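To make these two measures concrete, the following minimal Python sketch (our own illustration, not the UCINET procedure used in the study; the function and variable names are hypothetical) computes density and boundary density from a directed 0/1 adjacency matrix and reproduces the possible-tie counts from the worked examples above. Unreported dyads simply contribute nothing to the numerator here; the study treated them as missing in UCINET.

```python
import numpy as np

def network_density(adj):
    """Proportion of all possible directed ties that are present.

    adj: n x n matrix with adj[i, j] = 1 if provider i reports a tie to
    provider j, 0 otherwise (np.nan for an unreported dyad); the diagonal
    (self-ties) is ignored.
    """
    n = adj.shape[0]
    off_diagonal = ~np.eye(n, dtype=bool)
    return np.nansum(adj[off_diagonal]) / (n * (n - 1))

def boundary_density(adj, agency):
    """Proportion of all possible cross-agency directed ties that are present.

    agency: length-n sequence of agency labels, one label per provider.
    """
    agency = np.asarray(agency)
    cross_agency = agency[:, None] != agency[None, :]   # True where i and j differ in agency
    return np.nansum(adj[cross_agency]) / cross_agency.sum()

# Worked example from the text: VPLC1, 10 providers from four agencies.
agency = np.array(["A"] * 3 + ["B"] * 3 + ["C"] * 3 + ["D"])
n = len(agency)
print(n * (n - 1))                                  # 90 possible directed ties
print((agency[:, None] != agency[None, :]).sum())   # 72 possible cross-agency ties
```

With a complete 0/1 matrix, reported ties divided by these totals give exactly the 30/90 = .33 and 25/72 = .34 figures in the examples above.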
Sample
All agencies (N = 8) that participated in the State EBPI training and consultation from July 2012 through June 2013, and their respective providers (N = 24), agreed to participate in one of two VPLCs beginning in January 2014. The agencies were located across the state: Four served primarily rural populations and four served primarily urban populations. All agencies except one had multiple sites, and all were located in counties with between 12% and 22% of the population residing in poverty (mean = 16.7%). Agencies were given a choice of two dates for the first, in-person VPLC. By virtue of their choices, four agencies comprised VPLC1 (three urban, one rural) and four agencies comprised VPLC2 (three rural, one urban). The State EBPI incentivized agencies to encourage providers to attend by providing monetary compensation for billing time lost due to VPLC participation. Twenty-three of the 24 providers had participated in the 12-month training and consultation; one supervisor joined an agency during the six-month lag between the end of consultation and the beginning of the VPLC, a lag caused by state budget constraints. Providers participated in their VPLC for six months (VPLC1 n = 13, VPLC2 n = 11). Of the 24 VPLC participants, 17 consented to provide demographic and social network data (VPLC1 n = 10; VPLC2 n = 7). Demographic data for participants are displayed in Table 1, which indicates that the two VPLC groups were largely similar in demographic characteristics. Each group was primarily female and Caucasian, with mean ages of 38 and 35 years, respectively. All participants were Master’s level and about half were licensed (6 of 10 and 4 of 7, respectively), with similar years at their current agency and similar years of work experience. Unfortunately, demographic data were not available for providers who participated in the VPLC but did not consent to research.
Table 1. Participant demographics

| Characteristic | VPLC1 (N = 10) | VPLC2 (N = 7) |
|---|---|---|
| Gender | Female: 9; Male: 1 | Female: 7; Male: 0 |
| Race | White: 8; Black: 2 | White: 5; Asian: 1; Other: 1 |
| Age | M = 38.00 | M = 35.43 |
| Education | MA: n = 10 | MA: n = 7 |
| Licensed | n = 6 | n = 4 |
| Years at current agency | M = 7.0 | M = 7.0 |
| Work experience (years) | M = 10.7 | M = 9.2 |
Implementing the Virtual Professional Learning Community
Initial in-person meeting.
The first meeting of each VPLC (VPLC1 and VPLC2) was in-person and facilitated by the first author. Participants traveled to the site using funds from a grant awarded to each agency by the state. The goals of the first meeting were for the participants to re-acquaint themselves with each other (they initially met during the State EBPI trainings) to encourage connections, and to introduce, and solicit provider feedback regarding, the goals, structure, and process of the VPLC to increase acceptability and feasibility. The agenda for the first meeting of each VPLC was the same: introduction, purpose, meet and greet, goals, VPLC format, next steps, and wrap-up.
Meetings 2 through 6: Phone conferences.
The remaining VPLC meetings were phone conferences. Each VPLC followed a standard agenda: 1) Quick overview/announcements, 2) Questions/Case presentation, 3) Resources, 4) Next Steps, and 5) Wrap-up. During two VPLC meetings (depression and anxiety, respectively) an expert joined the VPLC to support the discussion and peer support. During every VPLC the facilitator encouraged discussion among the providers.
Dependent Measures
Attendance.
Attendance at each VPLC meeting was recorded via participant sign-in on the conference calls.
VPLC social networks.
Data on VPLC social networks were collected from consenting participants at six time points and analyzed using UCINET 6 (Borgatti, Everett, & Freeman, 2002). Providers received, via email, a unique link from a HIPAA-compliant data management and capture system (REDCap) available at the first author’s institution. The link was sent within one week after each VPLC, and providers’ response times ranged from one to three weeks. Data were collected using well-established sociometric survey-based methods for assessing social networks (see Marsden 1990, 2011; Wasserman & Faust, 1994). Sociometric data collection involves asking each person in a network to identify each “alter” in the network with whom he/she has a specific type of relationship and is typically assessed with a single item (see Marsden, 2011; Wasserman & Faust, 1994). In this study, a roster of participants (i.e., consented VPLC members) was presented to each participant, who responded to a single item about their relationship with each other person in the social network (see Marsden, 1990, 2011). Providing a roster simplifies data collection in small settings like VPLCs and ensures that members are less likely to make errors of omission (Marsden, 2011; removed for blind review). We were specifically interested in the structure of the advice network among VPLC members, a type of social network typically assessed within social network studies (Palinkas et al., 2011; Palinkas et al., 2013; Wasserman & Faust, 1994). It is common to collect network data regarding different types of relationships (see Marsden, 2011; Wasserman & Faust, 1994), and in this study two separate VPLC advice networks were constructed: intent to contact and actual contact.
Intent to contact represents an ideal social network of the individuals VPLC members felt comfortable contacting. This ideal social network represents the participants each VPLC member would like to connect with in an ideal context if there were no barriers (e.g., time or geography). Actual contact represents the actual contact among VPLC members, or their actual behavior, which likely is impacted by the numerous barriers that may exist to communication among providers.
Intent to contact.
Sociometric data were collected to assess providers’ intent to contact other members of the VPLC by answering the question, “How likely would you be to contact each of the other providers in the VPLC to ask for help with an issue about EBP implementation?” Responses ranged from 1 (very likely) to 4 (very unlikely). Palinkas et al. (2011) used this type of question to assess advice networks regarding EBP implementation. To facilitate analyses, data were dichotomized; scores of 1 and 2 were combined to indicate intent to contact, and scores of 3 or 4 were combined to indicate no intent to contact.
Actual contact.
Sociometric data were collected to assess providers’ actual contact with other members of the VPLC by answering the question, “How many times have you talked to the other providers in the last month, outside of the VPLC?” Responses ranged from 0 times to 4 times or greater. A similar question has been used in studies of obesity intervention (see Gesell, Barkin, Sommer, Thompson, & Valente, 2015) to assess communication among providers outside the context of the group. Again, to facilitate analyses, scores were dichotomized; for this question, scores greater than 0 were coded as contact.
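As an illustration of the dichotomization rules just described, here is a short sketch (ours; the array and function names are hypothetical) that converts raw responses to each item into the 0/1 advice networks analyzed below.

```python
import numpy as np

def intent_network(raw):
    """Dichotomize the intent-to-contact item.

    raw[i, j] is provider i's 1-4 rating of how likely they would be to contact
    provider j (1 = very likely ... 4 = very unlikely); np.nan marks a missing
    report. Ratings of 1 or 2 become a tie (1); ratings of 3 or 4 become 0.
    """
    adj = np.where(raw <= 2, 1.0, 0.0)
    adj[np.isnan(raw)] = np.nan          # keep unreported dyads missing
    return adj

def contact_network(raw):
    """Dichotomize the actual-contact item.

    raw[i, j] is the number of times provider i reported talking to provider j
    in the last month outside the VPLC (0 to 4+). Any count above 0 is a tie.
    """
    adj = np.where(raw > 0, 1.0, 0.0)
    adj[np.isnan(raw)] = np.nan
    return adj
```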
The proportion of individuals who consented to provide social network data within each VPLC (i.e., the actor response rate) was 77% (10 of 13 participants) for VPLC1 and 64% (7 of 11 participants) for VPLC2. Non-consenting individuals were not included in the measurement of social networks for the two VPLCs. Therefore, these actor response rates mean that we were able to collect data about up to 57.7% (i.e., 10(9)/13(12)) of the total contact relationships in VPLC1 and up to 38.2% (i.e., 7(6)/11(10)) of the total contact relationships in VPLC2. Coverage was slightly lower at some time points because, in several cases, participants did not provide complete data for a specific VPLC at a particular time point. In all cases, data that were not reported were treated as missing in the analyses. For example, at time four for VPLC1, one participant provided data for eight of the nine possible relationships, thus data were present for only 89 of the 156 possible relationships in that social network (57%). In a recent review of community-based social network studies, [removed for blind review] found that reported actor response rates ranged from 44% to 100%. Because our response rates are low compared to other community-based studies, we interpret network results with caution.
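The coverage figures above follow directly from the roster design: with r consenting reporters in a VPLC of N members, each reporter can describe ties to the other r - 1 consenting members, so at most r(r - 1) of the N(N - 1) directed relationships are observable. A brief sketch (ours) of that calculation:

```python
def dyad_coverage(reporters, members):
    """Maximum share of the members*(members-1) directed relationships in a VPLC
    that can be observed when only `reporters` of its members supply data."""
    return (reporters * (reporters - 1)) / (members * (members - 1))

print(round(dyad_coverage(10, 13), 3))   # 0.577 for VPLC1
print(round(dyad_coverage(7, 11), 3))    # 0.382 for VPLC2
```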
Social network analyses.
Social network analyses were utilized to explore whether the structures important to supporting innovative practices were present within the VPLC. Data from each question were used to develop VPLC advice networks at each of the six time points, resulting in 12 advice networks for each VPLC. Two measures were computed at each time point: density and boundary density (see the previous section for definitions).
Results
Implementation
First in-person meeting.
As stated previously, during the first session, providers in both groups were introduced to the goals and format (structure and process [e.g., monthly phone calls, Yahoo group to share resources, problem solving focus, sharing ideas and challenges]) of the VPLC and asked to provide feedback. It should be noted that no changes were made to the overarching structure and process of the proposed VPLC. Providers in VPLC1 and VPLC2 decided that the best time to schedule the monthly call was around the lunch hour, when other demands were less likely to interfere with their time. The five subsequent VPLC monthly meetings were phone conferences facilitated by the first author.
In order to increase acceptability and meet provider needs for VPLC content, each group reflected on what was helpful from the initial training/consultation period and decided which topics would be addressed in the following five VPLCs. These discussions culminated in the following list of topics (each group independently decided on the same list): parent/client engagement, trauma, depression, anxiety, and using the online MATCH-ADTC materials (Chorpita & Weisz, 2009) provided by the state. Providers from both groups decided that each VPLC should consist of discussion about actual cases pertaining to the specific topic of that VPLC. The providers in VPLC1 decided that during each VPLC an agency would provide a case presentation pertaining to the VPLC topic. Providers in VPLC2 decided to post or send discussion questions pertaining to a case focused on the VPLC topic to the facilitator before each VPLC, which would be the basis for discussion.
Meetings 2 through 6: Phone conferences.
As previously mentioned, each phone conference followed a standard agenda: 1) Quick overview/announcements, 2) Questions/Case presentation, 3) Resources, 4) Next Steps, and 5) Wrap-up. In each session, cases and questions pertaining to the topic were discussed. The facilitator encouraged peer support for challenging cases, often reflecting questions back to the group to solicit feedback from peers. Occasionally, technical difficulties (e.g., conference line not working; bad connections) interfered with phone conferencing. During the last session, in which the facilitator attempted to demonstrate using MATCH by sharing her screen, some participants could not access the meeting software due to restrictions put into place at their agency. Feedback at the end of the VPLCs suggested that some participants would have preferred video conferencing, that the consultation was generally helpful, and that the meetings acted as a reminder to use strategies learned during training.
Attendance
Across the two groups, 23 of 24 clinicians (96%) attended at least three of six sessions and 20 of 24 clinicians (83%) attended at least four of six sessions. In VPLC1, all participants attended at least 50% of sessions (2 participants attended 3 sessions; 2 participants attended 4 sessions; 3 participants attended 5 sessions; and 6 participants attended all 6 sessions). In VPLC2, 10 of 11 participants attended at least 50% of sessions (1 participant attended 3 sessions; 4 participants attended 4 sessions; 2 participants attended 5 sessions; and 3 participants attended all 6 sessions); one provider attended two sessions. Providers reported a variety of reasons for missed sessions, including client emergencies, professional trainings, internal agency trainings/meetings, and personal time off.
VPLC social networks
Table 2 presents density scores for each VPLC on each question. For the first question, which assessed intention to contact others, density scores ranged from 0.49 to 0.64 for VPLC1 (i.e., 49% to 64% of possible ties) and from 0.75 to 0.83 for VPLC2. These scores indicate moderate to high density as suggested by Mayhew and Levinger (1976), who proposed .50 as a threshold for high density.
Table 2. Density and boundary density of VPLC advice networks at each time point

Who are providers likely to contact: “How likely would you be to contact each of the other providers in the VPLC to ask for help with an issue about EBP implementation?”

| | Time 1 | Time 2 | Time 3 | Time 4 | Time 5 | Time 6 |
|---|---|---|---|---|---|---|
| VPLC1 Density^a | 0.49 | 0.57 | 0.64 | 0.59 | 0.60 | 0.61 |
| VPLC1 Boundary density^b | 0.34 | 0.46 | 0.56 | 0.46 | 0.46 | 0.53 |
| VPLC2 Density | 0.75 | 0.83 | 0.78 | 0.75 | 0.83 | 0.75 |
| VPLC2 Boundary density | 0.68 | 0.77 | 0.71 | 0.68 | 0.81 | 0.68 |

Who do providers contact: “How many times have you talked to the other providers in the last month, outside of the VPLC?”

| | Time 1 | Time 2 | Time 3 | Time 4 | Time 5 | Time 6 |
|---|---|---|---|---|---|---|
| VPLC1 Density | 0.31 | 0.24 | 0.36 | 0.23 | 0.23 | 0.46 |
| VPLC1 Boundary density | 0.14 | 0.06 | 0.19 | 0.04 | 0.03 | 0.31 |
| VPLC2 Density | 0.35 | 0.41 | 0.33 | 0.35 | 0.17 | 0.35 |
| VPLC2 Boundary density | 0.16 | 0.23 | 0.14 | 0.16 | 0.00 | 0.16 |

^a Density: proportion of all possible ties in the social network. ^b Boundary density: proportion of all possible cross-agency ties in the social network.
For the second question, which assessed actual contact, density scores were considerably lower, likely because actually connecting with colleagues is constrained by contextual barriers (e.g., space, time, logistics). These scores ranged from 0.23 to 0.46 for VPLC1 and from 0.17 to 0.41 for VPLC2. Based on the Mayhew and Levinger (1976) criterion, these were considered low to moderate density scores.
Table 2 presents boundary density scores for each VPLC on each question. For the first question related to intention to contact other VPLC members, across time points, the boundary density ranged from .34 to .56 for VPLC1 and from .68 to .81 for VPLC2. These scores indicate moderate to high density as suggested by Mayhew & Levinger (1976).
For the second question related to actual contact between sessions, boundary density ranged from .03 to .31 for VPLC1 and from .00 to .23 for VPLC2. This indicates that, despite intentions to contact VPLC members outside one’s organization, actual cross-agency contact between sessions was relatively rare.
Asynchronous communication
The strategy implemented to support asynchronous communication, the Yahoo group, was not well utilized, demonstrating the lack of feasibility of this component of the VPLC. During VPLC sessions, participants noted that the Yahoo group was cumbersome and difficult to navigate. Based on email updates received by the facilitator for each Yahoo group, only one provider from VPLC1 and no one from VPLC2 posted resources on the Yahoo group site. However, several participants used email in lieu of the Yahoo group. Two participants sent email to the facilitator to distribute to the group, two participants sent resources directly to other members of their VPLC, and another participant emailed a question regarding client engagement to the group, for which she received a response from another VPLC member.
Discussion
Many public and private entities providing community mental health services face the challenge of effectively and efficiently supporting providers post-training and consultation, a critical element of sustaining effective practices. In the course of our work with the State EBPI, we had the opportunity to implement two six-month VPLCs as a follow-up to training and consultation. This was an opportunity to develop a VPLC for a large state system, explore its feasibility within that system, and utilize social network analyses to explore whether structural elements of a social network that can support implementation of innovative practices were present among VPLC participants.
Based on participant feedback, the VPLC content focused on elements of evidence-based practices, such as motivational interviewing, trauma narratives, and elements of parent management training. The sessions integrated problem-based learning and behavioral consultation principles into the process, which was designed to facilitate interaction and participant involvement. In fact, participants appeared excited about receiving support for EBP implementation post-training and consultation, and were especially enthusiastic about the opportunity to support one another and gain access to additional information regarding effective practices. At the first in-person meetings, we observed that providers were highly engaged in reflecting on challenging cases and in providing and receiving peer consultation; conversation flowed easily and naturally. This spirit was aptly expressed by one of the providers, who stated, “Any professional consultation is great, I think, especially around a community of your peers.” Conducting this pilot study allowed us to move beyond perceived acceptability for collaboration and to examine the feasibility of a VPLC and the possibility that VPLC participation can connect providers to one another in a manner that supports innovative practice.
Attendance data for this brief trial supported the feasibility of the monthly meetings, despite high demands on provider time. All but one of the twenty-four providers attended at least 50% of the sessions (i.e., 3 of 6), and 20 of 24 providers attended 4 or more sessions. This level of attendance is a testament to the effort providers invested in the VPLC despite other demands, including the pressure to see clients, attend other agency meetings, and complete agency paperwork. The impact of these demands became clearer toward the end of the VPLC when attendance decreased, despite efforts to schedule the VPLC at a time when demands on providers’ time would be lowest (i.e., the lunch hour). Interestingly, although the State EBPI incentivized agencies to encourage providers to attend by providing monetary compensation for billing time lost due to VPLC participation, this incentive did not trickle down to participants, as billing expectations remained consistently high. This speaks to the complexity of factors impacting the promotion of EBPs in community mental health, including aligning policy and incentives to address staff priorities (Raghavan, Bright, & Shadoin, 2008). In future iterations, it will be important to examine whether incentives matched to provider needs (e.g., direct compensation), in addition to or in lieu of agency needs, would enhance attendance more successfully.
A Yahoo group where participants could upload resources and post and respond to questions was included as a method of sharing resources and supporting asynchronous communication. Unfortunately, use of the Yahoo group was low due to logistical barriers such as lack of access to the internet and lack of time, as well as agency policy prohibiting practitioners from accessing public websites. It is interesting to note that when the Yahoo group or email was used to communicate with other providers in the VPLC, it was usually used to share concrete resources (e.g., a template for a safety plan or trauma narrative) rather than to exchange ideas or problem solve (e.g., how do I engage the parent who is not showing up consistently?). Speculatively, it may be that a platform organized to share and access concrete resources is what providers find critical, rather than asynchronous communication to share questions or problem solve. It is possible that a VPLC provides a context for problem solving and consultation, while an online platform is useful as a repository of information and concrete resources.
Several providers also mentioned that video conferencing would have been preferable to phone conferencing so that participants could see each other’s faces and non-verbal communication. However, while a low-tech solution such as phone conferences might seem outdated in the age of Skype, video conferencing, Google chat, and similar tools, the logistical barriers impacting use of the Yahoo group strongly suggest that agencies would not have the capacity to manage more advanced technology such as video conferencing.
The challenge of accessing and using technology in community mental health practice is consistent with findings from other studies in large systems (Bezyak, Yan, Kang, Burke, & Chan, 2014; Nadeem, Weiss, Olin, Hoagwood, & Horwitz, 2016). This illustrates the need to carefully understand both individual factors, such as technical abilities and time constraints, and organizational factors, such as IT capacity and agency policies, that can hinder or facilitate the use of technological tools to enhance training and support for programs and practices. For example, Penn and colleagues (2005) decided to use only simple graphics in their development of an online resource for suicide prevention to account for limited service availability and slow access speeds in rural areas of Australia. Furthermore, building on an existing state-supported platform for sharing resources would eliminate the barrier of agency prohibition of accessing public websites (e.g., Yahoo groups). It is conceivable that simpler or more conventional modes of asynchronous communication, such as an email list-serv, may prove more suitable for U.S. community mental health providers, as evidenced by provider use of email to distribute resources in this pilot. Additionally, some practitioners already use list-servs; for example, the Division 53 American Psychological Association email list-serv is often used by practitioners to post questions regarding challenging clinical cases (https://clinicalchildpsychology.org/listserv). Nevertheless, as these technologies become less expensive, and as high-speed broadband internet becomes more accessible, their use to supplement direct training and ongoing support may become more feasible (see Kothari, Boyko, Conklin, Stolee, & Sibbald, 2015; Mairs, McNeil, McLeod, Prorok, & Stolee, 2013).
Social network structures supporting innovative practices
Expanded advice networks among providers can support sustained use of effective practices through increased access to information (Valente, 2012; Wasserman & Faust, 1994). In the present study, we explored whether the structures conducive to implementing innovative practices were present in provider social networks: density and boundary density. The results of this study should be interpreted with caution due to small sample size and the relatively low response rate.
The density and boundary density of social networks assessing VPLC members’ intent to contact one another ranged from moderate to high at the beginning of each VPLC and continued at these levels for the duration of the VPLC. The initial density levels likely reflected the fact that the practitioners already knew each other from the training and consultation they had previously attended. The fact that the density of the intent-to-contact social networks for each VPLC remained in the moderate to high range (between 49 and 64 percent in VPLC1 and between 75 and 83 percent in VPLC2, depending on the study time point) suggests that throughout the six-month VPLC the providers who provided data may have been comfortable and willing to reach out to colleagues; that is, the results suggest that the providers potentially had the intention to connect with colleagues if needed. Furthermore, moderate to high levels of boundary density (between 34 and 56 percent of possible cross-agency contacts were present in VPLC1 and between 68 and 81 percent of possible contacts were present in VPLC2) suggest that VPLC members who provided data were potentially willing to contact VPLC participants not only inside, but also outside, their agencies during the course of the VPLCs. That is, results suggest that providers may be able to broaden their access to channels of information via VPLC participation, but it will be critical to address barriers to contact and to confirm these results in future studies with a larger sample.
While participants in both VPLCs who provided data reported high levels of intention to contact colleagues, their actual contact was considerably lower. Depending on the study time point, between 23 and 46 percent of possible actual contact ties were present for VPLC1. Likewise, between 17 and 35 percent of possible actual contact ties were present for VPLC2. Boundary densities were even lower, ranging from only 3 to 31 percent of possible cross-agency actual contact ties in VPLC1 and 0 to 23 percent of possible cross-agency actual contact ties in VPLC2. Again, these results should be interpreted with caution, but the low percentages of actual contact could be due to the time and energy needed to maintain contact or reach out to someone, especially from another agency (Scott, 2000). Although a large social network of actual contact may not have been activated, we can speculate that the actual contact between providers in the VPLC was still likely higher than if the VPLC had not been implemented.
Taken together, we are cautiously optimistic that the relatively high intent-to-contact density and boundary density scores, coupled with the lower actual contact density and boundary density scores, suggest that the VPLCs may have provided a context in which providers felt they could contact other providers, both inside and outside their agency, but the capacity to actually contact others was lacking. Thus, a VPLC could potentially broaden a provider’s access to resources and advice to support quality services; however, additional mechanisms are needed to overcome barriers and activate the use of these social networks. Clearly, additional research with a larger sample is needed to further explore and potentially confirm these results.
The current study adds to the literature on virtual learning communities. Similar use of virtual communities to increase, or maintain, access to support, advice, and resources has been suggested in the medical field (see Barnett, Jones, Bennett, Iverson, & Bonney, 2012; Mairs et al., 2013) and is being studied in rural communities in Australia (see Penn et al., 2005). Work on virtual collaborative learning strategies to support sustainability in mental health services is also ongoing and will provide vital information on effective strategies to sustain effective practices in community settings (see Wiltsey-Stirman et al., 2017). Continuing to explore the potential of VPLCs, and the additional support needed to expand provider advice networks across agencies to create access to additional resources, is especially important for providers who do not have access to informed colleagues or materials, such as those working in rural communities, in smaller agencies in urban or suburban communities, or even in larger agencies where colleagues may be working in relative isolation on a variety of programs, a situation not uncommon even in larger social service agencies (Grady & Chen, 2006).
Limitations
Several limitations should be noted in this pilot study. The number of agencies and participants was relatively small, limiting generalizability, although both urban and rural communities were represented. In addition, given the limited resources, there was no control group for comparison, and therefore we were not able to compare the impact of VPLC participation to practice as usual. Thus, replication with a larger sample of agencies and providers is important to assess regional variations, and a comparison group is essential. As noted previously, providers were able to attend the first in-person meeting because their agencies received a grant from the state. Given that travel money is not standard in an agency budget, requiring an in-person meeting could impact the feasibility of this model. Alternatives to an in-person meeting for which providers need to travel significant distances should be considered in future research.
Due to confidentiality issues, we were not able to track how or whether participants interacted with members of the VPLC who elected not to consent to research. Thus, the complete network was not represented, and data were missing on contact relationships with these non-consenting VPLC members. Because relatively low response rates can impact interpretation of social network analyses, these results should be interpreted with caution (blind review). Furthermore, we were not able to assess the reasons for the barriers to actual contact, which will be critical information for future implementations of this type of model. Additionally, because this was a small pilot study, we were not able to obtain follow-up data to examine whether social networks among providers were maintained.
Finally, the VPLC was time-limited to six months due to state agency budget constraints related to the compensation provided to agencies and facilitators. Although these budgetary issues reflect the reality of community mental health practice, including, as noted previously, providing incentives to agencies rather than providers, extending the length of the VPLC and revising the incentive structure are priorities for future research.
Summary and Conclusions
The potential of a VPLC designed to support mental health providers’ use of EBPs post-training and implemented within a state system was examined in a study of twenty-four providers across eight agencies throughout the state. Providers contributed to the VPLC content, and attendance data supported the feasibility of the VPLC. In contrast, the method used to encourage asynchronous communication demonstrated a distinct lack of feasibility, given agency policy and technological constraints. Finally, the social network analyses suggested that structures important for implementing innovative practices might have been present in the social network of providers.
In light of the preliminary data presented in this paper, we have several recommendations that agencies and providers might want to consider when contemplating how to support ongoing use of effective practices post-training and consultation. Agencies may want to facilitate peer connections and consultation for providers, paying special attention to including providers who have attended trainings. Sharing knowledge can be powerful, and the literature suggests that opportunities to practice new skills, receive constructive feedback, and discuss challenges are valuable aspects of implementing new knowledge (see Blume, Ford, Baldwin, & Huang, 2010). Creating a repository of resources that all providers can access may also be an important aspect of supporting ongoing use of effective practices. Providers may want to prioritize peer consultation as time permits and explore existing opportunities for cross-agency connections, for example by attending meetings that promote networking, which might increase access to varied information that could prove useful.
Acknowledgements
We thank our state partners, as well as the community agencies and providers who allowed us to observe and learn from their work.
References
- Aarons GA, Hurlburt M, & Horwitz SM (2011). Advancing a Conceptual Model of Evidence-Based Practice Implementation in Public Service Sectors. Administration and Policy in Mental Health and Mental Health Services Research, 38(1), 4–23. 10.1007/s10488-010-0327-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Aarons GA, Sommerfeld DH, Hecht DB, Silovsky JF, & Chaffin MJ (2009). The impact of evidence-based practice implementation and fidelity monitoring on staff turnover: Evidence for a protective effect. Journal of Consulting and Clinical Psychology, 77(2), 270–280. 10.1037/a0013223 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ardichvili A (2008). Learning and knowledge sharing in virtual communities of practice: Motivators, barriers, and enablers. Advances in Developing Human Resources, 10(4), 541–554. doi: 10.1177/1523422308319536 [DOI] [Google Scholar]
- Barnett S, Jones SC, Bennett S, Iverson D, & Bonney A (2012). General practice training and virtual communities of practice - A review of the literature. BMC Family Practice, 13 10.1186/1471-2296-13-87 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Barwick MA, Peters J, & Boydell K (2009). Getting to uptake: Do communities of practice support the implementation of evidence-based practice? Journal of the Canadian Academy of Child and Adolescent Psychiatry, 18(1), 16–29. [PMC free article] [PubMed] [Google Scholar]
- Beidas RS, & Kendall PC (2010). Training Therapists in Evidence-Based Practice: A Critical Review of Studies From a Systems-Contextual Perspective. Clinical Psychology: Science and Practice, 17(1), 1–30. 10.1111/j.1468-2850.2009.01187.x [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bezyak JL, Yan M-C, Kang H-J, Burke J, & Chan F (2014). Communities of practice to improve employment outcomes: a needs assessment. Journal of Occupational Rehabilitation, 24(4), 597–604. [DOI] [PubMed] [Google Scholar]
- Birkel RC, & Reppucci ND (1983). Social networks, information-seeking, and the utilization of services. American Journal of Community Psychology, 11(2), 185–205. [DOI] [PubMed] [Google Scholar]
- Blume BD, Ford JK, Baldwin TT, & Huang JL (2010). Transfer of training: A meta-analytic review. Journal of Management, 36(4), 1065–1105. [Google Scholar]
- Bond GR, Drake RE, McHugo GJ, Peterson AE, Jones AM, & Williams J (2014). Long-Term Sustainability of Evidence-Based Practices in Community Mental Health Agencies. Administration and Policy in Mental Health and Mental Health Services Research, 41(2), 228–236. 10.1007/s10488-012-0461-5 [DOI] [PubMed] [Google Scholar]
- Borgatti SP, Everett MG, & Freeman LC (2002). UCINET 6 for Windows. Harvard: Analytic Technologies, 185. [Google Scholar]
- Bradshaw T, Butterworth A, & Mairs H (2007). Does structured clinical supervision during psychosocial intervention education enhance outcome for mental health nurses and the service users they work with? Journal of Psychiatric & Mental Health Nursing, 14(1), 4–12. doi: 10.1111/j.1365-2850.2007.01021.x [DOI] [PubMed] [Google Scholar]
- Bryk A, Camburn E, & Louis KS (1999). Professional community in Chicago elementary schools: Facilitating factors and organizational consequences. Educational Administration Quarterly, 35(5), 751–781. doi: 10.1177/0013161X99355004 [DOI] [Google Scholar]
- Bryk AS, & Schneider B (2002). Trust in schools: A core resource for improvement. New York, NY: Russell Sage Foundation. [Google Scholar]
- Bunger AC, Hanson RF, Doogan NJ, Powell BJ, Cao Y, & Dunn J (2014). Can Learning Collaboratives Support Implementation by Rewiring Professional Networks? Administration and Policy in Mental Health and Mental Health Services Research, (Journal Article), 1–14. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Burt RS (1999). The social capital of opinion leaders. The Annals of the American Academy of Political and Social Science, 566(1), 37–54. [Google Scholar]
- Chambers DA, Glasgow RE, & Stange KC (2013). The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implementation Science : IS, 8, 117–117. 10.1186/1748-5908-8-117 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Chambers DA, Ringeisen H, & Hickman EE (2005). Federal, state, and foundation initiatives around evidence-based practices child and adolescent mental health. Child and Adolescent Psychiatric Clinics of North America, 14(2), 307–327. doi: 10.1016/j.chc.2004.04.006 [DOI] [PubMed] [Google Scholar]
- Chorpita BF, & Weisz JR (2009). MATCH-ADTC: Modular approach to therapy for children with anxiety, depression, trauma, or conduct problems. PracticeWise. [Google Scholar]
- Daly AJ, Moolenaar NM, Bolivar JM, & Burke P (2010). Relationships in reform: the role of teachers’ social networks. Journal of Educational Administration, 48(3), 359–391. 10.1108/09578231011041062 [DOI] [Google Scholar]
- Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, & Lowery JC (2009). Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implementation Science, 4(1). 10.1186/1748-5908-4-50 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Dochy F, Segers M, Van den Bossche P, & Gijbels D (2003). Effects of problem-based learning: A meta-analysis. Learning and Instruction, 13(5), 533–568. doi: 10.1016/S0959-4752(02)00025-7 [DOI] [Google Scholar]
- Dogan S, Pringle R, & Mesa J (2016). The impacts of professional learning communities on science teachers’ knowledge, practice and student learning: a review. Professional Development in Education, 42(4), 569–588. 10.1080/19415257.2015.1065899 [DOI] [Google Scholar]
- Gesell SB, Barkin SL, Sommer EC, Thompson JR, & Valente TW (2015). Increases in Network Ties Are Associated With Increased Cohesion Among Intervention Participants. Health Education & Behavior : The Official Publication of the Society for Public Health Education, (Journal Article). https://doi.org/1090198115599397 [pii] [DOI] [PMC free article] [PubMed] [Google Scholar]
- Georgenson DL (1982). The problem of transfer calls for partnership. Training & Development Journal, 36(10), 75–78). [Google Scholar]
- Gijbels D, Dochy F, Van den Bossche P, & Segers M (2005). Effects of problem-based learning: A meta-analysis from the angle of assessment. Review of Educational Research, 75, 27–61. doi: 10.3102/00346543075001027 [DOI] [Google Scholar]
- Grady EA, & Chen B (2006). Influences on the size and scope of networks for social service delivery. Journal of Public Administration Research and Theory, 16:4, 533–552. 10.1093/jopart/muj005 [DOI] [Google Scholar]
- Hamilton JD, & Bickman L (2008). A Measurement Feedback System (MFS) Is Necessary to Improve Mental Health Outcomes. Journal of the American Academy of Child & Adolescent Psychiatry, 47(10), 1114–1119. 10.1097/CHI.0b013e3181825af8 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hanneman RA, & Riddle M (2005). Introduction to social network methods. Available online at https://www.researchgate.net/profile/Robert_Hanneman/publication/235737492_Introduction_to_Social_Network_Methods/links/0deec52261e1577e6c000000/Introduction-to-Social-Network-Methods.pdf
- Hirsch BJ (1980). Natural support systems and coping with major life changes. American Journal of Community Psychology, 8(2), 159–172.
- Hodges K (2003). CAFAS: Manual for training coordinators, clinical administrators, and data managers. Ann Arbor, MI: Kay Hodges.
- Kothari A, Boyko JA, Conklin J, Stolee P, & Sibbald SL (2015). Communities of practice for supporting health systems change: A missed opportunity. Health Research Policy and Systems, 13. https://doi.org/10.1186/s12961-015-0023-x
- Kratochwill TR, Elliott SN, & Busse RT (1995). Behavior consultation: A five-year evaluation of consultant and client outcomes. School Psychology Quarterly, 10, 87–117. https://doi.org/10.1037/h0088299
- Kruse S, Louis KS, & Bryk A (1994). Building professional community in schools. Issues in Restructuring Schools, 6(3), 67–71.
- Lam A (2000). Tacit knowledge, organizational learning and societal institutions: An integrated framework. Organization Studies, 21(3), 487–513.
- Lomos C, Hofman RH, & Bosker RJ (2011). Professional communities and student achievement – a meta-analysis. School Effectiveness and School Improvement, 22(2), 121–148. https://doi.org/10.1080/09243453.2010.550467
- Mairs K, McNeil H, McLeod J, Prorok JC, & Stolee P (2013). Online strategies to facilitate health-related knowledge transfer: A systematic search and review. Health Information & Libraries Journal, 30(4), 261–277. https://doi.org/10.1111/hir.12048
- Marin A, & Wellman B (2011). Social network analysis: An introduction. In The SAGE Handbook of Social Network Analysis (pp. 11–25). London: Sage.
- Marsden PV (1990). Network data and measurement. Annual Review of Sociology, 16, 435–463. Retrieved from http://www.jstor.org/stable/2083277
- Marsden PV (2011). Survey methods for network data. In The SAGE Handbook of Social Network Analysis (pp. 370–388). London: Sage.
- Massatti RR, Sweeney HA, Panzano PC, & Roth D (2008). The de-adoption of innovative mental health practices (IMHP): Why organizations choose not to sustain an IMHP. Administration and Policy in Mental Health and Mental Health Services Research, 35(1–2), 50–65. https://doi.org/10.1007/s10488-007-0141-z
- Mayhew BH, & Levinger RL (1976). Size and the density of interaction in human aggregates. American Journal of Sociology, 82(1), 86–110.
- Mitchell J, Robinson P, Seiboth C, & Koszegi B (2000). An evaluation of a network for professional development in child and adolescent mental health in rural and remote communities. Journal of Telemedicine and Telecare, 6(3), 158–162. https://doi.org/10.1258/1357633001935257
- Moolenaar NM (2012). A social network perspective on teacher collaboration in schools: Theory, methodology, and applications. American Journal of Education, 119(1), 7–39.
- Moolenaar NM, Daly AJ, & Sleegers PJC (2011). Ties with potential: Social network structure and innovative climate in Dutch schools. Teachers College Record, 113(9), 1983–2017.
- Moolenaar NM, Sleegers PJC, & Daly AJ (2012). Teaming up: Linking collaboration networks, collective efficacy, and student achievement. Teaching and Teacher Education, 28(2), 251–262.
- Nadeem E, Olin SS, Hill LC, Hoagwood KE, & Horwitz SM (2013). Understanding the components of quality improvement collaboratives: A systematic literature review. Milbank Quarterly, 91(2), 354–394.
- Nadeem E, Weiss D, Olin SS, Hoagwood KE, & Horwitz SM (2016). Using a theory-guided learning collaborative model to improve implementation of EBPs in a state children’s mental health system: A pilot study. Administration and Policy in Mental Health and Mental Health Services Research, 43(6), 978–990. https://doi.org/10.1007/s10488-016-0735-4
- Nelson BB, Teutsch C, Chung PJ, & Herman A (2014). Predictors of sustained implementation of low-literacy health education programs.
- Novins DK, Green AE, Legha RK, & Aarons GA (2013). Dissemination and implementation of evidence-based practices for child and adolescent mental health: A systematic review. Journal of the American Academy of Child & Adolescent Psychiatry, 52(10), 1009–1025.
- Palinkas LA, Holloway IW, Rice E, Brown CH, Valente TW, & Chamberlain P (2013). Influence network linkages across implementation strategy conditions in a randomized controlled trial of two strategies for scaling up evidence-based practices in public youth-serving systems. Implementation Science, 8, 133. https://doi.org/10.1186/1748-5908-8-133
- Palinkas LA, Holloway IW, Rice E, Fuentes D, Wu Q, & Chamberlain P (2011). Social networks and implementation of evidence-based practices in public youth-serving systems: A mixed-methods study. Implementation Science, 6, 113.
- Paradise A (2007). State of the industry: ASTD’s annual review of trends in workplace learning and performance. Alexandria, VA: ASTD.
- Penn DL, Simpson L, Edie G, Leggett S, Wood L, Hawgood J, … De Leo D (2005). Development of ACROSSnet: An online support system for rural and remote community suicide prevention workers in Queensland, Australia. Health Informatics Journal, 11(4), 275–293.
- Peterson AE, Bond GR, Drake RE, McHugo GJ, Jones AM, & Williams JR (2014). Predicting the long-term sustainability of evidence-based practices in mental health care: An 8-year longitudinal analysis. The Journal of Behavioral Health Services & Research, 41(3), 337–346. https://doi.org/10.1007/s11414-013-9347-x
- Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, & Mandell DS (2015). Methods to improve the selection and tailoring of implementation strategies. The Journal of Behavioral Health Services & Research. https://doi.org/10.1007/s11414-015-9475-6
- Raghavan R, Bright CL, & Shadoin AL (2008). Toward a policy ecology of implementation of evidence-based practices in public mental health settings. Implementation Science, 3, 26. https://doi.org/10.1186/1748-5908-3-26
- Saks AM (2002). So what is a good transfer of training estimate? A reply to Fitzpatrick. The Industrial-Organizational Psychologist, 39(3), 29–30.
- Salas E, Tannenbaum SI, Kraiger K, & Smith-Jentsch KA (2012). The science of training and development in organizations: What matters in practice. Psychological Science in the Public Interest, 13(2), 74–101. https://doi.org/10.1177/1529100612436661
- Schoenwald SK, Brown TL, & Henggeler SW (2000). Inside multisystemic therapy: Therapist, supervisory, and program practices. Journal of Emotional and Behavioral Disorders, 8(2), 113–127. https://doi.org/10.1177/106342660000800207
- Schoenwald SK, Garland AF, Southam-Gerow MA, Chorpita BF, & Chapman JE (2011). Adherence measurement in treatments for disruptive behavior disorders: Pursuing clear vision through varied lenses. Clinical Psychology: Science and Practice, 18(4), 331–341. https://doi.org/10.1111/j.1468-2850.2011.01264.x
- Schoenwald SK, Sheidow AJ, & Chapman JE (2009). Clinical supervision in treatment transport: Effects on adherence and outcomes. Journal of Consulting and Clinical Psychology, 77(3), 410–421. https://doi.org/10.1037/a0013788
- Scott J (2000). Social network analysis: A handbook. London: Sage.
- Sheridan SM, Salmon D, Kratochwill TR, & Carrington Rotto PJ (1992). A conceptual model for the expansion of behavioral consultation training. Journal of Educational & Psychological Consultation, 3(3), 193–218. https://doi.org/10.1207/s1532768xjepc0303_1
- Smits PBA, Verbeek JHAM, & de Buisonjé CD (2002). Problem based learning in continuing medical education: A review of controlled evaluation studies. BMJ: British Medical Journal, 324(7330), 153–156. https://doi.org/10.1136/bmj.324.7330.153
- Stirman SW, Kimberly J, Cook N, Calloway A, Castro F, & Charns M (2012). The sustainability of new programs and innovations: A review of the empirical literature and recommendations for future research. Implementation Science, 7, 17.
- Swan K, & Shea P (2005). The development of virtual learning communities. In Hiltz SR & Goldman R (Eds.), Learning together online: Research on asynchronous learning networks (pp. 239–260). Mahwah, NJ: Lawrence Erlbaum Associates.
- Valente TW (2012). Network interventions. Science, 337(6090), 49–53.
- Valente TW, Palinkas LA, Czaja S, Chu KH, & Brown CH (2015). Social network analysis for program implementation. PLoS ONE, 10(6), e0131712. https://doi.org/10.1371/journal.pone.0131712
- Vernon DT, & Blake RL (1993). Does problem-based learning work? A meta-analysis of evaluative research. Academic Medicine, 68(7), 550–563. https://doi.org/10.1097/00001888-199307000-00015
- Vescio V, Ross D, & Adams A (2008). A review of research on the impact of professional learning communities on teaching practice and student learning. Teaching and Teacher Education, 24(1), 80–91. https://doi.org/10.1016/j.tate.2007.01.004
- Wasserman S, & Faust K (1994). Social network analysis: Methods and applications. Cambridge: Cambridge University Press.
- Wenger E (1998). Communities of practice: Learning, meaning, and identity. Cambridge: Cambridge University Press. https://doi.org/10.1017/CBO9780511803932
- Wiltsey Stirman S, Finley EP, Shields N, Cook J, Haine-Schlagel R, Burgess JF, … Monson C (2017). Improving and sustaining delivery of CPT for PTSD in mental health systems: A cluster randomized trial. Implementation Science, 12(1). https://doi.org/10.1186/s13012-017-0544-5
- Yousefi-Nooraie R, Dobbins M, Marin A, Hanneman R, & Lohfeld L (2015). The evolution of social networks through the implementation of evidence-informed decision-making interventions: A longitudinal analysis of three public health units in Canada. Implementation Science, 10, 166. https://doi.org/10.1186/s13012-015-0355-5