Abstract
BACKGROUND
Research on how best to deliver efficacious public health strategies in heterogeneous community and organizational contexts remains limited. Such studies require the active engagement of public health practice settings in the design, implementation and translation of research. Practice-based research networks (PBRNs) provide mechanisms for research engagement, but until now they have not been tested in public health settings.
PURPOSE
This study uses data from participants in 14 public health PBRNs and a national comparison group of public health agencies to study processes influencing the engagement of public health settings in research implementation and translation activities.
METHODS
A cross-sectional network analysis survey was fielded with participants in public health PBRNs approximately one year after network formation (n=357) and with a nationally representative comparison group of U.S. local health departments not participating in PBRNs (n=625). Hierarchical regression models were used to estimate how organizational attributes and PBRN network structures influence engagement in research implementation and translation activities.
RESULTS
Among PBRN participants, both researchers and practice agencies reported high levels of engagement in research activities. Local public health agencies participating in PBRNs were two to three times more likely than non-participating agencies to engage in research implementation and translation activities (p<0.05). Participants in less-densely connected PBRN networks and in more peripheral locations within these networks reported higher levels of research engagement, greater perceived benefits from engagement, and greater likelihood of continued participation.
CONCLUSIONS
PBRN networks can serve as effective mechanisms for facilitating research implementation and translation among public health practice settings.
Introduction
Public health programs and prevention policies remain controversial components of the nation’s health reform strategy, in large part because of uncertainties about their effectiveness in reducing disease burden and constraining growth in national health spending.2,3 Achieving meaningful health and economic benefits from investments in prevention and public health requires knowledge about which strategies actually support improved health, at what cost, and how best to deliver these strategies to the populations that can benefit from them.4 An expanding body of research-tested prevention programs and policies exists, such as those profiled in the CDC’s Guide to Community Prevention Services5, but large gaps persist in the adoption and implementation of these strategies across states and communities.6–12 Moreover, public health professionals are often called to act against health threats for which few if any evidence-based strategies exist, or to act in settings where evidence-based strategies are logistically, politically or economically infeasible. In these situations, innovations in public health practice occur but without the comparative research necessary to determine their impact and value.13
These missed opportunities for evidence-based practice and practice-based evidence emphasize the need for “delivery system research” that indicates how best to organize, finance, and deliver public health strategies in real-world practice settings.14,15 The need for delivery system research in public health is particularly acute given that public health strategies are delivered through the combined efforts of multiple governmental agencies and their private-sector and community-based counterparts, through complex relationships and using resources that vary widely across states and communities and that evolve over time.16–20 Strategies that are easily implemented in one setting often face barriers in other settings.21 Expanded delivery system research can elucidate which strategies and adaptations work best in which settings and for which populations.
Practice-Based Research Networks
Delivery system research in public health settings requires the active engagement of public health organizations in the design, implementation, and application of these studies, but historically such engagement has been limited. Data from the CDC’s National Public Health Performance Standards Program, for example, consistently indicate that state and local public health organizations are much less likely to achieve national standards in research and evaluation than in other domains of practice.10,11,22,23 Periodic national surveys of governmental public health agencies find similarly low levels of research engagement, particularly at the local level.24,25 To expand delivery system research in public health settings, the Robert Wood Johnson Foundation launched the Public Health Practice-Based Research Networks Program in 2008.26 Public health practice-based research networks (PBRNs) bring together public health agencies and academic researchers to study the organization, financing, and delivery of public health strategies in real-world practice settings, with the goal of producing actionable evidence that can be used to improve practice and policy.27
Practice-based research networks have been used in medical care research for more than three decades to support delivery system research in clinical settings.28,29 These clinical PBRNs allow community-based health care providers and their staffs to collaborate with researchers in designing, implementing, evaluating, and diffusing solutions to real-world problems in clinical practice.30,31 The experience of the PBRN model in clinical settings suggests that it may also be useful in public health settings to accelerate the production and application of evidence regarding public health delivery.27 Participating practitioners and researchers collaborate to identify pressing research questions of interest, design rigorous and relevant studies, execute research effectively, and translate findings rapidly into practice. Beginning in 2008, the Robert Wood Johnson Foundation’s Public Health PBRN Program supported the development of 12 research networks composed of local and state governmental public health agencies, community partners, and collaborating academic research institutions. These supported PBRNs are located in Colorado, Connecticut, Florida, Kentucky, Massachusetts, Minnesota, Nebraska, New York, North Carolina, Ohio, Washington, and Wisconsin. Additional public health PBRNs participate in the program as affiliate members and emerging networks under development, with the affiliate networks in Georgia, Missouri, New Jersey, and Tennessee progressing to the point of receiving research support from the PBRN Program. Counting both supported and emerging networks, public health PBRNs are currently operational in 28 states, covering more than 1000 state and local public health agencies and 35 universities across the U.S.26
This analysis examines the experience of PBRNs in engaging public health organizations in the design, implementation, and translation of delivery system research during their initial two years of development. Specifically, this analysis: (1) examines differences between academic and practitioner PBRN participants in the nature and intensity of engagement in research implementation activities; (2) compares research engagement among local public health practitioners that do and do not participate in PBRNs; and (3) assesses the influence of individual, organizational, and network characteristics on research implementation activities and experiences among PBRN participants. Results offer insight into the current and potential roles of PBRNs in expanding research implementation and translation in public health practice settings.
Methodology and Data
Study Population and Sampling
A cross-sectional, self-administered survey was validated and fielded with representatives of public health organizations that participate in one of 14 public health PBRNs. The survey was fielded approximately one year after each network formed, with five PBRNs surveyed during 2010–11, and nine PBRNs surveyed during 2011–12. A total of 357 people representing these organizations were identified by PBRN leaders as active participants in one of the 14 PBRNs, using a standard case definition of network participation that included meeting attendance and service on research teams and steering committees (see Table 1, Types of PBRN Participation). These individuals were contacted by email and asked to complete the web-based survey instrument. A total of 209 people (59%) provided usable responses to the survey, including 103 representatives from local health departments, 37 representatives from state health agencies, and 76 representatives from academic institutions.
Table 1.

| Variable | All Participants %/Mean | All Participants S.D. | Practitioners %/Mean | Practitioners S.D. | Researchers %/Mean | Researchers S.D. | Sig. |
|---|---|---|---|---|---|---|---|
| Type of organization(s) where employed (1) | | | | | | | |
| Academic/research unit | 36.4% | | 0.0% | | 100.0% | | *** |
| State government agency | 17.7% | | 24.8% | | 5.3% | | *** |
| Local government agency | 41.1% | | 63.9% | | 1.3% | | *** |
| Federal agency | 1.0% | | 1.5% | | 0.0% | | |
| Professional association | 7.7% | | 11.3% | | 1.3% | | *** |
| Community-based organization | 3.3% | | 4.5% | | 1.3% | | |
| Types of PBRN participation (1) | | | | | | | |
| Serve as key staff in PBRN | 18.7% | | 16.5% | | 22.4% | | |
| Serve on PBRN committee/workgroup | 49.8% | | 52.6% | | 44.7% | | |
| Represent a founding organization | 16.3% | | 13.5% | | 21.1% | | |
| Participate in routine meetings/calls | 64.1% | | 63.2% | | 65.8% | | |
| Participate in PBRN research projects | 51.2% | | 41.4% | | 68.4% | | *** |
| Number of staff that participate in PBRN | 3.4 | 4.8 | 3.1 | 4.8 | 3.9 | 4.7 | |
| Orientation to research vs. practice (2) | 4.7 | 1.9 | 5.4 | 1.5 | 3.5 | 2.0 | *** |
| Prior experience with research implementation (3) | | | | | | | |
| Identifying research topics | 3.4 | 1.0 | 3.2 | 1.0 | 3.9 | 0.9 | |
| Planning/designing studies | 3.1 | 1.2 | 2.7 | 1.1 | 3.7 | 1.0 | *** |
| Securing funding/grantwriting | 2.9 | 1.2 | 2.6 | 1.1 | 3.5 | 1.1 | *** |
| Implementing studies | 3.0 | 1.3 | 2.5 | 1.1 | 3.8 | 1.1 | *** |
| Disseminating study results | 3.3 | 1.1 | 3.1 | 1.1 | 3.7 | 1.1 | *** |
| Applying findings in own organization | 3.2 | 1.1 | 3.3 | 1.1 | 2.9 | 1.1 | ** |
| Helping others apply findings | 2.8 | 1.1 | 2.7 | 1.1 | 3.1 | 1.1 | *** |
| Composite experience measure | 9.8 | 11.5 | 19.6 | 6.4 | 24.2 | 5.8 | *** |
| N | 209 | | 133 | | 76 | | |

Note: differences between Practitioner and Researcher columns are statistically significant at *** p<0.01, ** p<0.05.

(1) Rows do not sum to 100% because some participants fall into multiple categories. Chi-square tests are used for statistical significance.

(2) Seven-point ordinal scale. Wilcoxon-Mann-Whitney tests are used for statistical significance.

(3) Five-point ordinal scale. Wilcoxon-Mann-Whitney tests are used for statistical significance.
A subset of survey items was included on a 2010 survey conducted by the National Association of County and City Health Officials (NACCHO) and administered to a stratified random sample of U.S. local health departments.24 Departments were classified into one of seven strata based on the size of the population served and randomly sampled without replacement using sampling rates proportional to population size, resulting in an overall sampling rate of 24% of the 2565 total departments (n=625). A total of 505 agency representatives (81%) responded to the NACCHO survey. The NACCHO survey asked the director of each local health department to complete the survey or to designate an alternative respondent with equivalent knowledge of agency activities. By comparison, the PBRN survey solicited responses from all individuals identified by PBRN leaders as active network participants, resulting in responses from agency directors in 98 of the 103 local health departments responding to the PBRN survey (95%). Both surveys were administered via the web, with respondent notification, recruitment, and follow-up conducted by email and telephone.
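For illustration, the sketch below draws a stratified random sample in the spirit of the NACCHO design described above: departments are grouped into population-size strata and sampled without replacement at stratum-specific rates. The frame, strata cut-points, and sampling rates are hypothetical placeholders rather than the actual NACCHO parameters.

```python
# Sketch of a stratified random sample of local health departments.
# All parameters below are illustrative placeholders, not NACCHO design values.
import random

random.seed(1)

# Hypothetical sampling frame: (department_id, population_served)
frame = [(f"LHD_{i:04d}", random.randint(5_000, 2_000_000)) for i in range(2565)]

cutpoints = [25_000, 50_000, 100_000, 250_000, 500_000, 1_000_000]  # 7 strata
rates = [0.15, 0.20, 0.25, 0.35, 0.50, 0.75, 1.00]                  # per-stratum rates

def stratum(population):
    """Return the index (0-6) of the population-size stratum."""
    return sum(population >= c for c in cutpoints)

sample = []
for s, rate in enumerate(rates):
    members = [dept for dept in frame if stratum(dept[1]) == s]
    k = round(rate * len(members))
    sample.extend(random.sample(members, k))  # sampling without replacement

print(f"Sampled {len(sample)} of {len(frame)} departments")
```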
Measures
Both the PBRN and NACCHO survey instruments included a common set of questions about the agency’s past and current experiences with research implementation. Developed through focus groups with PBRN leaders, the research implementation questions included eight items identified as reflecting core components of the research process: (1) convening key stakeholders; (2) identifying research topics; (3) planning and designing studies; (4) grant-writing and securing funding; (5) implementing studies through collection, analysis, and interpretation of data; (6) disseminating study results; (7) applying findings within one’s own organization; and (8) helping other organizations apply findings. For each item, a seven-point ordinal response scale measured the frequency of participation in the activity during the past 12 months, ranging from none to weekly participation. A composite measure of research implementation breadth was constructed by converting each item to a dichotomous none/any scale and calculating the proportion of items with any reported participation in the past 12 months. Similarly, a composite measure of research implementation intensity was constructed by calculating the weighted average of participation frequency across the eight items. In constructing each composite measure, a weight was assigned to each of the eight items using values from a previous expert panel study that rated the perceived importance of engaging practice settings in each research implementation activity.32
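To make the construction of the two composites concrete, a minimal sketch is shown below for a single respondent. The eight-item list follows the text, but the equal weights and example frequency codes are placeholders, not the expert-panel weights from the cited study, and the published composites are reported on different scales; the sketch shows only the logic.

```python
# Sketch of the composite breadth and intensity measures for one respondent.
# Weights are equal-weight placeholders; the study used expert-panel weights.

ITEMS = [
    "convening stakeholders", "identifying research topics",
    "planning/designing studies", "securing funding/grantwriting",
    "implementing studies", "disseminating study results",
    "applying findings in own organization", "helping others apply findings",
]
WEIGHTS = {item: 1.0 for item in ITEMS}  # placeholder weights

def implementation_breadth(responses, weights=WEIGHTS):
    """Weighted proportion of items with any participation (frequency > 0)."""
    total = sum(weights.values())
    return sum(weights[i] for i, f in responses.items() if f > 0) / total

def implementation_intensity(responses, weights=WEIGHTS):
    """Weighted average of the frequency scores across the eight items."""
    total = sum(weights.values())
    return sum(weights[i] * f for i, f in responses.items()) / total

# Example respondent: frequency coded 0 (none) through 6 (weekly)
responses = dict(zip(ITEMS, [3, 5, 0, 2, 4, 1, 6, 0]))
print(implementation_breadth(responses), implementation_intensity(responses))
```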
Additionally, the PBRN survey included questions about the types of roles played in PBRN research implementation, the frequency and types of interaction with other PBRN participants for research implementation, and the perceived benefits of PBRN participation. The survey defines PBRN participants at the organizational level based on the primary institution each individual participant represents, including local and state public health agencies, community organizations, professional associations, and academic institutions. The survey instrument provided seven-point ordinal response scales to measure the frequency of interaction between each pair of PBRN participants, ranging from none to weekly interaction. Responses for individual survey items indicated the frequency with which each PBRN participant reported working with each other participant on research implementation activities during the prior 12 months. Pilot testing and validation of the survey instrument in one PBRN confirmed a test-retest correlation coefficient of 0.84 and strong face validity of measures based on cognitive interviews conducted with 15 pilot survey respondents.
Following standard methods of network analysis, survey data were used to construct composite measures of network structure and connectedness for each PBRN and its participating organizations.33,34 In cases where multiple people from the same organization responded to the survey, their responses were averaged into a single organization-level response before constructing the network analysis measures. For each network, network density was measured as the number of interactions between all pairs of organizations in the network, as a proportion of the total possible number of interactions. Average path length was measured as the average number of organizations that lie on the shortest path connecting each pair of organizations in the network, where the shortest path is defined as the connection that passes through the fewest intermediary organizations. Network cohesion (or breadth) was measured as the sum of the reciprocal of the path lengths connecting each pair of organizations in the network. Network centralization was measured as the extent to which connections between pairs of organizations were mediated by a single influential organization in the network. Out-degree centralization, which indicates how frequently each organization reports interacting with others in the network (i.e., internal perceptions of network influence), is distinguished from in-degree centralization, which reflects how frequently others in the network report interacting with each organization (i.e., external perceptions of network influence).
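The sketch below illustrates the network-level calculations using the open-source networkx library rather than UCINET, which the study actually used. The tiny directed graph and its organization names are hypothetical, ties are treated as binary, and the centralization normalization is a simplification of the Freeman index.

```python
# Sketch of the network-level measures on a small hypothetical directed graph
# of organization-to-organization research interactions (ties already averaged
# to one response per organization). Computed with networkx, not UCINET.
import itertools
import networkx as nx

edges = [("LHD_A", "Univ_X"), ("Univ_X", "LHD_A"), ("LHD_B", "Univ_X"),
         ("State_HD", "Univ_X"), ("Univ_X", "State_HD"), ("LHD_A", "LHD_B")]
G = nx.DiGraph(edges)
n = G.number_of_nodes()

density = nx.density(G)  # observed ties as a share of n*(n-1) possible ties

# Shortest path lengths between all ordered pairs of organizations
sp = dict(nx.all_pairs_shortest_path_length(G))
pairs = list(itertools.permutations(G.nodes, 2))
reachable = [(u, v) for u, v in pairs if v in sp[u]]
avg_path_length = sum(sp[u][v] for u, v in reachable) / len(reachable)

# Cohesion: sum of reciprocal path lengths (unreachable pairs contribute zero)
cohesion = sum(1.0 / sp[u][v] for u, v in reachable)

def degree_centralization(centrality):
    """Simplified Freeman-style centralization: how much the most central
    organization dominates the rest (UCINET's exact normalization differs)."""
    c_max = max(centrality.values())
    return sum(c_max - c for c in centrality.values()) / (n - 1)

in_centralization = degree_centralization(nx.in_degree_centrality(G))
out_centralization = degree_centralization(nx.out_degree_centrality(G))

print(density, avg_path_length, cohesion, in_centralization, out_centralization)
```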
Additionally, several organization-level measures of network connectedness and influence were constructed for each PBRN participant. Organizational degree centrality was defined as the total number of connections that each organization maintained with other organizations in the network, as a percentage of the total possible connections. Out-degree centrality was distinguished from in-degree centrality in this measure, yielding both internal and external perceptions of organizational influence. Organizational betweenness centrality indicated the extent to which an organization serves as a bridge between pairs of other organizations in the network, and was computed as the number of times an organization lies on the shortest path connecting pairs of other organizations in the network, divided by the total possible number of times that this could occur in the network. All network analysis measures were calculated using UCINET software version 6.08.35
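Continuing the same hypothetical graph, the organization-level measures can be sketched as follows (again with networkx rather than UCINET; organization names are illustrative only).

```python
# Sketch of the organization-level centrality measures on the same
# hypothetical directed interaction graph used above.
import networkx as nx

edges = [("LHD_A", "Univ_X"), ("Univ_X", "LHD_A"), ("LHD_B", "Univ_X"),
         ("State_HD", "Univ_X"), ("Univ_X", "State_HD"), ("LHD_A", "LHD_B")]
G = nx.DiGraph(edges)

# Out-degree: ties an organization reports to others (internal perception);
# in-degree: ties others report to it (external perception). Both are
# expressed as a share of the n-1 possible ties.
out_degree = nx.out_degree_centrality(G)
in_degree = nx.in_degree_centrality(G)

# Betweenness: share of shortest paths between other organization pairs
# that pass through this organization.
betweenness = nx.betweenness_centrality(G, normalized=True)

for org in sorted(G.nodes):
    print(f"{org:10s}  out={out_degree[org]:.2f}  in={in_degree[org]:.2f}  "
          f"between={betweenness[org]:.2f}")
```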
Analysis
Four analytic strategies were used to examine the experience of PBRNs in engaging public health organizations in research design, implementation, and translation. First, PBRN participants were stratified into two groups based on whether their primary employment was in an academic or research organization (researchers) or in a public health practice organization (practitioners). The types and intensities of research engagement were compared across these groups using chi-square tests for categorical measures and Wilcoxon-Mann-Whitney nonparametric tests for ordinal and interval measures. Second, PBRN participants from local public health agencies were compared to the NACCHO national sample of local public health agency respondents who did not participate in PBRNs to examine differences in research engagement, again using chi-square and Wilcoxon-Mann-Whitney tests. Third, measures of network structure and connectedness were compared across the 14 PBRNs and across five types of participating organizations to examine variation in patterns of interaction for research implementation. Finally, multivariate generalized estimating equations (GEE) were used to estimate the influence of individual, organizational, and network characteristics on research implementation activities and experiences among PBRN participants, controlling for the clustering of participants within networks.
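As an illustration of the fourth step, the sketch below fits a GEE model for one outcome using statsmodels, with an exchangeable working correlation to account for the clustering of participants within networks. The file name, variable names, and reference category are hypothetical stand-ins for the study's actual analysis variables; each of the four outcomes in Table 5 would be fit with its own model of this form.

```python
# Sketch of a GEE regression of one research-implementation outcome on
# individual, organizational, and network covariates, with participants
# clustered within PBRNs. All variable and file names are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("pbrn_participants.csv")  # hypothetical analysis file

model = smf.gee(
    "implementation_breadth ~ prior_experience + research_orientation + "
    "pbrn_duration + C(org_type, Treatment('academic')) + network_density + "
    "network_centralization + degree_centrality + betweenness_centrality",
    groups="pbrn_id",                         # cluster: participants within networks
    data=df,
    family=sm.families.Gaussian(),            # continuous composite outcome
    cov_struct=sm.cov_struct.Exchangeable(),  # assumed working correlation
)
result = model.fit()
print(result.summary())
```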
Results
Research Engagement among PBRN Participants
Approximately 40% of the 209 responding PBRN participants worked in local public health agencies, compared to 36% from academic/research organizations and 18% from state health agencies (Table 1). As expected, researchers reported more prior experience with research implementation activities than did practitioners, and researchers rated themselves as more oriented to these types of activities on the research-to-practice continuum than did their counterparts in practice settings. Overall, both researchers and practitioners reported high levels of engagement in PBRN research design and implementation activities over the prior 12 months, with 94% of practitioners and 97% of researchers reporting engagement in identifying PBRN research topics, and 77% and 96% reporting involvement in implementing data collection and analysis (Table 2). However, the composite measures of breadth and intensity of involvement in research implementation activities were moderately higher among researchers than among practitioners (p<0.05). Both researchers and practitioners reported high levels of alignment between PBRN research priorities and their own interests.
Table 2.

| Variable | All Participants %/Mean | All Participants S.D. | Practitioners %/Mean | Practitioners S.D. | Researchers %/Mean | Researchers S.D. | Sig. |
|---|---|---|---|---|---|---|---|
| Duration of PBRN involvement (years) | 1.8 | 1.6 | 2.0 | 1.5 | 1.8 | 1.4 | |
| Any involvement in research implementation - past 12 months (1) | | | | | | | |
| Convening stakeholders | 80.7% | | 77.0% | | 86.6% | | |
| Identifying research topics | 94.9% | | 93.6% | | 97.0% | | |
| Planning/designing studies | 86.3% | | 78.9% | | 98.5% | | *** |
| Securing funding/grantwriting | 81.9% | | 72.8% | | 97.0% | | *** |
| Implementing studies | 84.0% | | 77.2% | | 95.5% | | *** |
| Disseminating study results | 87.9% | | 82.5% | | 97.0% | | *** |
| Applying findings in own organization | 82.4% | | 85.1% | | 77.6% | | |
| Helping others apply findings | 81.2% | | 74.3% | | 92.5% | | *** |
| Composite measure: implementation breadth | 86.0 | 25.2 | 81.8 | 28.6 | 93.0 | 16.5 | *** |
| Intensity of involvement in research implementation - past 12 months (2) | | | | | | | |
| Convening stakeholders | 3.0 | 1.4 | 2.8 | 1.5 | 3.3 | 1.4 | |
| Identifying research topics | 3.3 | 1.1 | 3.2 | 1.0 | 3.6 | 1.1 | *** |
| Planning/designing studies | 2.9 | 1.3 | 2.5 | 1.2 | 3.6 | 1.2 | *** |
| Securing funding/grantwriting | 2.9 | 1.4 | 2.5 | 1.4 | 3.7 | 1.2 | *** |
| Implementing studies | 3.0 | 1.4 | 2.6 | 1.3 | 3.7 | 1.3 | *** |
| Disseminating study results | 3.2 | 1.3 | 2.9 | 1.3 | 3.8 | 1.2 | *** |
| Applying findings in own organization | 2.9 | 1.3 | 3.1 | 1.3 | 2.6 | 1.3 | ** |
| Helping others apply findings | 2.7 | 1.2 | 2.5 | 1.2 | 3.0 | 1.2 | *** |
| Composite measure: implementation intensity | 20.7 | 11.3 | 18.7 | 10.8 | 24.1 | 11.3 | *** |
| Alignment of PBRN research with own interests (2) | 4.9 | 1.6 | 4.6 | 1.8 | 5.3 | 1.2 | ** |
| PBRN consideration given to your ideas (2) | 3.8 | 1.6 | 3.8 | 1.6 | 3.9 | 1.6 | |
| Perceived benefit of PBRN participation (2) | 4.1 | 0.9 | 4.0 | 0.9 | 4.4 | 0.8 | *** |
| Likelihood of continuing PBRN participation (3) | 6.2 | 1.2 | 6.0 | 1.3 | 6.5 | 0.8 | ** |
| N | 209 | | 133 | | 76 | | |

Note: differences between Practitioner and Researcher columns are statistically significant at *** p<0.01, ** p<0.05.

(1) Chi-square tests are used to assess statistical significance.

(2) Five-point ordinal scale. Wilcoxon-Mann-Whitney tests are used to assess statistical significance.

(3) Seven-point ordinal scale. Wilcoxon-Mann-Whitney tests are used to assess statistical significance.
PBRN Participants Compared to Non-Participants
Local public health agencies that participate in PBRNs reported markedly higher levels of engagement in research implementation activities than a national sample of agencies not participating in PBRNs (Table 3). PBRN participants were more than three times as likely as nonparticipants to engage in identifying research topics, and more than five times as likely to engage in planning and designing studies (p<0.01). The mean composite measure of research implementation was 2.8 times larger among PBRN participants than among non-participants. These large differences in research implementation persisted after adjusting for differences in agency expenditures, population size of jurisdiction, per capita income in jurisdiction, and rural/urban location.
Table 3.

| Variable | PBRN Agencies Percent/Mean (S.D.) | National Sample Percent/Mean (S.D.) | Sig. |
|---|---|---|---|
| Identifying research topics | 94.1% | 27.5% | *** |
| Planning/designing studies | 81.6% | 15.8% | *** |
| Implementing recruitment, data collection & analysis | 79.6% | 50.3% | ** |
| Disseminating study results | 84.5% | 36.6% | ** |
| Applying findings in own organization | 87.4% | 32.1% | ** |
| Helping others apply findings | 76.5% | 18.0% | *** |
| Research implementation composite measure | 84.04 (27.38) | 30.20 (31.38) | ** |
| N | 103 | 505 | |

Note: *** p<0.01, ** p<0.05.
Patterns of Interaction within Networks
The 14 PBRNs exhibited considerable variation in network structure and connectedness, indicating broad heterogeneity in the patterns of interaction among participating researchers and practitioners (Table 4). The density of connections among PBRN participants ranged from a low of 14% in Colorado to a high of 93% in Washington. The degrees of cohesion and average distance between participants, however, were much less variable across networks and indicated relatively low levels of fragmentation within PBRNs. Network centralization (in-degree) was more than twice as high in the Florida PBRN as in the Ohio PBRN, and average betweenness centrality was more than 20 times greater in the Colorado network than in the Missouri network. These measures indicate a higher reliance on centralized “hub” organizations in some PBRNs, while other networks rely more heavily on mediating “bridge” organizations to facilitate research interaction. Across all networks, local and state public health agencies had significantly lower levels of betweenness centrality than did academic/research organizations (Table 4), indicating the relatively peripheral positions of practice-based agencies within their networks.
Table 4.

| Network | N | Density | Breadth | Average Distance | In-Degree Centralization | Out-Degree Centralization | In-Degree Centrality | Out-Degree Centrality | Betweenness |
|---|---|---|---|---|---|---|---|---|---|
| Colorado | 46 | 0.14 | 0.50 | 2.24 | 9.33 | 28.23 | 3.95 (3.89) | 5.43 (9.53) | 20.03 (52.06) |
| Connecticut | 62 | 0.33 | 0.35 | 1.69 | 4.88 | 44.97 | 5.39 (1.48) | 5.39 (12.57) | 0.60 (3.91) |
| Florida | 23 | 0.74 | 0.16 | 1.31 | 20.00 | 22.95 | 15.17 (6.94) | 17.70 (17.81) | 1.14 (2.89) |
| Georgia | 14 | 0.39 | 0.37 | 1.80 | 11.48 | 60.37 | 12.62 (6.82) | 14.05 (17.94) | 1.29 (4.03) |
| Kentucky | 43 | 0.25 | 0.42 | 1.98 | 10.92 | 29.23 | 5.86 (5.17) | 5.86 (9.6) | 14.23 (34.71) |
| Massachusetts | 18 | 0.48 | 0.30 | 1.65 | 11.50 | 45.50 | 18.89 (10.24) | 20.42 (23.62) | 1.94 (4.65) |
| Minnesota | 19 | 0.33 | 0.50 | 1.76 | 12.76 | 26.48 | 11.93 (7.63) | 13.82 (15.24) | 1.84 (6.15) |
| Missouri | 21 | 0.66 | 0.20 | 1.40 | 7.83 | 65.92 | 14.46 (2.36) | 19.58 (24.34) | 0.00 (0.00) |
| North Carolina | 26 | 0.71 | 0.21 | 1.28 | 12.99 | 43.69 | 20.25 (8.17) | 21.53 (22.2) | 2.63 (4.88) |
| Nebraska | 32 | 0.22 | 0.47 | 1.93 | 9.51 | 62.73 | 8.46 (5.84) | 10.22 (14.92) | 0.47 (2.65) |
| New York | 23 | 0.61 | 0.22 | 1.45 | 9.58 | 42.17 | 13.42 (4.99) | 21.25 (22.47) | 1.74 (2.55) |
| Ohio | 67 | 0.46 | 0.29 | 1.59 | 7.19 | 41.96 | 7.70 (3.4) | 8.59 (11.87) | 5.08 (10.39) |
| Washington | 20 | 0.93 | 0.07 | 1.14 | 17.58 | 45.60 | 33.43 (10.28) | 37.14 (22.8) | 0.85 (2.52) |
| Wisconsin | 30 | 0.59 | 0.26 | 1.40 | 8.03 | 36.41 | 8.89 (3.46) | 10.73 (13.73) | 0.62 (1.44) |
| Mean Values by Organization Type | | | | | | | | | |
| Local agencies | 218 | 0.44 (0.21) | 0.32 (0.12) | 1.66 (0.28) | 9.70 (4.39) | 41.08 (11.01) | 10.00 (8.29) | 10.35 (16.17) | 2.91 (13.81) |
| State agencies | 44 | 0.47 (0.21) | 0.33 (0.13) | 1.63 (0.3) | 10.84 (3.07) | 42.29 (12.49) | 15.62 (8.51) | 19.87 (20.7) | 3.09 (7.85) |
| Academic units | 115 | 0.45 (0.21) | 0.32 (0.12) | 1.67 (0.31) | 9.72 (3.21) | 41.89 (12.56) | 11.17 (8.54) | 15.87 (17.75) | 11.54 (36.02) |
| Associations | 31 | 0.45 (0.22) | 0.33 (0.12) | 1.68 (0.32) | 10.27 (2.82) | 42.42 (11.18) | 10.64 (9.97) | 8.89 (16.15) | 1.87 (8.62) |
| Other | 36 | 0.40 (0.19) | 0.35 (0.11) | 1.75 (0.31) | 9.57 (2.75) | 38.98 (13.03) | 6.65 (8.37) | 7.42 (18.4) | 0.77 (4.5) |

NOTE: Density, Breadth, Average Distance, and Centralization are network-level measures; Centrality and Betweenness are organization-level means. Standard deviations are shown in parentheses.
Factors Associated with Research Implementation Experiences
Multivariate estimates indicated that the research implementation experiences of PBRN participants varied significantly with selected individual, organizational, and network characteristics (Table 5). At the individual level, participants’ prior research experience was strongly and positively associated with the breadth and intensity of engagement in PBRN research implementation and with the likelihood of future PBRN participation, after controlling for other factors (p<0.05). An individual’s duration of participation in the PBRN, however, was not associated with any of the four research implementation experience measures examined. At the organizational level, participants from local public health agencies reported significantly lower breadth of engagement in research implementation and significantly lower perceived benefits of engagement compared to participants from other types of organizations (p<0.05).
Table 5.

| Variable | Implementation Breadth | Implementation Intensity | Perceived Benefit | Future Participation |
|---|---|---|---|---|
| Prior research experience | 0.250 (0.099)** | 0.252 (0.093)** | 0.005 (0.009) | 0.043 (0.018)** |
| Research-practice orientation | −0.481 (0.308) | 0.042 (0.285) | 0.007 (0.044) | 0.130 (0.076) |
| Duration of PBRN participation | 0.001 (0.001) | −0.001 (0.001) | 0.000 (0.000) | 0.000 (0.000) |
| Type of organization (academic=reference) | | | | |
| Local government | −4.836 (1.380)*** | −0.785 (1.231) | −0.465 (0.121)*** | −0.611 (0.162)*** |
| State government | −0.924 (1.998) | −0.739 (1.384) | −0.186 (0.132) | −0.500 (0.271) |
| Association | −1.173 (1.614) | 0.488 (1.420) | −0.193 (0.295) | −0.310 (0.296) |
| Other | −4.658 (2.536) | −2.201 (1.670) | −0.236 (0.530) | −0.213 (0.700) |
| Network density | −10.909 (4.242)** | −2.980 (3.234) | −0.582 (0.398) | −1.105 (0.464)** |
| Network centralization | 0.330 (0.247) | 0.209 (0.166) | −0.005 (0.021) | −0.002 (0.024) |
| Organization degree centrality | 0.179 (0.048)** | 0.060 (0.029)** | 0.013 (0.004)*** | 0.021 (0.007)** |
| Organization betweenness | −0.031 (0.020) | −0.028 (0.007)** | −0.004 (0.001)** | −0.003 (0.001)** |
| Constant | 18.405 (2.772)*** | 5.311 (2.614)** | 3.923 (0.410)*** | 4.858 (0.630)*** |
| F | 30.970 | 5.370 | 58.530 | 16.600 |
| R square | 0.323 | 0.144 | 0.138 | 0.194 |
| N | 174 | 174 | 174 | 174 |

Note: Cells show regression coefficients with standard errors in parentheses; regression models adjust for clustering of participants within networks. *** p<0.01, ** p<0.05.
Regarding network characteristics, the density of the PBRN network was negatively associated both with the breadth of research activities implemented by PBRN participants and with participants’ likelihood of future participation in PBRN research (p<0.05). Organizations having a larger volume of connections to other PBRN participants, as indicated by their out-degree centrality, reported higher breadth and intensity of research implementation, as well as higher perceived benefits and likelihood of future PBRN participation (p<0.05). Conversely, the betweenness centrality of participating PBRN organizations was inversely associated with research implementation experiences, indicating that organizations located in the periphery of their networks engaged more intensively in PBRN research implementation and experienced larger benefits from this engagement, compared to organizations occupying intermediary positions within PBRN networks.
Discussion
PBRNs have experienced notable successes in convening broad networks of researchers and practitioners from public health settings and engaging these stakeholders in research implementation and translation activities during their initial years of development. This success appears particularly notable among local public health agencies, which historically have had very low rates of research engagement despite the central roles they play in U.S. public health delivery. Local agencies represented the largest single component of public health PBRN participants in this study, and their research engagement extended beyond ancillary roles in study recruitment and data collection to include substantive roles in identifying research priorities, designing and implementing studies, and applying study findings to practice. In particular, PBRN participants from local agencies were more likely than all other types of participants to report applying research findings within their own organizations, reflecting a key research translation goal of the PBRN model.
Local public health agencies that participate in PBRNs reported rates of engagement in research implementation and translation activities that far exceeded the rates observed among a nationally representative sample of agencies that do not participate in PBRNs, often by more than 200 percent. These differences may reflect, at least in part, the success of PBRNs in selecting and attracting agencies with the motivation, skills, and resources to conduct research. However, the relatively low levels of prior research experience reported by participating local public health agencies suggest that PBRNs achieve their success in research engagement not only through selection but also through facilitation and capacity-building. The cross-sectional, observational design of this study precludes a definitive determination of how much of the PBRNs’ success with research engagement is attributable to selection versus facilitation and capacity-building, but both mechanisms are likely to be beneficial in promoting practice-based research.
The 14 PBRNs examined in this study varied considerably in their composition and patterns of interaction, and multivariate results suggested that these structural features have implications for the research experiences and benefits that accrue to PBRN participants. Overall, participants from local public health agencies reported lower levels of research engagement and lower perceived benefits compared to participants from other types of organizations, indicating a need for targeted approaches to improve the research experiences of local public health agencies. Moreover, this study finds more positive research experiences among lower-density PBRN networks, among highly connected organizations within networks, and among organizations located in the periphery of their networks. Collectively, these findings suggest that the benefits of PBRN participation do not necessarily accrue through the efficient exchange of information that dense networks provide; rather, benefits accrue through connections to diverse network participants who contribute novel ideas, resources and perspectives to the research process. Moreover, PBRN participants in the core and the periphery of their networks appear to benefit more than those in the middle. These findings suggest that intermediary organizations – those serving as bridges between otherwise unconnected components of a network – may require targeted approaches to support and improve their research experiences. The strong association between prior research experience and current perceived benefits of PBRN participation suggests that PBRN involvement may become self-reinforcing as more organizations build research capacity through the networks.
In light of these findings, several strategies are likely to be important for the continued development of public health PBRNs and the utility of the evidence they produce. First, PBRNs should seek to expand the number and diversity of practice settings included in their networks, adding more peripheral organizations and reducing their reliance on small numbers of densely connected organizations with long-standing partnerships. This type of growth also will enhance PBRN capacity to implement large-scale research projects that provide more definitive empirical evidence (stronger internal validity) and that generalize or transfer to a wider array of public health practice settings (stronger external validity). Second, PBRNs should seek to enhance participation incentives and supports for local public health agencies and intermediary organizations, which appear most vulnerable to attrition over time. Such supports may include targeted financial and technical assistance, enhanced access to novel information and research findings, and expanded public recognition through publications, professional meetings, awards, and accreditation and credentialing programs.
The PBRNs in this study are still early in their development and focus primarily on conducting small-scale, descriptive, and comparative studies of public health delivery. Whether the active patterns of research engagement observed here will persist as networks mature toward more complex and resource-intensive studies remains to be seen. Additionally, the findings necessarily rely on respondent self-selection and therefore likely reflect the experiences of the most active and motivated PBRN participants; the conclusions may not generalize to less-engaged participants who did not respond to the survey. The intriguing but complex findings concerning PBRN network structures and perceived benefits highlight the need for more granular, qualitative studies of network dynamics. Nevertheless, this study suggests that PBRN networks can serve as effective mechanisms for facilitating research design, implementation, and translation in real-world public health practice settings. As such, they offer important laboratories for helping the public health system learn how best to deliver strategies that improve population health.
Acknowledgements
This research was funded by the Robert Wood Johnson Foundation (Grant #64676). Glen Mays also was supported through a Clinical and Translational Science Award from the National Institutes of Health (Award UL1TR000117). Anna Hoover also was supported through the National Institute of Environmental Health Sciences Superfund Research Program (Award P42 ES007380). Contents are solely the responsibility of the authors and do not necessarily represent the official views of the NIH.
Contributor Information
Glen P. Mays, Department of Health Services Management, College of Public Health, University of Kentucky, Lexington, KY.
Rachel A. Hogg, Department of Health Services Management, College of Public Health, University of Kentucky, Lexington, KY.
Doris M. Castellanos-Cruz, Department of Health Services Management, College of Public Health, University of Kentucky, Lexington, KY.
Anna G. Hoover, Department of Health Services Management, College of Public Health, University of Kentucky, Lexington, KY.
Lizeth C. Fowler, Department of Health Services Management, College of Public Health, University of Kentucky, Lexington, KY.
REFERENCES
1. U.S. Centers for Disease Control and Prevention. Chronic diseases: the power to prevent, the call to control. Atlanta, GA: CDC; 2009.
2. Rosenbaum S. The Patient Protection and Affordable Care Act: implications for public health policy and practice. Public Health Rep. 2011;126:130–135. doi: 10.1177/003335491112600118.
3. Haberkorn J. Health Policy Brief: The Prevention and Public Health Fund. Health Affairs. 2012 Feb 23.
4. Mays GP, Scutchfield FD. Advancing the science of delivery: public health services and systems research. J Public Health Manag Pract. 2012;18:481–484. doi: 10.1097/PHH.0b013e31826833ad.
5. Briss PA, Brownson RC, Fielding JE, Zaza S. Developing the Guide to Community Preventive Services: lessons learned about evidence-based public health. Annu Rev Public Health. 2004;25:281–302. doi: 10.1146/annurev.publhealth.25.050503.153933.
6. Mays GP, Smith SA, Ingram RC, Racster LJ, Lamberth CD, Lovely ES. Public health delivery systems: evidence, uncertainty, and emerging research needs. Am J Prev Med. 2009;36:256–265. doi: 10.1016/j.amepre.2008.11.008.
7. Mays GP, Halverson PK, Baker EL, Stevens R, Vann JJ. Availability and perceived effectiveness of public health activities in the nation's most populous communities. Am J Public Health. 2004;94:1019–1026. doi: 10.2105/ajph.94.6.1019.
8. Turnock BJ, Handler A, Hall W, Potsic S, Nalluri R, Vaughn EH. Local health department effectiveness in addressing the core functions of public health. Public Health Rep. 1994;109:653–658.
9. Turnock BJ, Handler AS, Miller CA. Core function-related local public health practice effectiveness. J Public Health Manag Pract. 1998;4:26–32. doi: 10.1097/00124784-199809000-00005.
10. Mays GP, McHugh MC, Shim K, et al. Identifying dimensions of performance in local public health systems: results from the National Public Health Performance Standards Program. J Public Health Manag Pract. 2004;10:193–203. doi: 10.1097/00124784-200405000-00003.
11. Mays GP, McHugh MC, Shim K, et al. Institutional and economic determinants of public health system performance. Am J Public Health. 2006;96:523–531. doi: 10.2105/AJPH.2005.064253.
12. Lurie N, Wasserman J, Stoto M, et al. Local variation in public health preparedness: lessons from California. Health Aff (Millwood). 2004;Suppl Web Exclusives:W4-341-53. doi: 10.1377/hlthaff.w4.341.
13. Mays GP, Halverson PK, Riley W, Honoré P, Scutchfield FD. Accelerating the production and application of evidence for public health system improvement: the search for new frontiers. Frontiers in Public Health Services and Systems Research. 2012;1:1–4.
14. Scutchfield FD, Mays GP, Lurie N. Applying health services research to public health practice: an emerging priority. Health Serv Res. 2009;44:1775–1787. doi: 10.1111/j.1475-6773.2009.01007.x.
15. Green LW. Public health asks of systems science: to advance our evidence-based practice, can you help us get more practice-based evidence? Am J Public Health. 2006;96:406–409. doi: 10.2105/AJPH.2005.066035.
16. Mays GP, Scutchfield FD, Bhandari MW, Smith SA. Understanding the organization of public health delivery systems: an empirical typology. Milbank Q. 2010;88:81–111. doi: 10.1111/j.1468-0009.2010.00590.x.
17. Ingram RC, Scutchfield FD, Mays GP, Bhandari MW. The economic, institutional, and political determinants of public health delivery system structures. Public Health Rep. 2012;127:208–215. doi: 10.1177/003335491212700210.
18. Institute of Medicine of the National Academy of Sciences. The Future of Public Health. Washington, DC: National Academy Press; 1988.
19. Institute of Medicine of the National Academy of Sciences. The Future of the Public's Health in the 21st Century. Washington, DC: National Academies Press; 2002.
20. Mays GP, Scutchfield FD. Improving public health system performance through multiorganizational partnerships. Prev Chronic Dis. 2010;7:A116.
21. Brownson RC, Ballew P, Dieffenderfer B, et al. Evidence-based interventions to promote physical activity: what contributes to dissemination by state health departments. Am J Prev Med. 2007;33:S66–S73; quiz S4-8. doi: 10.1016/j.amepre.2007.03.011.
22. Corso LC, Landrum LB, Lenaway D, Brooks R, Halverson PK. Building a bridge to accreditation--the role of the National Public Health Performance Standards Program. J Public Health Manag Pract. 2007;13:374–377. doi: 10.1097/01.PHH.0000278030.41573.c2.
23. Corso LC, Wiesner PJ, Halverson PK, Brown CK. Using the essential services as a foundation for performance measurement and assessment of local public health systems. J Public Health Manag Pract. 2000;6:1–18. doi: 10.1097/00124784-200006050-00003.
24. National Association of County and City Health Officials. 2010 National Profile of Local Health Departments. Washington, DC: National Association of County and City Health Officials (NACCHO); 2011.
25. Shah GH, Lovelace K, Mays GP. Diffusion of practice-based research in local public health: what differentiates adopters from nonadopters? J Public Health Manag Pract. 2012;18:529–534. doi: 10.1097/PHH.0b013e3182602e5b.
26. Mays GP, Hogg RA. Expanding delivery system research in public health settings: lessons from practice-based research networks. J Public Health Manag Pract. 2012;18:485–498. doi: 10.1097/PHH.0b013e31825f75c9.
27. Mays GP. Leading improvement through inquiry: practice-based research networks in public health. Leadership in Public Health. 2011;9:1–3.
28. Westfall JM, Mold J, Fagnan L. Practice-based research--"Blue Highways" on the NIH roadmap. JAMA. 2007;297:403–406. doi: 10.1001/jama.297.4.403.
29. Mold JW, Peterson KA. Primary care practice-based research networks: working at the interface between research and quality improvement. Ann Fam Med. 2005;3(Suppl 1):S12–S20. doi: 10.1370/afm.303.
30. Thomas P, Griffiths F, Kai J, O'Dwyer A. Networks for research in primary health care. BMJ. 2001;322:588–590. doi: 10.1136/bmj.322.7286.588.
31. Thomas P, Kai J, O'Dwyer A, Griffiths F. Primary care groups and research networks: opportunities for R&D in context. Br J Gen Pract. 2000;50:91–92.
32. Hogg RA, Mays GP. Developing a composite measure of public health research implementation and translation. Frontiers in Public Health Services and Systems Research. 2013. Forthcoming.
33. Easley D, Kleinberg J. Networks, Crowds and Markets: Reasoning about a Highly Connected World. New York: Cambridge University Press; 2010.
34. Valente TW. Social Networks and Health. New York: Cambridge University Press; 2010.
35. Borgatti SP, Everett MG, Freeman LC. UCINET 6 for Windows: Software for Social Network Analysis. Cambridge, MA: Harvard University and Analytic Technologies; 2002.