Abstract
Objective
The goal of this research was to evaluate changes over time in the capacity of participants in the CDC/ASPH Institute for HIV Prevention Leadership (Institute), a capacity-building program for HIV prevention program managers in minority-based community-based organizations. Capacity was defined as the application of new skills and knowledge to participants’ jobs and confidence in using those new skills and knowledge to strategically manage and apply “best practices” to their HIV prevention activities.
Methods
This longitudinal study measured scholar capacity at three points in time: pre-Institute, post-Institute, and 6 months’ post-Institute. Only responses from participants who completed all three surveys are included in this final analysis of the data (N = 94).
Results
Results indicate that participants from 3 years of the Institute (2002–2004) increased their capacity in HIV prevention programming and strategic planning and management. Significant changes were seen in the frequency and self-efficacy with which participants conduct several HIV prevention programming activities. Participants also reported conducting strategic planning activities at more appropriate intervals and were significantly more confident in conducting these activities.
Conclusion
The Institute has positively and significantly increased the capacity of participants to conduct more effective HIV prevention programs on a national level.
Keywords: capacity-building, evaluation, HIV prevention
The purpose of this article is to describe evaluation measures for a capacity-building program for HIV prevention program managers and discuss how the Institute for HIV Prevention Leadership (Institute) curriculum may have increased community-based organization (CBO) capacity. HIV is an important public health problem that disproportionately affects minority communities. It is estimated that between 1,039,000 and 1,185,000 persons were living with HIV in the United States by the end of 2003.1 The AIDS epidemic is not evenly distributed across the population. From 1999 to 2003, AIDS cases increased among blacks, Hispanics, women, and young adults. Blacks accounted for 50 percent of all HIV/AIDS cases diagnosed in 2003.2 In addition, although Latinos account for 13 percent of the population, they comprise 19 percent of new AIDS cases.3 As well, the number of persons exposed to HIV/AIDS through heterosexual contact increased between 1999 and 2003.2
Understanding the social, cultural, and economic context of HIV infection is critical to developing effective prevention programs, and programs developed at the local level should more appropriately capture the linguistic, cultural, and social norms of a targeted community. For this reason, CBOs—one of the principal frontline providers of HIV/AIDS services and prevention programs since the emergence of the epidemic in 1981—continue to hold special promise for reaching communities at risk for HIV, especially minority communities.4 More than 1,450 CBOs have received funding from the Centers for Disease Control and Prevention (CDC) and have provided more than 3,000 HIV prevention programs in the United States and its territories.5 Community involvement in defining and solving health problems is vital to the success of HIV/AIDS CBOs, especially those that serve racial, ethnic, and/or sexual minority populations.6–8 Thus, building community capacity or enhancing CBOs’ capacity to address HIV/AIDS within communities hardest hit by the epidemic has become a central focus of current HIV prevention activities.9–10
HIV prevention capacity building is defined by the CDC as “a process by which individuals, organizations, and communities develop abilities to enhance and sustain HIV prevention efforts” with the goal of “fostering self-sufficiency and the self-sustaining ability” of CBOs “to improve HIV prevention programs, processes, and outcomes.” CDC capacity-building efforts are currently focused in four areas: strengthening organizational infrastructure; enhancing HIV prevention interventions; mobilizing communities for HIV prevention; and strengthening HIV prevention community planning.9
One project that the CDC currently supports to build the capacity of local HIV prevention program managers who work in CBOs is the Institute. In 1998, the CDC funded a needs assessment of the training needs and preferences of HIV prevention program managers working in CBOs. Those data revealed domains of instruction and supported a capacity-building format whereby trainees would be sent to one national training site several times and be engaged for a 9-month period.11 The Institute was formed on this model and offered its first capacity-building programs to a national cohort of professionals from January 2000 to September 2000. For more on the curriculum development process and the curriculum itself, see Richter et al.11,12
Methods
Design and measures
This longitudinal study measured scholar capacity at three points in time: pre-Institute, post-Institute, and 6 months’ post-Institute. The Scholar Capacity Survey was used to measure scholar and organizational capacity. For the purposes of the Institute’s curriculum and evaluation, capacity was defined as a set of knowledge and skills necessary for managing, planning, implementing, and evaluating effective HIV prevention programs. Capacity was measured across the following skill-building domains: HIV prevention practice (assessment, the use of health behavioral theories, program implementation, and evaluation), the use of strategic planning and human resources techniques, and the use of processes such as advocacy and social marketing to support HIV prevention practice. The Scholar Capacity Survey also incorporated measures of CBO learning environment and culture, as a supportive environment was found to be critical to capacity building.13,14 Capacity in using electronic communication and information technology applications, although included in the Institute curriculum, was measured using more traditional learner assessment methods, and those results are not included here.
The development of the Scholar Capacity Survey instrument was conducted collaboratively, in consultation with an advisory body and via an iterative process. The advisory body consisted of Institute staff and public health faculty and included both evaluation and community HIV prevention experts. Together, this body drafted the survey instrument and then suggested further modifications and recommended changes. These changes were incorporated into the instrument by Institute staff, and the instrument was sent for review again until all modifications were agreed upon. The goal of this iterative process was to link Institute content and capacity-building strategies within an evaluative framework.15 More on the Institute’s overall evaluation framework can be found in Richter et al.12 Sample questions from the Scholar Capacity Survey are listed by domain below and reflect the content of the Institute curriculum.
HIV Prevention Practice
Assessing surveillance data
Assessing attitudes, beliefs, risk, and protective factors of a priority population at the individual, interpersonal, community and environmental levels
Setting program goals that seek to reduce the priority populations’ rates of HIV/AIDS cases
Setting program objectives that directly impact the priority populations’ attitudes, beliefs, and knowledge about HIV, and behavioral and nonbehavioral risk and protective factors
Using behavioral theory in planning HIV prevention programs
Developing interventions at individual, group, community, and institutional/systems levels
Collecting process evaluation data to determine stakeholder involvement, representation of priority population, fidelity to implementation plan, and participant value of an intervention
Collecting outcome data on participant attitudes, beliefs, and knowledge of HIV, and behavioral and nonbehavioral changes to risk and protective factors
Collecting impact evaluation data on HIV infection and quality of life of priority populations
Strategic Planning and Human Resources Management
Reviewing agency’s mission statement
Conducting environmental scan and situational analysis of the organization
Involving key stakeholders in the strategic planning process
Prioritizing a strategic issue and developing an action plan for a strategic issue
Conducting human resources management activities
Processes That Support HIV Prevention Practice
Meeting with elected and nonelected officials to discuss important HIV-related legislation and/or policies
Becoming involved with policy development and policy advocacy
Participating in HIV-related policy task forces
Developing social marketing-based approaches to HIV prevention
The CBO Environment/Learning Culture
Attending trainings
Sharing knowledge and skills learned at trainings with others in the CBO
Seeking advice from experts
Assessing and providing feedback on internal HIV prevention activities and policies
Having formal and informal mechanisms to share new information and get feedback on new ideas
Having the tools and technology to support information exchange
For all questions related to HIV prevention programming and the processes that support HIV prevention practice, scholars were asked to rate the frequency with which they performed certain tasks or actions by selecting a score from 7 (always) to 1 (never). To measure the frequency with which strategic planning activities and human resources management activities were performed, scholars were asked to select the interval (ie, once every 3 months, once a year) that best represented how often they conducted each activity at their CBO. Scholars were also asked to rate their level of confidence in their ability to perform these same tasks and actions in all domains. A 7-point scale was used, ranging from 7 (completely confident) to 1 (not at all confident). To measure organizational learning environment, scholars were asked to rate the frequency with which certain activities occurred within their CBOs. Again, a 7-point scale was used, ranging from 7 (always) to 1 (never). To measure organizational learning culture, scholars were asked to rate their level of agreement, ranging from 7 (completely agree) to 1 (completely disagree), with statements about their CBOs’ learning culture.
Scholars completed the first measurement (T1) on the first day of week 1 of the Institute. The post-Institute measurement (T2) was conducted onsite at the Institute at the end of the last day of instruction (N = 105). For the third measurement, T3, these same scholars from the Institute were mailed a copy of the survey with a self-addressed return envelope 6 months’ post-Institute completion. Only responses from scholars who completed all three surveys are included in this final analysis of the data (N = 94), representing an 89.5 percent response rate.
Analysis
The dependent variables in the analyses were (1) frequency of performing a given activity and (2) level of perceived self-efficacy. The independent variable was time (pre-Institute to post-Institute to 6 months’ post-Institute). Most data were reported as scale scores. Frequency and confidence scales were derived for each domain by separately summing scholars’ responses to all the frequency questions and all the self-efficacy questions. Repeated-measures analysis of variance was used to test differences between T1, T2, and T3. When several measurements are taken on the same person over time, the measurements tend to be correlated with each other, especially when these measurements can be thought of as responses to levels of an experimental factor of interest (in this case, the Institute). This correlation is taken into account when using a repeated-measures analysis of variance. The Bonferroni multiple comparison method was used to test differences between pairs of time points and allowed us to better see the trend over time. All data were analyzed using SPSS 12.0 for Windows.
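To make the scale construction and testing procedure concrete, the sketch below re-creates the analysis in Python with pandas and statsmodels; the actual analyses were run in SPSS 12.0, so this is illustrative only. Column names such as scholar_id, time, and score are hypothetical placeholders, and the sketch assumes the balanced, complete-case data described above.

```python
# Illustrative sketch only; the original analyses were run in SPSS 12.0.
# Column names (scholar_id, time, item_1...item_k, score) are hypothetical.
from itertools import combinations

import pandas as pd
from scipy.stats import ttest_rel
from statsmodels.stats.anova import AnovaRM
from statsmodels.stats.multitest import multipletests


def domain_scale_score(df, item_cols):
    """Derive a domain scale score by summing the 1-7 item responses."""
    return df[item_cols].sum(axis=1)


def repeated_measures_anova(long_df):
    """Test change across T1, T2, and T3 within the same scholars.

    long_df is in long format: one row per scholar per time point, with
    columns scholar_id, time ('T1'/'T2'/'T3'), and score (the scale score).
    Requires complete cases, as in the analysis described above.
    """
    return AnovaRM(data=long_df, depvar="score",
                   subject="scholar_id", within=["time"]).fit()


def bonferroni_pairwise(long_df):
    """Paired t-tests between pairs of time points, Bonferroni-adjusted."""
    wide = long_df.pivot(index="scholar_id", columns="time", values="score")
    pairs = list(combinations(["T1", "T2", "T3"], 2))
    raw_p = [ttest_rel(wide[a], wide[b]).pvalue for a, b in pairs]
    _, adjusted_p, _, _ = multipletests(raw_p, method="bonferroni")
    return dict(zip(pairs, adjusted_p))
```

A repeated-measures design is appropriate here because the three scale scores from each scholar are correlated, and the Bonferroni adjustment guards against inflating the type I error rate across the three pairwise comparisons.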
Some data were not collected as scale scores (strategic planning frequency and human resources management frequency). Consequently, they are reported descriptively as frequencies in this report and are compared to what Institute developers felt was the ideal time frame in which to perform each task. For most of the activities assessed, the preferred time period in which activities should be conducted is “every six months to once a year.”
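One simple way to operationalize that comparison is to collapse each reported interval into the categories used in Tables 3 and 4. The sketch below assumes the set of response options implied by the table footnotes, not the exact wording of the survey instrument.

```python
PREFERRED = "every 6 months to once a year"  # the developers' ideal time frame


def classify_interval(reported):
    """Collapse a reported interval into the categories of Tables 3 and 4.

    Response-option wording is assumed from the table footnotes, not taken
    verbatim from the survey instrument.
    """
    if reported in ("no response", "don't know"):
        return "NR/DK"
    if reported in ("never", "once every 2 years or more"):
        return "Not enough"
    if reported == PREFERRED:
        return "Just right"
    if reported in ("once a month", "once every 3 months"):
        return "Too often"
    raise ValueError(f"unrecognized interval: {reported}")
```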
Results
Demographics
Table 1 presents the demographic characteristics, at the time of matriculation into the Institute, of the 94 scholars who completed all three measurements of scholar capacity during the years 2002–2004. Most respondents, as well as Institute participants, were female (59.6%) and from a racial/ethnic minority population (78.7%). Just over half (52.7%) had pursued educational opportunities beyond a bachelor’s degree, whereas 24.7 percent had earned a bachelor’s degree; the remaining 22.6 percent had a high school diploma or some college experience. On average, the scholars had worked just over 6 years in HIV prevention and had worked at their current CBOs for just under 5 years. However, as indicated by the wide range of responses, others had considerably more experience in both HIV prevention (up to 18 years) and within their current organizational environments (up to 27 years). All scholars were from minority-based organizations (data not shown in the table).
Table 1.
Demographic characteristic | n (%) |
---|---|
Gender | |
Male | 56 (59.6) |
Female | 37 (39.4) |
Self-identified as transgender | 1 (1.1) |
Race/ethnic background | |
African American | 43 (45.7) |
Hispanic/Latino | 18 (19.1) |
Caucasian (non-Hispanic) | 20 (21.3) |
Asian/Pacific Islander | 4 (4.3) |
Native American/American Indian | 1 (1.1) |
Other | 8 (8.6) |
Highest level of education (N = 93) | |
High school diploma | 1 (1.1) |
Some college, no degree | 17 (18.3) |
Associate’s degree | 3 (3.2) |
Bachelor’s degree | 23 (24.7) |
Some graduate study, no degree | 12 (12.9) |
Master’s degree | 29 (31.2) |
Study beyond master’s degree | 5 (5.4) |
Doctorate | 3 (3.2) |
Years worked in paid HIV prevention position | |
< 1 y/no response | 3 (3.2) |
1–5 y | 43 (45.7) |
>5, but <10 y | 27 (28.7) |
>10 y | 21 (22.3) |
Years worked at current CBO | |
< 1 y/no response | 5 (5.3) |
1–5 y | 65 (69.1) |
>5, but <10 y | 14 (14.9) |
>10 y | 10 (10.6) |
The values given are number (percentage). CBO indicates community-based organization.
HIV prevention practice
Table 2 illustrates the changes between T1, T2, and T3 in the frequency of conducting and perceived self-efficacy in conducting tasks associated with HIV prevention programming in the areas of community assessment, writing program goals and objectives, using health promotion theory, developing and implementing interventions, and conducting process, outcome, and impact evaluation. A higher scale score in frequency or confidence in performing a given task indicated that Institute participants were more likely to perform each activity when designing an HIV prevention intervention and were more confident in performing those tasks. Higher scale scores also indicated that scholars were more likely to follow evidence-based public health prevention practices in developing HIV prevention interventions.
Table 2.
Scale | Scale range | N | Mean (SD) pre-Institute | Mean (SD) post-Institute | Mean (SD) 6 months’ post-Institute | F statistic |
---|---|---|---|---|---|---|
HIV Prevention Practice | | | | | | |
Community assessment—frequency | 8–56 | 88 | 36.1 (10.6) | 44.0 (8.3) | 46.4 (8.0) | 18.78*†‡ |
Community assessment—confidence | 8–56 | 90 | 41.3 (8.8) | 49.5 (5.2) | 49.5 (6.5) | 53.48*† |
Goals and objectives—frequency | 7–49 | 89 | 37.9 (7.0) | 40.2 (7.3) | 42.2 (5.8) | 12.03*‡§ |
Goals and objectives—confidence | 7–49 | 91 | 37.2 (6.0) | 43.5 (4.5) | 43.6 (5.3) | 56.40*† |
Theory—frequency | 8–56 | 79 | 28.8 (8.1) | 32.4 (8.1) | 33.5 (8.3) | 11.67*† |
Theory—confidence | 8–56 | 79 | 31.3 (9.8) | 40.6 (7.1) | 41.0 (5.6) | 80.65*† |
Developing interventions at different levels—frequency | 4–28 | 91 | 18.5 (4.5) | 20.1 (4.3) | 21.0 (4.0) | 12.23*‖ |
Developing interventions at different levels—confidence | 4–28 | 90 | 20.1 (4.8) | 24.4 (3.0) | 24.3 (3.1) | 63.15*† |
Implementing interventions at different levels—frequency | 4–28 | 89 | 18.8 (4.3) | 20.6 (4.1) | 20.7 (4.3) | 9.84*† |
Implementing interventions at different levels—confidence | 4–28 | 88 | 20.6 (4.4) | 24.4 (2.8) | 23.8 (3.6) | 37.81*† |
Process evaluation—frequency | 5–35 | 92 | 24.7 (7.0) | 27.8 (6.2) | 28.4 (6.1) | 14.36*† |
Process evaluation—confidence | 5–35 | 90 | 25.7 (6.2) | 31.3 (3.6) | 30.9 (4.3) | 51.12*† |
Outcome and impact evaluation—frequency | 7–49 | 90 | 31.1 (8.9) | 35.8 (7.9) | 36.6 (8.2) | 25.78*† |
Outcome and impact evaluation—confidence | 7–49 | 89 | 33.1 (8.4) | 41.7 (5.6) | 42.1 (5.8) | 69.55*† |
Processes That Support HIV Prevention Practice | | | | | | |
Advocacy—frequency | 7–49 | 86 | 26.0 (9.7) | 28.5 (9.9) | 28.9 (8.5) | 4.96¶§ |
Advocacy—confidence | 7–49 | 79 | 34.7 (9.0) | 41.0 (7.3) | 39.5 (7.8) | 22.24*† |
Social marketing—frequency | 6–42 | 81 | 18.0 (10.5) | 22.9 (10.1) | 25.0 (9.9) | 21.05*†‡ |
Social marketing—confidence | 6–42 | 78 | 23.5 (10.1) | 32.7 (7.7) | 32.1 (7.2) | 47.13*† |
Strategic Planning and Human Resources Management | | | | | | |
Strategic planning—confidence | 7–49 | 80 | 33.1 (8.6) | 44.6 (4.6) | 44.4 (6.1) | 82.06*† |
Human resources management—confidence | 7–49 | 72 | 36.0 (7.5) | 41.7 (6.3) | 41.0 (7.5) | 20.98*† |
The CBO Environment | | | | | | |
CBO learning environment | 8–56 | 90 | 38.3 (8.4) | 41.0 (8.7) | 42.4 (7.3) | 10.88*§ |
CBO learning culture | 10–70 | 85 | 55.7 (8.5) | 58.4 (8.9) | 58.8 (8.8) | 7.34*§ |
*Overall trend is significant, P ≤ .001.
†Significant difference between pre and post, P ≤ .001.
‡Significant difference between post and 6 months’ post, P ≤ .05.
§Significant difference between pre and post, P ≤ .05.
‖Significant difference between pre and post, P ≤ .01.
¶Overall trend is significant, P ≤ .01.
Over the course of the Institute, and continuing 6 months beyond Institute participation, scholars significantly (P ≤ .001) increased the frequency and confidence with which they performed activities related to all domains of HIV prevention programming, including community assessment, writing program goals and objectives, using health promotion theory, developing and implementing interventions, and conducting process, outcome, and impact evaluation. For most activities, these changes initially occurred during the Institute and were sustained at both post-Institute measures. Frequency of performing community assessments and goal and objective writing continued to significantly (P ≤ .05) increase during the 6-month period immediately following Institute participation.
For most of the activities related to the planning, implementation, and evaluation of public health prevention programs, at T1, scholars reported scale scores in the upper middle part of the scale range. These translate into scholars performing each activity “sometimes” to “often” when developing a new HIV prevention intervention. Likewise, at T1, scholars’ mean confidence scores for most of these activities fell in the upper middle part of the scale range and translate to feeling “somewhat confident” in performing these activities. In terms of using public health theories and methods, at T1, scholars’ mean scale scores fell in the middle of the scale range. These mean scale scores translate to using theory only “rarely” to “sometimes” when planning an HIV prevention intervention and feeling “neither confident nor not confident” in using theory.
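A rough check of this interpretation, assuming all items in a domain are weighted equally: dividing a domain's mean scale score by its number of items recovers the average per-item response. For community assessment frequency at T1 (Table 2),

\[
\bar{r} = \frac{\text{mean scale score}}{\text{number of items}} = \frac{36.1}{8} \approx 4.5,
\]

which sits in the upper middle of the 1 to 7 response scale, consistent with performing these activities “sometimes” to “often.”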
For all activities, significant change (P ≤ .05 for all measures and P ≤ .001 for 12 of 14 frequency and confidence measures) was seen between T1 and T2 (pre- and post-Institute). By T2, scholars’ mean scale scores had increased, and these increases translate into scholars performing each activity (with the exception of using theory) closer to “often” when developing an HIV prevention intervention and feeling “mostly” confident in doing so. Scholars’ mean scale scores for using public health theory increased as well, moving toward using theory “sometimes” and feeling “somewhat confident” in doing so.
Strategic planning and management
Table 3 details how frequently strategic planning activities were conducted by Institute scholars at T1, T2, and T3. Between T1 and T3, there was considerable movement in how often strategic planning activities were conducted. For the seven activities assessed, the preferred time period in which activities should be conducted is “every six months to once a year.” Across all activities, more scholars reported performing the activity at the preferred time period at T2 than at T1, and for six of the seven activities these results were sustained at T3.
Table 3.
How often do you: | Pre-Institute % | | | | Post-Institute % | | | | Six months’ post-Institute % | | | |
---|---|---|---|---|---|---|---|---|---|---|---|---
 | NR/DK | Not enough | Just right | Too often | NR/DK | Not enough | Just right | Too often | NR/DK | Not enough | Just right | Too often |
Assess or review your mission statement to determine if you are doing what it says you should do | 14.9 | 27.6 | 40.4 | 17.0 | 0 | 36.2 | 59.6 | 4.3 | 2.1 | 33.0 | 50.0 | 14.9 |
Conduct environmental scan of threats and opportunities | 16.0 | 41.5 | 24.4 | 18.1 | 3.2 | 33.0 | 58.6 | 5.3 | 2.1 | 30.8 | 53.2 | 13.8 |
Conduct a situational analysis to assess internal strengths and weaknesses | 14.9 | 37.2 | 35.1 | 12.8 | 1.1 | 29.8 | 59.6 | 9.6 | 2.1 | 23.3 | 61.7 | 12.8 |
Involve key stakeholders from outside your agency in the strategic planning process | 16.0 | 43.6 | 33.0 | 7.4 | 1.1 | 38.3 | 47.8 | 12.8 | 5.3 | 37.2 | 39.3 | 18.1 |
Identify and prioritize strategic issues | 17.0 | 14.9 | 39.3 | 28.7 | 2.1 | 21.3 | 58.6 | 18.1 | 3.2 | 17.0 | 56.4 | 23.3 |
Develop or revise action steps to address strategic issues | 17.0 | 19.2 | 40.4 | 23.4 | 2.1 | 24.4 | 56.4 | 17.0 | 4.3 | 14.9 | 56.4 | 24.4 |
Assess progress toward action steps and revise plan appropriately | 17.0 | 21.3 | 28.7 | 33.0 | 3.2 | 24.4 | 47.8 | 24.4 | 3.2 | 12.8 | 52.1 | 31.9 |
NR/DK indicates no response or don’t know response; Not enough, activity never occurred, or occurs once every 2 years or more; Just right, activity occurs every 6 months to once a year; and Too often, activity occurs every month to 3 months.
At T1, 40.4 percent of scholars reported reviewing their mission statement “every six months to once a year.” By T2, 59.6 percent of the scholars reported conducting this activity during the preferred time period. At the start of the Institute, 41.5 percent of scholars “never” or “every two years or more” conducted environmental scans to assess external threats and opportunities. Another 15 percent did not respond to the question. Post-Institute, 58.6 percent of the scholars reported conducting environmental scans “every six months to once a year.” Similarly, 37.2 percent of scholars at T1 conducted a situational analysis to identify internal strengths and weaknesses at the most appropriate time interval. However, at T2, 59.6 percent of scholars reported conducting a situational analysis at the most appropriate time interval.
Before participating in the Institute, 43.6 percent of scholars “never” or only “every two years or more” involved key stakeholders in their strategic planning processes. By T2, 47.8 percent reported involving their key stakeholders on a more regular basis. Scholars’ self-efficacy in performing these strategic planning and management functions increased over the measurement period, with most of the change occurring from pre- to post-Institute. Scholars moved from being less than “somewhat confident” pre-Institute to more than “mostly confident” at both post-Institute and 6 months’ post-Institute (data shown in Table 2).
Table 4 details how frequently human resources management activities were conducted by Institute scholars at T1, T2, and T3. For the seven activities assessed, the preferred time period in which activities should be conducted is “every six months to once a year.” For four of the activities, most scholars reported performing the activity at the preferred time period at T2 and, when compared with T1, had changed the interval at which these activities were performed to better reflect what was taught at the Institute. More scholars were also conducting formal job analyses, formally assessing employee training needs, evaluating employee performance, and reviewing or implementing employee performance reward or incentive programs at the recommended intervals. Scholars’ self-efficacy in performing human resources management activities increased over the measurement period, with most of the change occurring between pre- and post-Institute. Scholars moved from being “somewhat confident” pre-Institute to “mostly confident” post-Institute and 6 months’ post-Institute (data shown in Table 2).
Table 4.
How often do you: | Pre-Institute % | | | | Post-Institute % | | | | Six months’ post-Institute % | | | |
---|---|---|---|---|---|---|---|---|---|---|---|---
 | NR/DK | Not enough | Just right | Too often | NR/DK | Not enough | Just right | Too often | NR/DK | Not enough | Just right | Too often |
Review and assess legal statutes to assure compliance with discrimination in employment laws | 9.6 | 46.8 | 39.4 | 4.3 | 1.1 | 44.7 | 44.7 | 9.6 | 5.3 | 40.4 | 45.7 | 8.5 |
Conduct a formal job analysis for appropriateness of position descriptions and skills required | 9.6 | 39.4 | 49.0 | 2.1 | 1.1 | 43.7 | 52.1 | 3.2 | 4.3 | 31.9 | 58.5 | 5.3 |
Develop or review applicant screening processes and procedures | 13.8 | 43.7 | 33.0 | 9.6 | 2.1 | 51.1 | 41.5 | 5.3 | 5.3 | 37.2 | 50.0 | 7.4 |
Formally assess employee training needs | 8.5 | 23.3 | 44.7 | 23.3 | 2.1 | 14.9 | 61.7 | 21.2 | 5.3 | 14.9 | 51.0 | 28.7 |
Provide or approve employee training | 9.6 | 9.6 | 27.6 | 53.2 | 2.1 | 8.5 | 41.5 | 47.9 | 3.2 | 8.5 | 33.0 | 55.3 |
Evaluate employee performance | 7.4 | 9.6 | 67.1 | 16.0 | 1.1 | 13.8 | 72.3 | 12.8 | 3.2 | 13.8 | 64.9 | 18.1 |
Review and implement employee performance reward or incentive programs | 9.6 | 36.2 | 59.6 | 4.3 | 3.2 | 26.6 | 65.9 | 4.3 | 5.3 | 27.6 | 54.3 | 12.8 |
NR/DK indicates no response or don’t know response; Not enough, activity never occurred, or occurs once every 2 years or more; Just right, activity occurs every 6 months to once a year; and Too often, activity occurs every month to 3 months.
Processes that support HIV prevention practice
Table 2 also illustrates the changes between T1, T2, and T3 in the frequency of and perceived self-efficacy in using advocacy and social marketing techniques as a way to enhance HIV prevention programming efforts and the management of those efforts. As with the HIV prevention programming activities, a higher scale score in frequency and/or confidence in performing a given task may indicate that Institute participants were more likely to perform each activity on a regular basis. Over the course of the Institute, and continuing 6 months beyond Institute participation, scholars significantly (P ≤ .01) increased the frequency and confidence with which they performed activities related to both advocacy and social marketing. The frequency with which scholars performed social marketing activities also significantly (P ≤ .05) increased in the 6 months following their Institute participation.
The CBO environment
Scholars also reported changes in their CBO learning environment and culture over the course of the Institute. The frequency with which scholars and their CBOs’ HIV prevention staff performed activities related to improving the CBO learning environment (such as attending trainings, sharing knowledge and skills learned at trainings, seeking advice from experts, and assessing and providing feedback on HIV prevention activities and policies) significantly (P ≤ .001) increased from T1 to T2 and T3. In addition, their level of agreement that their CBO presented a positive learning culture significantly increased over this time period. The overall trend for the entire T1 to T3 time period was significant (P ≤ .001), with most changes occurring during the course of the Institute (T1–T2).
Discussion
Results of this study indicate that the Institute had a positive and significant effect on the capacity of participants to conduct more effective HIV prevention programs and to manage and sustain those programs at the program level. Moreover, Institute participants were more confident in their ability to perform activities related to planning, implementing, and evaluating HIV prevention programs, managing both program and human resources operations, and conducting advocacy and social marketing processes. Scholars also reported positive changes in their CBO learning environment and learning culture.
That these changes occurred during the 9 months of Institute participation and were sustained (and in some cases improved upon) 6 months afterwards is impressive. Many capacity-building programs fail to evaluate retention of knowledge and skills among participants after they leave the program. The evaluation model of the Institute is unique in that it obtained 6-month post-Institute measures. The Institute maintains an electronic scholar resource room on the Internet as well as a scholar/alumni list-serve and conducts alumni seminars. These systems of support may help to maintain and reinforce improved capacity.
Although the data presented herein are not representative of all persons working in HIV prevention, they suggest that more widespread capacity-building programs for persons working on the planning, implementation, evaluation, and management of HIV prevention programs would be helpful. A limitation of these data is that the response rate varies among scales, ranging from 89 percent to 98 percent. For example, all questions on the theory frequency and confidence scales were answered by 79 of the 94 scholars (89%). It is unclear why persons chose not to answer these questions. However, Institute process evaluation data indicate that faculty covered the stated learning objectives, that the level and intensity of instruction were appropriate, and that the material covered during Institute sessions was learned by scholars.12
Current research findings suggest that self-efficacy16 is one of the key antecedents of learning transfer within organizations.17–20 High mean scores of self-efficacy provide evidence that scholars were able to transfer the skills learned at the Institute to their jobs in their CBOs. On a national level, these capacity increases may translate into more efficient and effective HIV prevention. Scholars are following evidence-based practices in public health to a greater degree and are doing more to secure funding, as well as to advocate for their programs and CBOs. Future research should investigate transfer of learning to others in the CBO, as this transfer would also contribute to enhancing the capacity of the CBO. At the organizational (CBO) level, these changes have translated to a more positive learning culture that is supportive of active learning processes.
Footnotes
The findings and conclusions in this article are those of the authors and do not necessarily represent the views of the Centers for Disease Control and Prevention.
Contributor Information
Donna L. Richter, Arnold School of Public Health, University of South Carolina, Columbia.
Kim Nichols Dauner, Arnold School of Public Health, University of South Carolina, Columbia.
Lisa L. Lindley, Arnold School of Public Health, University of South Carolina, Columbia.
Belinda M. Reininger, Behavioral Health Sciences, Houston School of Public Health, Regional Campus at Brownsville, University of Texas.
Willie H. Oglesby, Arnold School of Public Health, University of South Carolina, Columbia.
Mary S. Prince, Health Promotion Works, Pawley’s Island, South Carolina.
Melva Thompson-Robinson, School of Public Health, University of Nevada, Las Vegas.
Rhondette Jones, Division of HIV/AIDS Prevention, Centers for Disease Control and Prevention, Atlanta, Georgia.
Linda H. Potts, Health Consulting Group, Inc., Atlanta, Georgia.
References
- 1. Glynn M, Rhodes P. Estimated HIV prevalence in the United States at the end of 2003. National HIV Prevention Conference; June 2005; Atlanta. Abstract 595. Available at: http://www.cdc.gov/hiv/stats.htm. Accessed June 30, 2006.
- 2. Centers for Disease Control and Prevention. Cases of HIV infection and AIDS in the United States, 2003. Available at: http://www.cdc.gov/hiv/stats/2003SurveillanceReport.pdf. Accessed April 5, 2005.
- 3. Centers for Disease Control and Prevention. The State of Latinos in HIV Prevention Community Planning. Atlanta: Centers for Disease Control and Prevention; 2002.
- 4. Fisher EB, Auslander W, Sussman L, Owens N, Jackson-Thompson J. Community organization and health promotion in minority neighborhoods. In: Becker DM, Hill DR, Jackson JS, Levine DM, Stillman FA, Weiss SM, editors. Health Behavior Research in Minority Populations. Washington, DC: US Department of Health and Human Services; 1992.
- 5. Hanchette CL, Gibbs DA, Gilliam A, Fogarty KJ, Bruhn M. A national, geographic database of CDC-funded HIV prevention services: development challenges and potential applications. Int J Health Geogr. 2005;4:28. doi:10.1186/1476-072X-4-28.
- 6. Valdiserri RO, West GR, Moore M, Darrow WW, Hinman AR. Structuring HIV prevention delivery systems on the basis of social science theory. J Community Health. 1992;17:259–270. doi:10.1007/BF01324356.
- 7. Kelly JA, Murphy DA, Sikkema KJ, Kalichman SC. Psychological interventions to prevent HIV infection are urgently needed: new priorities for behavioral research in the second decade of AIDS. Am Psychol. 1993;48:1023–1034. doi:10.1037//0003-066x.48.10.1023.
- 8. Holtgrave DR, Qualls NA, Curran JW, Valdiserri RO, Guinan ME, Parra WC. An overview of the effectiveness and efficiency of HIV prevention programs. Public Health Rep. 1995;110:134–146.
- 9. Centers for Disease Control and Prevention. Evolution of HIV/AIDS prevention programs—United States, 1981–2006. MMWR Morb Mortal Wkly Rep. 2006;55:597–603.
- 10. Ramos RL, Ferreira-Pinto JB. A model for capacity-building in AIDS prevention programs. AIDS Educ Prev. 2002;14:196–206. doi:10.1521/aeap.14.3.196.23891.
- 11. Richter DL, Prince MS, Potts LH, et al. Assessing the HIV prevention capacity building needs of community-based organizations. J Public Health Manag Pract. 2000;6:86–97. doi:10.1097/00124784-200006040-00015.
- 12. Richter DL, Potts LH, Prince MS, et al. Development of a curriculum to enhance community-based organizations’ capacity for effective HIV prevention programming and management. AIDS Educ Prev. 2006;18:365–377. doi:10.1521/aeap.2006.18.4.362.
- 13. Rouiller J, Goldstein I. The relationship between organizational transfer climate and positive transfer of training. Hum Resour Dev Q. 1993;4:377–391.
- 14. Tracey J, Tannenbaum S, Kavanagh M. Applying trained skills on the job: the importance of the work environment. J Appl Psychol. 1995;80:239–252.
- 15. Kotellos KA, Amon JJ, Githens Benazerga WM. Field experiences: measuring capacity building efforts in HIV/AIDS prevention programmes. AIDS. 1998;12:S109–S117.
- 16. Bandura A. Social Foundations of Thought and Action: A Social Cognitive Theory. Englewood Cliffs, NJ: Prentice-Hall; 1986.
- 17. Axtell C, Maitlis S. Predicting immediate and longer-term transfer of training. Pers Rev. 1997;26:201–213.
- 18. Baldwin T, Ford J. Transfer of training: a review and directions for future research. Pers Psychol. 1988;41:63–105.
- 19. Cheng E, Ho D. The effects of some attitudinal and organizational factors on transfer outcome. J Manag Psychol. 1998;13:309–317.
- 20. Noe R, Schmitt N. The influences of trainee attitudes on training effectiveness: test of a model. Pers Psychol. 1986;39:497–523.