Author manuscript; available in PMC: 2013 Sep 25.
Published in final edited form as: J Community Health. 2008 Oct;33(5):285–292. doi: 10.1007/s10900-008-9102-z

Community-based Organizations’ Capacity to Plan, Implement, and Evaluate Success

Robert M Mayberry 1, Pamela Daniels 2, Tabia Henry Akintobi 3, Elleen M Yancey 4, Jamillah Berry 5, Nicole Clark 6
PMCID: PMC3782989  NIHMSID: NIHMS492367  PMID: 18500451

Abstract

Community-based organizations (CBOs) have the potential to promote and sustain health, prevent disease, and address health disparities, but many lack the capacity to do so. An assessment of the 20 CBOs receiving supplemental grant funding from the Pfizer Foundation Southern HIV/AIDS Prevention Initiative indicated a high level of knowledge for developing goals and objectives (mean score = 3.08 on a scale of 0 (none) to 4 (extensive)) and high self-assessed abilities to conduct six of 20 specific intervention activities, including the development of community relationships and coalitions. Lower knowledge and skill levels were observed for intervention evaluation. While CBOs of this Initiative have established prerequisite abilities, they have self-acknowledged needs for technical assistance to maximize HIV/AIDS prevention capacity.

Keywords: Primary prevention, Community health, HIV/AIDS, Planning, Evaluation

Introduction

Despite the potential of community-based organizations (CBOs) to promote and sustain health, prevent disease, and address health disparities, many CBOs lack the capacity to plan, implement, and evaluate their efforts, limiting the degree to which the effectiveness of interventions can be measured [1]. Increased accountability by funding organizations and an emphasis on evidence-based interventions have heightened the importance of CBOs’ intervention capacity [2–4]. Systematic data reporting, for example, has become critical to continued program funding. Furthermore, promising prevention programs are increasingly dependent upon a seamless loop of fiscal support and an organization’s ability to provide evidence of programmatic success and articulate lessons learned.

Community-based interventions are a core component of the national Centers for Disease Control and Prevention’s (CDC’s) HIV/AIDS Prevention Initiatives, which focus on strengthening capacities to monitor the HIV/AIDS epidemic and to plan, implement, and evaluate its programs [3]. The CDC Initiatives are timely since, aware of the unique factors that shape HIV/AIDS within their communities, CBOs have positioned themselves as an essential component of national, regional, and local efforts to combat the HIV/AIDS epidemic and serve as catalysts for prevention and health promotion activities [5–7]. Beyond the CDC’s Initiatives, few efforts have been made to assist CBOs in improving their capacity to implement effective and sustainable HIV/AIDS prevention programs [1, 8, 9].

The Southern HIV/AIDS Prevention Initiative, funded by the Pfizer Foundation, is unique in utilizing a private-academic partnership to simultaneously conduct program evaluation and address the capacity needs of CBOs to plan and implement HIV/AIDS prevention programs. The Initiative initially funded, for the 3-year period from 2004 to 2006, 24 previously established CBOs to provide HIV/AIDS education and prevention programs in multicultural, rural, and urban communities throughout nine states of the southern region of the United States (Alabama, Florida, Georgia, Louisiana, Mississippi, North Carolina, South Carolina, Tennessee, and Texas). The Morehouse School of Medicine Prevention Research Center (MSM PRC) conducted a cross-site program assessment survey (C-PAS) of the Initiative CBOs to determine their knowledge, skills, and abilities related to community intervention development and their technical needs critical to success. The purpose of the survey was to determine the potential of the existing CBOs to impact the HIV/AIDS epidemic in local communities, as well as collectively for the region. The C-PAS was part of the overall evaluation plan to assess the capacity of the CBOs to successfully plan, implement, and evaluate HIV/AIDS prevention efforts.

Methods

The self-administered C-PAS questionnaire was sent by US Postal Service and electronic mail to the executive director and a primary program staff member (e.g., health educator or peer educator) of each of the CBOs in the Southern HIV/AIDS Prevention Initiative (Table 1) to assess organizational and individual knowledge, skills, abilities, and technical needs to develop and evaluate community interventions. The C-PAS questionnaire, designed to be completed in 15–20 min, specifically captured the organization’s and key personnel’s self-perceived knowledge and skills relating to key steps in community program development and specific abilities to develop community relationships, educate and train program staff and participants, collect appropriate data, and conduct specific program activities, including HIV/AIDS counseling, testing, and referrals.

Table 1.

Community-based organizations of the Pfizer Foundation Southern HIV/AIDS Prevention Initiative

Organization City, State
Aid to Inmate Mothers (AIM) Montgomery, AL
Mobile AIDS Support Services Mobile, AL
Jacksonville Area Sexual Minority Youth Network (JASMYN) Jacksonville, FL
Mother’s Voices South Florida Miami, FL
Rural Women’s Health Project, Inc. Gainesville, FL
Sembrando Flores Homestead, FL
AIDS Resource Council Rome, GA
Health Outreach Project, Inc. Atlanta, GA
Southeast Georgia Communities Project. Lyons, GA
Union Mission, Inc. Savannah, GA
Baton Rouge AIDS Society (BRASS) Baton Rouge, LA
Family Services of Greater Baton Rouge Baton Rouge, LA
Southwest Louisiana AIDS Council Lake Charles, LA
HIV Services, Inc. Vicksburg, MS
South Mississippi AIDS Task Force, Inc. Biloxi, MS
AIDS Care Services, Inc. Winston-Salem, NC
Chatham Social Health Council Pittsboro, NC
Community-Based Learning Alternatives Center, Inc. (CBLAC) Smithfield, NC
Hope Health Florence, SC
Palmetto AIDS Life Support Services Columbia, SC
Methodist Healthcare Foundation—Community HIV Network Chattanooga, TN
International AIDS Empowerment, Inc. El Paso, TX
Mujeres Unidas Contra el SIDA, Inc. San Antonio, TX

The key program staff members were expected to be familiar with some of the issues assessed in C-PAS based on previous experience and the Initiative assessment and training activities conducted in 2004 (year 01). An introduction to the evaluation process and logic model conceptualization occurred during the first annual meeting of the Initiative CBOs prior to the C-PAS. Prior to administration of the survey questionnaire, the MSM PRC evaluation staff also conducted semi-structured teleconference interviews to gain a clearer understanding of initial program implementation, evaluation plans, and the context within which each program functioned. Teleconference interviews were developed to: (1) gain insight into how programs may have evolved in scope, goals and objectives; (2) identify how each CBO defined program success; (3) assess intervention data needs, current data collection methods and procedures; (4) determine how collected data would be used to assess program success; and (5) determine technical assistance needs. Teleconferences also served to develop more structured questionnaire items for C-PAS.

The survey was conducted during the first quarter of the second year of the Initiative, with the initial mailing of the survey questionnaire occurring in February 2005 (year 02). CBOs that did not return the questionnaire were sent two additional US mail and electronic reminders, approximately 2 weeks apart. In addition, two follow-up phone calls were made as reminders to complete and return the questionnaire. Initial analyses of survey responses indicated no statistically significant differences between executive directors and primary program staff members, so the two key program staff groups were combined in subsequent analyses. Descriptive statistics were used to evaluate each survey response variable [10]. The relationships between knowledge and skill levels and year 01 progress in completing planned strategies and meeting stated objectives were evaluated, with these parameters of success expressed as dichotomized yes/no variables [10]. Continuous variables were expressed as means and standard deviations; categorical variables were expressed as proportions.
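The analyses just described amount to item-level descriptive statistics plus simple two-group comparisons against the dichotomized success indicators. The sketch below illustrates that workflow; the file name, the column names (e.g., "role", "knowledge_goals", "met_year01_objectives"), and the use of a two-sample t-test are illustrative assumptions rather than details reported in the paper.

    # Minimal sketch of the C-PAS analysis workflow, under the assumptions noted above.
    import pandas as pd
    from scipy import stats

    responses = pd.read_csv("cpas_responses.csv")  # one row per respondent

    # Descriptive statistics: means/SDs for 0-4 Likert items, proportions for yes/no items.
    print(responses[["knowledge_goals", "skill_goals"]].agg(["mean", "std"]))
    print(responses["met_year01_objectives"].value_counts(normalize=True))

    # Compare executive directors with primary program staff before pooling the two groups.
    directors = responses.loc[responses["role"] == "executive_director", "knowledge_goals"]
    staff = responses.loc[responses["role"] == "program_staff", "knowledge_goals"]
    t, p = stats.ttest_ind(directors, staff, equal_var=False)
    print(f"Directors vs. staff knowledge: t = {t:.2f}, P = {p:.3f}")

    # Relate knowledge scores to the dichotomized year 01 success indicator.
    met = responses.loc[responses["met_year01_objectives"] == "yes", "knowledge_goals"]
    not_met = responses.loc[responses["met_year01_objectives"] == "no", "knowledge_goals"]
    t, p = stats.ttest_ind(met, not_met, equal_var=False)
    print(f"Met vs. not met year 01 objectives: t = {t:.2f}, P = {p:.3f}")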

Results

The 23 organizations participating in the 3-year Pfizer Foundation Initiative are listed in Table 1. Thirty-nine of the 46 C-PAS questionnaires mailed were completed and returned (84.8% response rate), representing the 20 small and mid-size organizations (those with annual budgets of less than $1,000,000) in this survey.

Table 2 gives detailed information, as provided by key program staff, on the administrative structure of the programs, community partnerships, population groups served through the Initiative, race/ethnicity and gender of populations served, and types of interventions and services provided to the targeted populations. Nearly ninety percent (89.7%) of the organizations indicated a strong community base of volunteers. Collaborations and partnerships were established by the Initiative CBOs with other community-based organizations (84.6%), local clinics and health facilities (69.2%), colleges and universities (56.4%), and local churches (53.8%). The majority of the organizations focused on a specific racial/ethnic minority group solely or in conjunction with services to other ethnic groups (African American/Black (64.1%), Hispanic/Latino (43.6%), Haitian (5.1%), and Asian/Pacific Islander (2.6%)); 41% of organizations focused on all racial and ethnic groups in their respective local communities.

Table 2.

Characteristics of CBOs of the Pfizer Foundation Southern HIV/AIDS Prevention Initiative

Organizational characteristic (per question posed) Yes responses, % (n) (N = 39)
1. Do you have community volunteers? 89.7% (35)
2. Does your project have an established advisory committee? 59.0% (23)
3. Is your major HIV/AIDS intervention solely funded by the Pfizer Foundation? 15.4% (6)
4. Is Pfizer Foundation support supplementing an existing program? 76.9% (30)
5. Have you identified all appropriate resources needed to implement proposed activities? 61.5% (24)
6. Have you completed all planned strategies/activities for year 1? 74.4% (29)
7. Have you identified specific products that will come from your program? 76.9% (30)
8. Have you specified all outcomes anticipated? 87.2% (34)
9. Are you meeting the stated objectives listed in your logic model and reapplication? 87.2% (34)
10. Were you familiar with logic model development before this project? 51.3% (20)
11. Has the logic model development process helped you in developing measurable goals and objectives? 84.6% (33)
12. Do you have a written protocol (plan) for data collection? 51.3% (20)
13. Do you currently have data collection tools? 89.7% (35)
14. Is data stored in a computer? 66.7% (26)
Population served
Incarcerated individuals (prisons, juvenile justice detention center) 28.2% (11)
Staff (such as prisons and school) 25.6% (10)
Church/Religious organizations 23.1% (9)
Youth community organizations 41.0% (16)
Middle schools 38.5% (15)
Gay/Lesbian/Bi-Sexual/Transsexual 20.5% (8)
High schools 41.0% (16)
Adult community organizations 23.1% (9)
Peer educators 69.2% (27)
Substance Abusers 38.5% (15)
Race/Ethnicity
African American/Black 64.1% (25)
Hispanic/Latino 43.6% (17)
Caucasian/White 28.2% (11)
Haitian 5.1% (2)
Asian/Pacific Islander 2.6% (1)
All Racial/Ethnic Groups 41.0% (16)
Gender
Female 17.9% (7)
Male 2.6% (1)
Both (male and female) 79.5% (31)
Interventions/Services
Partner Counseling/Referral Services 28.2% (11)
Outreach 66.7% (26)
Health communication/public information 74.4% (29)
Group-level interventions 71.8% (28)
Counseling and testing 53.8% (21)
Individual-level interventions 53.8% (21)
Prevention case management 30.8% (12)
Other interventions (including community-level) 35.9% (14)
Community Partnerships
Churches 53.8% (21)
Community-based organizations 84.6% (33)
Colleges and universities 56.4% (22)
Local clinics, hospitals/health departments 69.2% (27)
No community partnerships exist 2.6% (1)

Peer educators (69.2%), youth community organizations (41.0%), and high schools (41.0%) were among the populations served by the CBOs through Pfizer Foundation support, as were substance abusers (38.5%), middle schools (38.5%), and incarcerated individuals (28.2%). The more common types of interventions or services provided by the Initiative CBOs were health communication and public information (74.4%), group-level interventions (71.8%), outreach (66.7%), and counseling and testing (53.8%). Prevention case management and partner counseling/referral services were less commonly provided.

The C-PAS contained very specific questions regarding the year 01 timeline of inputs, activities, outputs, outcomes, and objectives of intervention development, as well as infrastructure issues, as a segue to questions assessing the CBOs’ knowledge, skills, and abilities related to program development, implementation, and evaluation. The majority of participating CBOs indicated having completed all planned activities for year 01 (74.4%), identified specific products to come from their HIV/AIDS program (76.9%), and specified all anticipated program outcomes (87.2%). Most (87.2%) also indicated that they were meeting stated objectives as presented in their respective logic models.

The MSM PRC introduced the logic model for planning and evaluation purposes at the Initiative’s first workshop, emphasized logic model development in year 01, conducted semi-structured technical assistance teleconference interviews, and required submission of program-specific logic models with year 02 continuation grant applications. It was therefore rewarding to note that 84.6% of key program staff found the logic model development process helpful in developing measurable goals and objectives. With regard to infrastructure or technical resources and expertise, 89.7% of key staff indicated having data collection tools. However, only two-thirds (66.7%) of the CBOs indicated that collected data were stored electronically, and only about half (51.3%) had a written protocol or plan for data collection.

Key program staff self-assessments of individual knowledge and skill levels to plan, implement, and evaluate community-based programs are presented in Table 3. The highest mean knowledge score was for the development of goals and objectives (3.08), as measured on a scale of 0 (none) to 4 (extensive). The mean knowledge score for problem identification was 2.92, the second highest self-assessed knowledge level. Mean skill scores observed for problem identification (2.97) and prevention intervention development and implementation (2.87) indicated that program staff felt they had “a lot” of skill in conducting these two activities. Evaluation of community interventions had the lowest mean knowledge and skill scores (2.41 and 2.36, respectively), indicating only modest, i.e., “some,” knowledge of and skill in conducting evaluation.

Table 3.

Program staff individual knowledge and skill levels regarding community program development

Activity Knowledge Mean^a (Standard Deviation) Skill Mean^a (Standard Deviation)
1. Problem identification 2.92 (0.84) 2.97 (0.96)
2. Needs assessment of local community 2.72 (1.03) 2.44 (1.10)
3. Development of goals and objectives 3.08 (0.93) 2.92 (0.93)
4. Gathering program input from and providing feedback to community participants 2.85 (1.01) 2.72 (0.94)
5. Prevention intervention development and implementation 2.87 (1.06) 2.97 (0.99)
6. Evaluation of community intervention 2.41 (0.97) 2.36 (0.93)
^a On a scale of 0–4; 0 = none, 1 = little, 2 = some, 3 = a lot, 4 = extensive

Organizations’ abilities to plan and implement specific community intervention activities ranged from a modest mean score of 3.36 (analyze collected data) to a notably high mean score of 4.67 (provide HIV/AIDS referrals) on a scale of 1 (low) to 5 (high) (Table 4). Relatively high self-assessed abilities were observed for six of the 20 activities addressed in C-PAS: development of community relationships and coalitions (4.46), identification of appropriate resources to plan and complete community intervention programs (4.08), conducting interviews (4.08), training peer educators (4.26), and providing HIV/AIDS counseling (4.36) and referrals (4.67). Modest mean scores were observed for the CBOs’ abilities to analyze collected data (3.36), develop intervention-specific logic models (3.49), and develop data collection tools (3.51).

Table 4.

Survey responses of program staff regarding the community-based organization’s ability to plan and implement community interventions

Activity Mean^a (Standard Deviation)
1. Develop community relationships and coalition 4.46 (0.68)
2. Identify appropriate resources (financial, volunteer, etc.) to plan and carry out community intervention programs 4.08 (0.96)
3. Recruit community volunteers to plan and participate in interventions 3.69 (0.80)
4. Develop intervention-specific logic models 3.49 (1.07)
5. Assess the knowledge, behavior, and attitudes of program participants 3.92 (0.93)
6. Develop data collection tools 3.51 (0.91)
7. Conduct interviews 4.08 (0.74)
8. Conduct focus groups 3.97 (0.93)
9. Collect data 3.87 (0.86)
10. Enter collected data into the computer 3.85 (0.93)
11. Analyze collected data 3.36 (1.04)
12. Train peer educators 4.26 (0.97)
13. Train peer mentors 3.74 (1.25)
14. Train public speakers 3.49 (1.07)
15. Develop educational brochures, pamphlets 3.64 (1.09)
16. Provide HIV/AIDS counseling 4.36 (1.01)
17. Provide HIV/AIDS testing 3.77 (1.77)
18. Provide HIV referrals 4.67 (0.90)
19. Develop public service announcements 3.51 (1.34)
20. Develop newsletters 3.87 (1.06)
^a On a scale of 1 (low) to 5 (high)

When asked to identify technical assistance needs, key program staff responses were consistent with some of the lower ability scores presented above (Table 5). The vast majority of the CBOs (84.6%) requested technical assistance in evaluation development. More than 60% of CBOs indicated the need for technical assistance in developing data collection tools (61.5%) and in using qualitative and quantitative data methods to measure expected outcomes (64.1%).

Table 5.

Survey responses of program staff regarding the technical assistance needs of community-based organizations

Technical assistance needs Yes responses (n = 39) (%)
Logic model development 48.7
Data management 48.7
Data collection tool development 61.5
Protocol development 53.8
Qualitative/quantitative methods 64.1
Evaluation development 84.6
Website development 2.6
Technical expertise on SPSS 2.6

Higher mean scores for self-assessed knowledge and skills were generally observed among CBOs that reported having completed all planned activities, identified specific products of the intervention, and met stated objectives for year 01. Despite this general pattern, only knowledge and skills in developing goals and objectives were statistically significantly related to having met year 01 objectives [mean score 3.20 (yes) versus 4.00 (no) for knowledge, P < 0.001; mean score 3.03 (yes) versus 4.00 (no) for skills, P < 0.001].

Discussion

The Initiative CBOs’ key program staff rated themselves as knowledgeable and skillful in core areas of community interventions such as development of goals and objectives, problem identification, and prevention intervention development and implementation. However, key staff were less knowledgeable about and skillful in the evaluation of community interventions and rated their individual competence as only modest in this area. By contrast, key staff gave high ratings to their organizations’ abilities to develop community relationships and coalitions, identify appropriate resources for community intervention programs, conduct interviews, train peer educators, and provide HIV/AIDS counseling and referrals. Identified technical assistance needs were consistent with the relatively low self-rated knowledge and ability scores in evaluation, in using qualitative and quantitative data methods, and in developing data collection tools. Self-rated organizational abilities were also suggestive of having met stated goals.

Modest knowledge and skills to conduct evaluation and the need for substantial technical assistance in conducting program evaluations are not surprising findings for these small and mid-size CBOs. Likewise, technical expertise in methods of outcomes measurement; data collection, management, and analyses; and logic model development may be outside of the expected capacity of these organizations. Some of the CBOs have recognized this capacity deficit and addressed it through partnering with academic faculty, usually as paid consultants.

Many CBOs do not consistently practice program evaluation and, by not doing so, limit the degree to which the effectiveness of interventions can be measured [1, 11]. Perhaps evaluation is considered the domain of academicians or other evaluation professionals. Evaluation capacity centers on ownership of evaluation skills through recognition of the utility of evaluation and an organizational culture that incorporates evaluation into all program design and implementation efforts [12–19]. While evaluation capacity serves to strengthen a program’s ability to consistently measure success, make programmatic improvements, and provide funding sources with evidence of good fiscal stewardship, it requires buy-in at the local level. Limited funding, staff capacity, confidence in evaluation findings, and data collection and analysis skills, as well as unmet technical assistance needs and poor access to technology, are barriers to evaluation practice [20].

The MSM PRC approached logic model development with the Initiative CBOs as a critical conceptual and visual depiction of the program inputs, strategies, outputs, outcomes, and possible impact, the assumed causal relationships among them, and the relationship between all of these components and well-articulated, measurable objectives. While half of the CBOs were familiar with logic model development before this Initiative, all became intimately acquainted with the concept and process for planning and evaluation purposes through participation. Our interactions with key staff via group and one-on-one discussions suggested that the CBOs’ executive directors were more familiar with logic models than other key staff, although our survey results did not specifically indicate so. Anecdotally, many of the CBOs gained substantial confidence in articulating concepts and presenting well-developed logic models as a means of listing detailed program strategies/activities, linking intervention objectives, providing a detailed guide for seasoned staff as well as newcomers, and sharing their experiences with other local community organizations.
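To make the logic model's structure concrete, the sketch below represents a hypothetical program's logic model as a simple data structure in which each component feeds the next and every outcome is tied to a measurable objective; all names and entries are illustrative and are not drawn from any Initiative CBO's actual plan.

    # Hypothetical logic model expressed as a plain data structure.
    # Every entry is illustrative; none is taken from an actual Initiative CBO.
    logic_model = {
        "inputs": ["grant funding", "peer educators", "community partners"],
        "strategies": ["group-level HIV/AIDS education sessions", "community outreach events"],
        "outputs": ["number of sessions delivered", "number of participants reached"],
        "outcomes": ["increased HIV/AIDS knowledge", "increased referrals to testing"],
        "impact": ["reduced HIV transmission in the target community"],
        # Measurable objectives linked to specific outcomes.
        "objectives": {
            "increased HIV/AIDS knowledge": "20% gain on pre/post knowledge assessment",
            "increased referrals to testing": "200 referrals to HIV testing in year 02",
        },
    }

    # Simple consistency check: every objective should map to a stated outcome.
    for outcome in logic_model["objectives"]:
        assert outcome in logic_model["outcomes"], f"unlinked objective: {outcome}"
    print("All objectives are linked to stated outcomes.")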

Our assessment of the Initiative CBOs parallels observations of other major community-based initiatives, most notably the CDC-funded Racial and Ethnic Approaches to Community Health (REACH 2010). Observations of REACH indicated great potential among the 42 CBOs around the country in the earlier stages of that initiative in such areas as understanding the context, causes, and solutions of health disparities in six targeted health conditions (cardiovascular disease, diabetes mellitus, HIV/AIDS, infant mortality, breast and cervical cancer, and immunization); building coalitions; planning and capacity building; and developing community action plans [21, 22]. These early efforts laid the foundational infrastructure from which changes in communities and systems, change agents, risk behavior, and disparities are expected in later years. Perhaps the most important lesson learned from the REACH 2010 communities in building this infrastructure will be how to help other communities develop interventions to eliminate racial and ethnic disparities. A comprehensive evaluation of community-based participatory research (CBPR), also in its early stages, indicated great potential for this approach in addressing community-level health issues [23].

The potential of the Pfizer Foundation Southern HIV/AIDS Prevention Initiative CBOs is also being realized, as indicated by this assessment. However, there were limitations to this assessment that must be considered in interpreting the findings. First, the number of survey respondents was relatively small, which precluded robust analyses and limited the power to detect statistically significant relationships, such as those between capacity and early (year 01) success in meeting stated goals and objectives. Second, this was a self-rating of organizational capacity by key staff members; no formal testing of staff members’ knowledge, skills, or abilities in planning and implementing strategies and intervention activities was conducted. Also, the C-PAS was conducted in the first quarter of year 02 of this 3-year project, after the MSM PRC had already engaged the CBOs in training and technical assistance activities. Therefore, the true baseline capacity of the CBOs prior to technical assistance and first-year planning of intervention activities was not measured here.

Nevertheless, CBOs, including the Initiative organizations, have become an essential component in better understanding the context of the HIV/AIDS epidemic in local communities and the contextual framework for disease prevention and health promotion activities. CBOs work to identify and change HIV/AIDS attitudes, knowledge, and perceptions; build genuine trust among marginalized populations; and assist in the navigation and provision of testing and counseling services [5–7]. Possessing the capacity to understand the HIV/AIDS epidemic enhances other capacity-building efforts.

The effort of CBOs in this Initiative to impact the HIV/AIDS epidemic in the southern region presents unique potential obstacles. The epidemic continues to adversely and disproportionately impact communities in the southern region, particularly African-Americans and Hispanics. While the estimated number of new AIDS cases in the nation increased 7.1% between 2000 and 2004, new AIDS cases increased 20.0% in the South during this period [24]. The southern region had the highest estimated number of new AIDS cases (more than 46% of the US total) and the highest estimated number of persons living with AIDS (nearly 39% of the US total) in 2004 [24]. African-Americans (6.9%) and Hispanics (8.2%) experienced higher incidences of new AIDS cases compared with Whites (5.3%) between 2000 and 2004 [24]. Moreover, compared to other regions, the South has the highest percentage of non-elderly uninsured persons and the highest percentage of persons living below poverty and in rural communities [25]. Given the greater impact of the HIV/AIDS epidemic in the southern region of the United States, CBOs are essential to the frontline battle to conquer the epidemic.

Capacity building for HIV/AIDS prevention, and health promotion programs in general, is centered on building organizational infrastructures to deliver programs in response to specific public health problems [26–28]; sustain programs over time through multiple agency partners [29, 30]; and build the capacity of organizations to identify health problems and develop appropriate responses [31–34]. In this regard, CBOs of the Pfizer Foundation Southern HIV/AIDS Prevention Initiative have established the prerequisite capacity to lead HIV/AIDS prevention efforts, to address health disparities, and to deliver culturally appropriate interventions in local communities and the greater southern region. Addressing self-acknowledged technical assistance needs—possibly through collaborations with academic and professional partners—is essential to maximize prevention capacity and the likelihood of success.

Contributor Information

Robert M. Mayberry, Email: robermay@baylorhealth.edu, Institute for Health Care Research and Improvement, Baylor Health Care System, 8080 North Central Expressway, Suite 500, Dallas, TX 75206, USA.

Pamela Daniels, Program for Healthcare Effectiveness Research, Morehouse School of Medicine, Atlanta, GA 30310, USA.

Tabia Henry Akintobi, Prevention Research Center, Morehouse School of Medicine, Atlanta, GA 30310, USA.

Elleen M. Yancey, Prevention Research Center, Morehouse School of Medicine, Atlanta, GA 30310, USA

Jamillah Berry, Prevention Research Center, Morehouse School of Medicine, Atlanta, GA 30310, USA.

Nicole Clark, Prevention Research Center, Morehouse School of Medicine, Atlanta, GA 30310, USA.

References

1. Richter DL, Prince MS, Potts LH, et al. Assessing the HIV prevention capacity building needs of community-based organizations. Journal of Public Health Management and Practice. 2000;6(4):86–97. doi: 10.1097/00124784-200006040-00015.
2. Centers for Disease Control and Prevention. Evaluating CDC-Funded Health Department HIV Prevention Programs, Volume 1: Guidance. Atlanta, GA: 2001 [updated 2001; cited 2007 May 7]. Available from: http://www.cdc.gov/hiv/aboutdhap/perb/hdg.htm#vol1.
3. Centers for Disease Control and Prevention. HIV Prevention Strategic Plan Through 2005. 2001 [updated 2001; cited 2007 May 7]. Available from: http://www.cdc.gov/hiv/resources/reports/psp/pdf/prev-strat-plan.pdf.
4. Rugg D, Buehler J, Renaud M, Gilliam A, Heitgerd J, Westover B, Wright-Deaguero L, Bartholow K, Swanson S. Evaluating HIV prevention: A framework for national, state and local levels. American Journal of Evaluation. 1999;20:35–56.
5. Somlai AM, Kelly JA, Otto-Salaj L, et al. Current HIV prevention activities for women and gay men among 77 ASOs. Journal of Public Health Management and Practice. 1999;5(5):23–33. doi: 10.1097/00124784-199909000-00006.
6. Silvestre A, Arrowhead S, Ivery J, Barksdale S. HIV-prevention capacity building in gay, racial and ethnic minority communities in small cities and towns. Health and Social Work. 2002;27(1):61–67. doi: 10.1093/hsw/27.1.61.
7. Klein SJ, O’Connell DA, Devore BS, Wright LN, Birkhead GS. Building an HIV continuum for inmates: New York state’s Criminal Justice Initiative. AIDS Education and Prevention. 2002;14(B):114–123. doi: 10.1521/aeap.14.7.114.23856.
8. Cheadle A, Sullivan M, Krieger J, et al. Using a participatory approach to provide assistance to community-based organizations: The Seattle Partners Community Research Center. Health Education and Behavior. 2002;29(3):383–394. doi: 10.1177/109019810202900308.
9. Napp D, Gibbs D, Jolly D, Westover B, Uhl G. Evaluation barriers and facilitators among community-based HIV prevention programs. AIDS Education and Prevention. 2002;14(3 Suppl A):38–48. doi: 10.1521/aeap.14.4.38.23884.
10. Fink A. How to analyze survey data. Thousand Oaks, CA: Sage Publications; 1995.
11. DiFranceisco W, Kelly JA, Otto-Salaj L, et al. Factors influencing attitudes within AIDS service organizations toward the use of research-based HIV prevention interventions. AIDS Education and Prevention. 1999;11(1):72–86.
12. Argyris C, Schon D. Organizational learning: A theory of action perspective. Reading, MA: Addison-Wesley; 1978.
13. Atkisson CC, Hargreaves A. A conceptual model for program evaluation in health organizations. In: Schulberg HC, Baker F, editors. Program evaluation in the health fields. New York, NY: Human Sciences Press; 1979.
14. Birleson P. Turning child and adolescent mental-health services into learning organizations. Clinical Child Psychology and Psychiatry. 1999;4:265–274.
15. Brazil K. A framework for developing evaluation capacity in health care settings. International Journal of Health Care Quality Assurance Incorporating Leadership in Health Service. 1999;12:vi–xi. doi: 10.1108/13660759910249693.
16. Love AJ, editor. Developing effective internal evaluation. San Francisco, CA: Jossey-Bass, Inc; 1984.
17. Patton MQ. Utilization-focused evaluation: The new century. Thousand Oaks, CA: Sage Publications; 1997.
18. Preskill H, Torres RT. The learning dimension of evaluation use. New Directions for Evaluation. 2000;88:25–38.
19. Stockdill SH, Baizerman M, Compton DW. Toward a definition of the ECB process: A conversation with the ECB literature. New Directions for Evaluation. 2002;93:1–25.
20. Chen HT. Designing and conducting participatory outcome evaluation of community-based organizations’ HIV prevention programs. AIDS Education and Prevention. 2002;14(3 Suppl A):18–26. doi: 10.1521/aeap.14.4.18.23879.
21. Giles WH, Tucker P, Brown L, et al. Racial and ethnic approaches to community health (REACH 2010): An overview. Ethnicity and Disease. 2004;14(3 Suppl 1):S5–S8.
22. Graham GN. REACH 2010: Working together to achieve the goal of eliminating health disparities. Journal of Health Care for the Poor and Underserved. 2006;17(2 Suppl):6–8. doi: 10.1353/hpu.2006.0086.
23. Viswanathan M, Ammerman A, Eng E, Gartlehner G, Lohr KN, Griffith D, Rhodes S, Samuel-Hodge C, Maty S, Lux L, Webb L, Sutton SF, Swinson T, Jackman A, Whitener L. Community-based participatory research: Assessing the evidence. Rockville, MD: Agency for Healthcare Research and Quality (Prepared by RTI—University of North Carolina Evidence-based Practice Center under Contract No. 290-02-0016); 2004. Report No.: AHRQ 04-E022-2.
24. Centers for Disease Control and Prevention. HIV/AIDS Surveillance Report, 2004. Vol. 16. Atlanta, GA: U.S. Department of Health and Human Services, Centers for Disease Control and Prevention; 2005 [updated 2005; cited 2007 May 7]. Available from: http://www.cdc.gov/hiv/topics/surveillance/resources/reports/2004report/pdf/2004SurveillanceReport.pdf.
25. U.S. Census Bureau. Current Population Survey, 2003 Annual Social and Economic Supplement. 2003 [updated 2003; cited 2007 May 25]. Available from: http://www.census.gov/apsd/techdoc/cps/cpsmar03.pdf.
26. Roper W, Baker EJ, Dyal W, Nicola R. Strengthening the public health system. Public Health Reports. 1992;107(6):609–615.
27. Meissner H, Bergner L, Marconi K. Developing cancer control capacity in state and local public health agencies. Public Health Reports. 1992;107:15–23.
28. Schwartz R, Smith C, Speers MA, et al. Capacity building and resource needs of state health agencies to implement community-based cardiovascular disease programs. Journal of Public Health Policy. 1993;14(4):480–494.
29. Bracht N, Finnegan JR Jr, Rissel C, et al. Community ownership and program continuation following a health demonstration project. Health Education Research. 1994;9(2):243–255. doi: 10.1093/her/9.2.243.
30. Rissel C, Finnegan J, Bracht N. Evaluating quality and sustainability: Issues and insights from the Minnesota Heart Health Program. Health Promotion International. 1995;10(3):199–207.
31. Biegel DE. Help seeking and receiving in urban ethnic neighborhoods: Strategies for empowerment. In: Rappaport J, Swift C, Hess R, editors. Studies in empowerment: Steps toward understanding and action. New York: Haworth Press; 1984.
32. Eng E, Parker E. Measuring community competence in the Mississippi Delta: The interface between program evaluation and empowerment. Health Education Quarterly. 1994;21:199–200. doi: 10.1177/109019819402100206.
33. Thomas R, Israel B, Steuart G. Cooperative problem solving: The neighborhood self help project. In: Cleary H, Kichen J, Grinso P, editors. Case studies in the practice of health education. Palo Alto, CA: Mayfield; 1984.
34. Hawe P, Noort M, King L, Jordens C. Multiplying health gains: The critical role of capacity-building within health promotion programs. Health Policy. 1997;39:29–42. doi: 10.1016/s0168-8510(96)00847-0.
