Author manuscript; available in PMC: 2016 Nov 14.
Published in final edited form as: AIDS Educ Prev. 2011 Jun;23(3 Suppl):110–116. doi: 10.1521/aeap.2011.23.3_supp.110

TABLE 1.

ITERATIVE EVALUATION OF A MOBILE HIV COUNSELING AND TESTING PROGRAM

(1) Does mobile testing reach a different population than clinic-based health department testing?
A mobile program that used culturally similar outreach workers and targeted high-risk venues reached populations that were almost twice as likely never to have been tested and significantly more likely to have had unprotected anal or vaginal sex since their last test.
 Health Department Testing (n = 1838) vs. CBO Testing (n = 610):
  People of color – 29% vs. 84% (p < 0.001)
  Never tested – 22% vs. 40% (p < 0.001)
  Unprotected anal or vaginal sex since last HIV test – 54% vs. 72% (p < 0.001)
(2) Does offering incentives increase identification of new positives?
A $10 incentive made the testing program four times more effective in reaching people at risk and identifying HIV cases:
 No incentive – 362 tested (5 HIV positive)
 $10 incentive – 1437 tested (25 HIV positive)
(3) How do alternative testing strategies impact program effectiveness?
Offering only rapid testing nearly doubled the proportion of clients who received their results, compared with offering oral fluid testing in combination with other test strategies:
 Oral fluid testing (n = 829) – 55% received results
 Rapid testing (n = 1470) – 99% received results (p < 0.001)
(4) How does interactive HIV computer counseling impact counseling quality, program productivity, and evaluation capabilities?
CBO staff and health department payers appreciated the impact of the CARE tool:
 Counseling Quality – Minimally trained counseling staff valued the way the CARE tool guided them through all key components of counseling, and felt the program improved their ability to provide longitudinal counseling in the field by bringing up past risk behaviors and counseling plans.
 Program Productivity – Because the CARE tool eliminated time spent in the office entering data, staff could spend more time in the field reaching clients.
 Timeliness of Evaluation Reports – Staff appreciated evaluation data for the first time because they received it in real time and could better tailor their outreach. Program administrators valued the automated evaluation reports and, for the first time, were able to submit reports to the health department on time and receive prompt payment.