PLOS One. 2020 Aug 13;15(8):e0236984. doi: 10.1371/journal.pone.0236984

Lessons from the evaluation of the South African National Female Condom Programme

Mags Beksinska 1,*, Phumla Nkosi 1, Zonke Mabude 1, Joanne E Mantell 2, Bongiwe Zulu 1, Cecilia Milford 1, Jennifer A Smit 1
Editor: Collins Iwuji
PMCID: PMC7425948  PMID: 32790677

Abstract

Background

Understanding of the facilitators and challenges to female condom (FC) uptake has been limited due to lack of evaluation of national FC programmes.

Setting

The FC has been an integral component of South Africa’s (SA) HIV prevention programme for 20 years and is the largest government-funded FC programme worldwide.

Methods

The national FC evaluation used a mixed-methods approach and consisted of key informant interviews and a telephone survey in a national sample of public and non-public sites. A sub-sample of sites participated in client and provider interviews, and a self-administered client survey. A review of distribution statistics from South Africa’s District Health Information System was also conducted.

Results

All 256 public-sector and 28 non-public-sector facilities reported having ever distributed FCs. Less than 5% of these facilities reported stock-outs, and less than 3% reported having a supply of expired FCs. Systems for male condom (MC) and FC distribution were complementary, with similar ordering, delivery and reporting processes. FC promotion by providers (n = 278) varied with regard to FC training, whether attitudes about FCs influenced providers’ offers of FCs, and how they counselled clients about FCs. Of the 4442 self-administered client surveys in 133 facilities, similar proportions of women (15.4%) and men (15.2%) had ever used FCs. Although FCs were available at almost all sites surveyed, only two-thirds of clients were aware of their availability.

Conclusion

Data highlight the role of providers as gatekeepers to FC access in public and non-public sectors and provide support for further FC programme expansion in SA and globally.

Introduction

The female condom (FC) is an important multipurpose prevention technology (MPT) combining protection against unintended pregnancy, HIV and sexually transmitted infections (STIs) [1], and is the only female-initiated HIV prevention barrier method. Female condoms protect against pregnancy 95% of the time during perfect use, and 79% of the time during typical use [2]. The polyurethane FC1 has demonstrated high efficacy against HIV and STIs, but there are no published studies on the efficacy of newer FC designs in preventing HIV and STIs [3].

Lack of commitment by major donors to support FC programming has meant that FCs have not been an accessible prevention option in many of the countries hardest hit by HIV and unintended pregnancy [4]. A comprehensive analysis attributed the female condom’s failure to reach its full potential to a lack of acceptability in the international policy arena, which has led to a reticence to support its introduction, rather than to potential users’ perspectives [4]. The FC was identified by the Reproductive Health Supplies Coalition as one of several under-used reproductive health technologies with the potential to expand choice in reproductive health and family planning programmes [5]. However, despite increased FC distribution globally [6], distribution remains low relative to male condoms (MCs), accounting for only 0.19% of global condom procurement [5], and this imbalance is reflected in donor commitment [6–8]. Despite this, there has been significant progress in FC technology: three new FC brands have been prequalified by the World Health Organization (WHO)/United Nations Population Fund (UNFPA) since 2012 [9], and others are under development [10].

The South African government launched a formal, three-phase “Female Condom Introduction Programme” in 1998 [11, 12], which focused initially on family planning clinics, promoting the FC as a dual method for preventing pregnancy and disease. With the programme’s geographical expansion, the government complemented the public-sector programme with the donation of free FCs on request to non-governmental organisations (NGOs). In the context of extremely high HIV and unintended pregnancy rates, the South African programme has been scaled up considerably. In 2012, South Africa procured one billion MCs and 11 million FCs, with the aim of ensuring at least one FC distribution site in each of the country’s 254 sub-districts [13]. By 2014, the National Department of Health (NDoH) had made FCs available to all public-sector sites, expanded distribution to non-public sites and tertiary institutions, and added two new FC products (Cupid and Pleasuremore), thus increasing consumer choice of barrier methods. While MCs had been socially marketed in South Africa for 20 years [14], FCs were not added to the socially marketed “Lovers Plus” brand until 2015. The Lovers Plus FC (FC2) was rebranded as an “inner condom”, using the same Lovers Plus packaging as the male condoms, for public-sector distribution. The rebranding stemmed from the concern that marketing the product as a ‘female’ condom might limit its appeal to potential male purchasers. However, the socially marketed FC was discontinued a year later due to poor sales.

Distribution targets for FCs and MCs have been set in South Africa’s National Strategic Plans [15, 16]; 25 million FCs were to be distributed yearly by 2016 [15], a goal that was exceeded by 2 million. FCs were available for distribution not only in health facilities but also in non-traditional venues such as airports, hotels, shebeens (bars), tertiary institutions, mines and correctional facilities. In the 2017–2022 Strategic Plan [16], a target of 40 million FCs was set for FY 2021–2022. Despite these distribution targets, FC uptake has been low: only 7.2% in a population-level survey conducted in 2008 [17], and a recent review of the FC in South Africa found that FC use ranged between 2.9% and 38.7% [18]. Today, South Africa has one of the most robust FC programmes globally.

There is a dearth of comprehensive evaluation data on national FC programmes globally. South Africa, Brazil and India have the largest FC programmes supported by ministries of health, yet no large-scale comprehensive evaluation of these countries’ programmes has been conducted [8]. This lack of information limits our understanding of factors related to FC distribution, commodity availability at service delivery points, uptake and use, and leaves many policy, programmatic and user questions unresolved. To address this gap, we conducted a process evaluation of South Africa’s National Female Condom Programme to understand what FC programming is being implemented and how, and to identify programme strengths and weaknesses, so as to provide guidance to the South African government on how to improve FC access and uptake.

Materials and methods

Selection of sites

We selected both public-sector and non-public-sector sites. The public health sector sampling frame comprised the National STIs Sentinel Surveillance Sites, which include approximately 30 sites in each of the nine provinces (n = 270) [19]. The non-public sector sample aimed to include one non-governmental organization (NGO), one tertiary education institution, one social-marketing outlet, and one private-sector site in each province (n = 36). Tertiary institutions were selected because they are sites for national female and male condom programming, and social marketing outlets were targeted because of the South African government’s launch of a socially marketed FC managed by the Society for Family Health (the local affiliate of Population Services International). Social marketing outlets refer to retail channels such as stores and petrol stations. Sites were randomly selected where possible from a list of non-public FC-distributing sites. All public and non-public sector sites were asked to participate in a telephone interview.

The on-site assessment sample was selected randomly and proportionally from the STI surveillance sites based on four criteria: (1) location (rural, urban, peri-urban); (2) level of care (community health center or primary health care (PHC) clinic); (3) well-established long-term FC distribution (>5 years) versus newer sites (<2 years); and (4) sites distributing different brands of FC products. Between 11 and 14 sites were selected per province.

Sampling and data collection methods

The national FC evaluation used a convergent parallel design combining seven discrete data collection methods, primarily quantitative, across diverse groups of participants. The aim was to obtain a comprehensive understanding of the context of service-delivery challenges, from commodity procurement, distribution and storage to availability and uptake, using triangulation to validate findings across these sources (see Table 1) [20]. Qualitative and quantitative data were collected simultaneously, although with some time lags between data collection activities; they were analysed separately and then integrated in the final analysis [20]. The first level of data collection consisted of key informant interviews, a desk review of national FC distribution statistics, and a telephone survey of public and non-public sector sites that offered the female condom. In the second level, we drew a sub-sample of these public and non-public sector sites to participate in site assessments, provider interviews, client anonymous surveys, and client exit interviews. All data collection was conducted by a cadre of research interviewers trained in quantitative and qualitative methods. Two sub-studies under the FC evaluation initiative (a descriptive cost study, and a cohort of women who were new FC users and, for a sub-set of these women, their male partners) were conducted in only one province and are outside the scope of the national evaluation.

Table 1. National female condom evaluation data sources, data type, target populations and sites.

Data Source Data Type Domains for This Analysis Target Population and Target Numbers Site
Key Informant Interview Qualitative National government policies and programmes, procurement, condom distribution, storage, supply chain management, availability in provincial and district service delivery sites, monitoring & evaluation, demand creation for FCs 20–25 policymakers, programme managers, individuals involved in condom social marketing strategies at a district, provincial and national level Country-wide; not site-specific
Telephonic Survey Quantitative Condom procurement, stock-outs, storage and distribution; DHIS condom distribution data (compared with the 3-month data given in the telephonic survey) 270 public sector sentinel surveillance sites; 36 non-public sector sites (one of each of the following categories per province: 1. Tertiary education, 2. NGO, 3. Private, 4. Social marketing) All sites; 270 public sector sites*
Site Assessment Quantitative Condom distribution, availability of female vs. male condoms, IEC materials and condom models, health education talks, knowledge and attitudes about condoms Up to 150 sites Sub-set of sites
Provider Interview Quantitative FC and MC training and perceived need for more training, whether provider discussed condom use with female clients, whether provider gave condoms to male and female clients, whether provider demonstrated FC use to new users and reasons for not always demonstrating use, frequency of FC and MC education of clients, and FC attitudes. 278 providers Sub-set of sites
Client Exit Interview Quantitative First FC use, source of first FC, reasons for use, whether provider explained how to use FC and adequacy of information, preference for type of condom, whether offered FC by provider or requested by client 427 female clients, 18–49 years, who were current or ex-users of FC Sub-set of sites
Client Anonymous Survey Quantitative FC awareness, knew of FC availability at site, whether offered FC by provider, ever used FC, and if so, whether used with partner, condom preference, reasons never tried using FC 4442 female and male clients Sub-set of sites

*Non-public sector sites did not report to DHIS at time of study.

Because SA has an integrated female and male condom programme, data on MCs were collected in all components of the evaluation, although in less detail.

The client anonymous survey and client exit interview, as well as the consent form for the latter, were translated into all 11 South African languages and back-translated by experienced translators to check for accuracy. The client interviews and consent processes were conducted in participants’ language of choice.

Key informant interviews

Key informant interviews were qualitative in nature and designed to elicit a better understanding of the context of system-level FC issues. We purposively selected policymakers and programme managers, as well as individuals involved in social marketing strategies at district, provincial and national level, to ensure representation of a range of views about the FC. An initial list of key informants (KIs) was drawn up based on the research team’s knowledge of key role players in the public and non-public healthcare sectors and discussion with the National Department of Health. The eligibility criterion was a minimum of one year in the current position, and we aimed to conduct between 20 and 25 interviews, depending on data saturation and availability of key informants. Interviews were conducted either face-to-face or via telephone. The interviews aimed to identify critical issues in the FC chain, including advocacy, overall programme leadership and coordination, supply and commodity security, provider training, monitoring, and integration with other programmes. Audio-recordings were transcribed.

Review of condom distribution statistics from the District Health Information System (DHIS)

The District Health Information System (DHIS) is a web-based data analytics and information system that tracks health service delivery in South Africa’s public health sector. DHIS data are used for health service planning and monitoring. In this evaluation, we reviewed DHIS condom distribution data for each participating site for the same three months for which site assessment data were inspected (February-April 2014).

Telephonic surveys of public and non-public sector sites

In the telephonic survey, we collected information by interviewing the facility manager or their designee over the telephone. The questionnaire and consent form were sent in advance to allow time for the required data to be collated. The survey included questions on the numbers of male and female condoms distributed, where dispensing to clients takes place, distribution to other sites, availability of Information, Education, and Communication (IEC) materials and condom models, staff complement, and staff training on the FC. This information was subsequently verified in the on-site assessment.

Female condom-distributing site assessments

The site assessments were scheduled following completion of the telephonic interview. The quantitative site assessments focused on condom storage, supply and distribution issues (e.g., distribution points, stock-outs, expired stock, sub-distribution to other sites); availability of IEC materials and condom demonstration models; and provider condom education talks to clients.

Provider interviews

We purposively selected a sample of providers, including operational managers and clinicians, for quantitative interviews. The number of interviews per site was based on the total number of staff employed at the facility. If the total staff number was five or more, we interviewed three staff; if less than five, two staff were interviewed. The quantitative interview focused on female condom knowledge and attitudes, counselling practices, and condom dispensing and logistics training. We included 24 items assessing attitudes about the FC, e.g., sexual pleasure, inconvenience, improved prophylaxis, insertion reluctance, and trust [21].

Client anonymous surveys

These anonymous self-administered brief quantitative surveys were completed by women and men and were designed to be completed in one to two minutes. On the morning of the site assessment, research staff introduced the study in waiting areas and invited all clients in this area to participate. In addition, surveys were left at reception as well as on tables and unoccupied chairs. Participants were asked about their awareness of the FC, whether ever offered the FC, and if so, whether they accepted this offer, ever used the FC, and if used, reasons for first use, where they obtained FCs, the counselling received, and their experience using the product. Never FC users were asked about their reasons for non-use.

Client exit interviews

These quantitative interviews were conducted with women aged 18–49. Data from women who participated in these interviews were not linked with data from women who participated in the anonymous surveys. During the facility visit, research staff informed clients in the different waiting areas throughout the day that any current or former FC users could participate in an interview at the end of their consultation. The interview duration was about one hour. Data presented here focus on reasons for first FC use, the source of the first FC, whether FCs were offered by providers or requested by the client, whether the provider explained how to use the FC and the adequacy of that information, and, if offered, choice of condom type.

Data analysis

Quantitative data were captured and descriptively analyzed using StataIC v14 (StataCorp, College Station, TX). Pearson’s chi-square or Fisher’s exact tests of association were calculated for categorical variables.
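As an illustration of the categorical tests used here, one comparison from Table 4 (FC available in at least one consulting room at 225 of 284 sites, versus 246 of 284 for MCs) can be re-checked with a Pearson chi-square test with continuity correction. The sketch below uses only Python's standard library rather than Stata, purely for illustration; the function name is ours.

```python
# Illustrative re-check of one Table 4 comparison: a Pearson
# chi-square test with Yates continuity correction (df = 1).
import math

def yates_chi2_2x2(a, b, c, d):
    """Chi-square test with continuity correction for the 2x2 table
    [[a, b], [c, d]]; returns (chi2, p) for 1 degree of freedom."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    chi2 = 0.0
    for obs, row, col in [(a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)]:
        exp = row * col / n
        chi2 += (abs(obs - exp) - 0.5) ** 2 / exp
    # survival function of the chi-square distribution with 1 df
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# FC in >=1 consulting room: 225 of 284 sites; MC: 246 of 284.
chi2, p = yates_chi2_2x2(225, 284 - 225, 246, 284 - 246)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # p = 0.026, matching Table 4
```

The resulting p-value (0.026) reproduces the figure reported for that row of Table 4, consistent with a continuity-corrected chi-square having been used for the 2x2 comparisons.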

Thematic analysis of the key informant interview data was conducted, identifying important and emerging themes. An initial codebook was created collaboratively by team members, and a portion of transcripts was double-coded to test the codebook. Discrepancies in coding and in the application of codes to the transcripts were resolved in team meetings of coders so as to improve inter-rater coding reliability. The remaining transcripts were also double-coded. The final codebook reflected codes developed both inductively and deductively. NVivo (Version 10, QSR International) was used to facilitate systematic data management.

A unit-cost analysis was conducted at eight sites in order to establish FC program costs; findings from this analysis are reported elsewhere [22].

Ethical considerations

The evaluation was approved by the Human Research Ethics Committee (HREC) at the University of the Witwatersrand (M140428/M140365). Written informed consent was obtained from all participants (except for the self-administered client anonymous survey).

Results

Description of sample

Data were collected between August 2014 and September 2016. Of the 270 sentinel surveillance sites, six declined to participate, and a further eight were excluded due to boundary changes, mergers and closures, resulting in 256 public-sector sites. Twenty-eight non-public-sector sites were included. Although an NGO distributing FCs was identified in each province, not all provinces had all three categories (tertiary education, social marketing, and private sector).

Table 2 presents the sample size for each data collection method by province. The number of telephone surveys and site assessments conducted was similar across the nine provinces. Variation was due to the 14 sites excluded, as explained in the previous paragraph, and to the availability of non-public sector sites in each province. Some of the smaller clinics in rural areas such as the Eastern Cape had few clients who had ever used FCs available for interview. Limpopo had the largest number of providers interviewed because all sites except one had a large staff complement and staff were made available for interview.

Table 2. Sample size by data collection method and province.

Province Phone Survey N = 284 Site Assessment N = 133 Provider Interview N = 278 Client Exit Interview N = 426 Anonymous Survey N = 4442 Key Informant Interviews N = 26
Eastern Cape 30 14 27 11 149 3
Free State 31 13 20 52 436 1
Gauteng 32 16 30 83 780 1
KwaZulu-Natal 34 15 34 63 575 4
Limpopo 33 16 47 59 508 1
Mpumalanga 31 16 26 46 504 1
Northern Cape 31 15 34 24 369 1
North West 32 14 35 36 359 1
Western Cape 30 14 25 55 762 1
National - - - - - 12

Table 3 presents the distribution of the sample for in-depth site assessments in each province according to site location (rural, peri-urban, or urban), facility type (public sector primary health clinic, public sector community health center, or non-public sector site), length of FC distribution, and number of sites distributing more than one type of FC. Provinces differed in the proportion of site locations (rural/peri-urban/urban) according to the proportion of sites of a similar location in the National Surveillance Site sample. There were fewer Community Health Centres (CHCs) in the National sample, with some rural provinces having no CHC, or only one, in the sample. The length of FC distribution was greater than five years at most sites, with some sites having commenced the programme in the last few years and fewer in the 2- to 5-year range of distribution. This distribution reflected the National FC Programme, which was implemented in a phased approach. The aim was to include all non-public sector sites participating in the telephone survey in the on-site assessment; however, only 19 of the 28 participated, for a number of reasons. Some private sites were unable to accommodate the research team during the provincial visit, or the FC programme was delivered in a way that meant potential clients were not available for interview on site (e.g., condoms delivered directly to the community).

Table 3. Distribution of sample of in-depth site assessments in each province by site location, facility type, years of FC distribution, and having more than one FC during site visit.

Province and Number of In-Depth Site Assessment Visits Site Location Facility Type Years of FC Distribution Sites with >1 FC*
Rural Peri-urban Urban PHC CHC Non-public <2 2–5 >5
Eastern Cape (n = 14) 11 2 1 12 1 1 3 1 10 0
Free State (n = 13) 7 3 3 11 0 2 7 0 6 5
Gauteng (n = 16) 0 6 10 10 3 3 2 2 12 12
KwaZulu-Natal (n = 15) 8 3 4 11 1 3 3 1 11 1
Limpopo (n = 16) 11 2 3 12 1 3 5 0 11 11
Mpumalanga (n = 16) 13 1 2 12 2 2 3 2 11 1
Northern Cape (n = 15) 9 4 2 11 2 2 2 1 12 13
North West (n = 14) 8 3 3 10 1 3 2 0 12 7
Western Cape (n = 14) 7 1 6 10 2 2 2 1 11 13

*Refers to site having >1 FC type (brand) on site.

PHC = Public Sector Primary Health Clinic.

CHC = Public Sector Community Health Center.

Themes

Results are categorized into four sections, starting with system issues and then shifting to health care provider and client/user issues: (1) Overview of national government policies and programmes; (2) Commodity procurement, supply chain management, distribution, availability, and monitoring and evaluation in provincial and district service delivery venues; (3) Provider experience in the female condom programme; and (4) Client experiences with the FC. Comparable data are synthesized across data collection methods.

Overview of national government policies and programmes

Key informants provided their perspectives on the national FC programme and many discussed condoms in general, rather than differentiating between FC and MC. They noted that FC and MC programming is integrated in health settings, educational institutions and workplaces. Despite this integration, key informants indicated a lack of harmonisation of policies, especially across health, education and social security sectors.

The condom programme is funded primarily by the national SA government, with some additional support from international funders. Despite annual increases in FC budgets, the programme was thought to be challenged by insufficient funding, largely because FC demand exceeds allocated budgets. FC and MC programming policies are integrated. There are varying degrees of national involvement in developing national policy for condom programming, some of which is done collaboratively with partners such as PEPFAR, USAID and UNFPA. The National Department of Health also conducts policy reviews, disseminates policies and monitors implementation of these policies.

Technical task teams, which include provincial and NGO representatives, have been established to review existing policies, especially regarding issues of access and sustainability, and set programming and research agendas. Condom programming policies are integrated into diverse health programmes, e.g., family planning, voluntary medical male circumcision, HIV counselling and testing as well as in other sectors, e.g., Department of Education. Research data from end users are used to inform policymaking about condom programming policies.

Condom distribution targets are set at national level, and these are divided into provincial and district targets. Population-distribution statistics and logistics-management systems are used to determine quantities of condoms. The National Condom Distribution Plan 2013–2016 provides managers with guidance on the implementation and evaluation of condom distribution in their areas [23]. Monitoring and evaluation (M&E) systems are well-established, with FC distribution added to the DHIS reporting requirements in 2013.

Commodity procurement, storage, supply chain management, distribution, availability, and monitoring and evaluation in provincial and district service delivery venues

The Logistics Management Information System, along with DHIS and M&E data, is used to estimate condom procurement and distribution needs, but key informants still identified challenges. Procurement and storage systems for FCs and MCs were noted to be similar in all public-sector sites. Both types of condoms were collected or delivered within a week of ordering at 74.6% of sites; the exceptions were a few rural locations, where delivery took up to six months. Three-quarters (77.8%) of sites had a store-room/dispensary for FCs and MCs, but a third (31.6%) noted that correct storage was a challenge, primarily because condom boxes were kept directly on damp ground rather than in a dry environment away from sunlight. Key informants also indicated that storage capacity was limited.

Key informants noted that various issues in the supply chain management process led to unavailability of stock (either complete absence or limited availability). These included delays from international manufacturers, inability of suppliers to meet demand for condoms, suppliers’ unwillingness to register on the procurement database, difficulties monitoring condom commodities (e.g., exposure to sun, quality) from suppliers, condom costs driven by fluctuations in exchange rates, expiration of supplier contracts, delayed payment to suppliers, and withdrawal of suppliers from the contract after the award had been made. The majority of key informants indicated they had experienced FC stock-outs.

Telephone survey and site assessment data indicated that all 284 sites had ever distributed FCs, half (53.8%) for more than five years, while 18.7% had commenced distribution within the last two years. Only a small proportion of sites (2.8%) had had stock expire in the last year. Fourteen (4.9%) sites reported stock-outs for a range of reasons: a depleted FC supply (n = 7); late ordering of FCs (n = 2); no demand for FCs, or rumours that FCs were not being used for their intended purpose, so staff did not re-order (n = 2); and one site identified itself as a non-designated FC distribution site.

FC distribution between sites in one three-month period during the evaluation varied widely, from no units to more than 200,000 units per month. In nearly three-fifths (57%) of the 114 public-sector sites participating in the on-site assessment, there was no agreement among the three data sources (telephone survey, site visit and DHIS) in at least one of the three months. Reasons for the discrepancies were mainly unknown or were assumed to be due to missing records.
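The three-source comparison described above amounts to a simple per-month consistency check, which can be sketched as follows. This is a hypothetical illustration only: the function name and data layout are ours, not drawn from the study's actual analysis code.

```python
# Hypothetical sketch of the three-source consistency check:
# flag a site if the telephone survey, site visit and DHIS
# disagree on FC distribution in any of the three months.
def sources_disagree(telephone, site_visit, dhis):
    """Each argument is a list of three monthly FC distribution
    counts (Feb-Apr 2014) for one site. Returns True if, in at
    least one month, the three sources do not all report the
    same figure."""
    return any(len({t, s, d}) > 1
               for t, s, d in zip(telephone, site_visit, dhis))

# Example: sources agree in Feb and Mar but not in Apr -> flagged.
print(sources_disagree([120, 80, 100], [120, 80, 100], [120, 80, 95]))  # True
```

Applied across the 114 assessed public-sector sites, a check of this kind would flag the 57% of sites where the sources disagreed in at least one month.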

Overall, a quarter (25.3%) of public-sector sites reported sub-distribution to other sites (NGOs, garages, taverns, brothels and taxi ranks), compared with 66% of NGOs that distributed FCs directly from their sites. Condom distribution in non-traditional outlets was seen as a way to increase access to FCs and decrease the possibility of stigma linked to accessing them in healthcare settings; however, these outlets often had no trained person to support clients’ FC use.

For the female condom, it’s a little bit different could because you can’t just put a you know, a box of female condoms down a shebeen. […] we’ve gotten instructions from NDOH to not do that.

(NGO-Gauteng Province)

Key informants also described partnerships between DoH and NGOs, where NGOs also provided distribution support, especially distributing condoms to hard-to-reach areas in rural communities as well as to key populations such as sex workers, men who have sex with men and youth.

Availability of the MC to clients was higher than that of the FC at all site distribution points except female toilets, where similar percentages were reported (Table 4). FC leaflet availability was higher than that of the MC, as FC instruction leaflets are provided by the manufacturers of each brand, though these are limited to English. Availability of FC and MC IEC materials was similar, but availability of demonstration models for MCs was higher than that for FCs.

Table 4. Condom, IEC materials and demonstration model availability by distribution area.

FC (N = 284) MC (N = 284)
Distribution points % (n) % (n) p-value
 At least one consulting room 79.2 (225) 86.6 (246) 0.026
 Waiting area 65.1 (185) 76.4 (217) 0.004
 Female Toilet 34.5 (93) 33.8 (96) 0.86
 Male toilet 23.8 (64) 40.5 (115) <0.001
 Corridor 9.8 (28) 11.6 (33) 0.59
 Outside site (wall/gate) 19.0 (54) 29.9 (85) 0.003
IEC (leaflets/posters)
Waiting area-leaflets 29.9 (85) 10.9 (31) <0.001
Consultation room-leaflets 27.1 (77) 13.0 (37) <0.001
Waiting room posters 27.1 (77) 13.7 (39) <0.001
Consultation room posters 10.9 (31) 3.9 (11) 0.002
Demonstration model (dildo or vaginal model)
Waiting area 13.3 (38) 40.1 (114) <0.001
Consultation rooms 22.2 (63) 77.8 (221) <0.001

Key informants at all levels described the process of M&E, which begins when condoms are procured, continues when condoms are delivered to primary distribution sites (and from primary to secondary sites), and includes the reporting stage. M&E of female and male condoms was reported to face several shortcomings: incorrect reporting of distribution data, missed reporting deadlines, lack of dedicated condom logistics staff at district level, and lack of reporting systems for non-public sector sites.

Demand creation for female condoms

Key informants described that communications strategies for FCs were being developed, and that some of these were integrated into HIV prevention and reproductive health campaigns as well as campaigns for TB prevention, breastfeeding, and voluntary medical male circumcision. Awareness campaigns and road shows were the most frequently noted methods of promoting condoms in communities. Health calendar events (e.g., National STI Week and Women’s Day), local electronic and print media, booths set up in shopping centres, street-based marketing, and door-to-door visits at people’s homes, tertiary institutions and workplaces are often used to promote FCs, with NGOs playing a vital role in this initiative.

Key informants suggested a number of other strategies to improve marketing of the FC: (1) more advocacy; (2) use of social media such as Facebook and Twitter; (3) tailoring strategies for rural and urban populations; and (4) targeting key populations, partners and older people who influence decision-making.

It’s where the aunties [are] so, I know that sometimes we spend time targeting the end user. We should be targeting the influencer- […] -to the same extent that we target the end user, and also the males themselves.

(NGO-Gauteng Province)

Harnessing the power of popular high-profile influencers was also seen as a potentially effective strategy for promoting FCs.

It needs to start from up, right on top in, in, in government you know, filtering down to maybe people that, that identify with certain uhm, maybe actors or you know, celebrities, sport celebrities that can promote female condoms.

(District DoH KwaZulu-Natal Province)

Messaging centred on pleasure and women’s empowerment, rather than on HIV prevention, which is often stigmatised, was seen as fundamental. Many key informants described men as dominant decision-makers due to culture, tradition and women’s economic dependence on men, although this is changing.

But the main issue I think it’s uh, social and also cultural. Social in terms of a lot of women don’t work and it’s so difficult for them to force their partners who are breadwinners to use condoms. [….] And also in terms of cultural issues of the submissiveness […] of women.

(NDoH-Gauteng Province)

[O] riginally they, men would never accept especially with, when somebody is married. […] would never, what you call, your husband would never accept a married woman to ha, come with a condom and to, to say now you cannot sleep with me without a condom but at least now I think things are coming, are becoming better because of the awarenesses that […] are there.

(Provincial DoH-Eastern Cape Province)

To improve female condom programming at the provincial and district level, key informants suggested the need for dedicated transport for condom distribution, availability of cheaper FCs, more staff allocated to the condom programme, and peer educators to distribute condoms and provide education about the FC.

constraints that limit our female condom programme. […] current distribution technique and the fact that you know we don’t have anybody responsible […] solely for, for like the logistics management of condom distribution.

(District DoH-KwaZulu-Natal Province)

Provider experience in the female condom programme

Of the 278 providers interviewed, 75% were nurses; the remainder were health promoters, peer educators, community health workers and individuals in a wide range of specialist roles, including a pharmacy assistant, social worker, practice manager and student counsellor (Table 5).

Table 5. Provider job roles.

Job role % (n), N = 278
Registered nurse 59.0 (164)
Enrolled nurse 6.5 (18)
Enrolled assistant 7.2 (20)
Lay counsellor 13.7 (38)
Peer educator 4.3 (12)
Health promoter 2.9 (8)
Community Health Worker 1.8 (5)
Project Coordinator 1.4 (4)
Volunteer 0.4 (1)
Courtesy Manager 0.4 (1)
HCT Mentor/condom champion 0.4 (1)
Intervention Facilitator 0.4 (1)
Pharmacist assistant 0.4 (1)
Social worker / STI / HIV 0.4 (1)
Practice Manager 0.4 (1)
Student counsellor/Support officer 0.4 (1)
Nursing Assistant 0.4 (1)

Almost all (89%) of the providers were women. More providers had been trained and had counselled clients on MCs than on FCs (Table 6). Having ever been trained was associated with having counselled female clients on the FC in the last month (X2 = 6.78, p = 0.0009). The three most common reasons for not always demonstrating condom use (FC or MC) were that clients could “read the instructions”, lack of time during the consultation, and lack of demonstration models.

Table 6. Provider training, counselling and distribution practices by condom type.

FC % (n = 278) MC % (n = 278)
Ever trained 65.5 (182) 79.1 (220)
Requires more training 82.0 (214) 69.4 (186)
Provider discussed condom use with female clients in one-to-one session in last month
Never 11.4 (31) 3.7 (10)
Less than half 11.4 (31) 5.9 (16)
Half the time 6.5 (18) 3.7 (10)
More than half 7.8 (21) 7.0 (19)
Almost all 39.5 (107) 61.0 (166)
Depended on the client* 23.2 (63) 18.8 (51)
Provider personally gave condoms to clients in last week
Female client 47.4 (129) 70.0 (191)
Male client 29.6 (83) 79.6 (218)
Provider demonstrates condom use to new users
Always 73.0 (203) 80.5 (224)
Sometimes 12.6 (35) 10.4 (29)
Not unless asked by client 5.4 (15) 4.7 (13)
Never 8.9 (25) 2.5 (7)
Reason for not always demonstrating condoms N = 73 N = 49
No demonstration model 24.0 (18) 10.2 (5)
No time to demonstrate 20.0 (15) 22.4 (11)
Client can read the instructions 17.3 (13) 40.8 (20)
Not trained 14.6 (11) 0 (0)
Don’t normally counsel on condoms or refer 13.3 (10) 4.0 (2)
Clients know how to use/no need to tell them 4.0 (3) 22.4 (11)
Clients don’t like them/not interested 6.6 (5) 0 (0)
Other 10.7 (8) 0 (0)

*This was a verbatim response that may reflect providers’ reluctance to discuss condoms with particular types of clients, e.g., older or married women.

When the FC was available in every consultation room, half (51.2%) of providers indicated they had given FCs to clients in the past week, while less than a fifth (18.4%) only did so when FCs were available in some or none of the consultation rooms; this difference was significant (X2 = 8.04, p = 0.005). Cadre of staff was the strongest correlate of FC distribution to men, with 43.7% of lay counsellors/peer educators reporting FC distribution in the last week compared with 25.9% of nurses (X2 = 7.8, p = 0.005).
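The associations above are Pearson chi-square comparisons of two groups on a binary outcome, computed from a 2×2 table of counts. The sketch below shows the calculation; the cell counts are illustrative placeholders, not the study’s raw data.

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    given as [[a, b], [c, d]] of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            # expected count under independence of rows and columns
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# Illustrative only: rows = lay counsellors/peer educators vs nurses,
# columns = gave a FC to a male client in the last week (yes / no).
print(round(chi_square_2x2([[22, 28], [54, 154]]), 2))  # → 6.31
```

With 1 degree of freedom, a statistic of this size corresponds to p < 0.05, the pattern of the comparisons reported in the text.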

The frequency of educating clients about the MC and FC at the sites was similar because the talks usually included both condoms. The number of talks carried out per site varied considerably, with 10.2% of sites reporting no condom education in the last three months and 42.0% reporting at least daily condom education talks to clients.

As shown in Table 7, most providers held positive attitudes about the ability of FCs to provide better protection against both pregnancy (76.2%) and STIs/HIV (74.7%) compared to the MC. However, some providers expressed unfavourable attitudes toward the product itself, with nearly two-fifths (37.7%) strongly or somewhat agreeing that FCs were weird and some strongly or somewhat agreeing they were unappealing since part of the FC hung outside of the vagina (32.7%) or were messy (15.1%). Up to 25% of providers indicated “don’t know” to many of the attitudinal statements.

Table 7. Providers’ attitudes about the female condom.

Item (N = 278) Strongly Agree % (n) Somewhat Agree % (n) Somewhat Disagree % (n) Strongly Disagree % (n) Don’t Know % (n)
Female condoms make sex better for women. 51.8 (144) 18.7 (52) 6.5 (18) 7.2 (20) 15.8 (44)
Female condoms feel more natural than regular male condoms. 47.1 (131) 19.1 (53) 10.4 (29) 7.6 (21) 15.8 (44)
Female condoms make sex last longer. 35.3 (98) 18.7 (52) 12.2 (34) 8.6 (24) 25.2 (70)
Female condoms are better than male condoms. 47.8 (133) 16.9 (47) 11.5 (32) 11.9 (33) 11.9 (33)
Female condoms are weird. 24.1 (67) 13.7 (38) 21.9 (61) 36.3 (101) 4.0 (11)
Female condoms are inconvenient. 15.5 (43) 12.6 (35) 20.9 (58) 48.2 (134) 2.9 (8)
Female condoms are messy. 10.1 (28) 5.0 (14) 20.5 (57) 58.3 (162) 6.1 (17)
Having part of the female condom outside the vagina is unappealing/turn-off. 17.6 (49) 15.1 (42) 19.4 (54) 41.4 (115) 6.5 (18)
Female condoms offer better protection against unintended pregnancy than male condoms do. 65.1 (181) 11.6 (31) 10.1 (28) 11.5 (32) 2.2 (6)
Female condoms offer better protection against sexually transmitted diseases than male condoms do. 62.6 (174) 12.2 (34) 13.0 (36) 10.4 (29) 1.8 (5)
Female condoms are stronger than male condoms. 65.1 (181) 10.8 (30) 6.1 (17) 10.4 (29) 7.6 (21)
The female condom takes too long to put in. 20.5 (57) 13.7 (38) 19.8 (55) 39.6 (110) 6.5 (18)
It is hard to carry female condoms in a purse because of their size. 15.8 (44) 10.8 (30) 18.4 (51) 53.2 (148) 1.8 (5)
Female condoms put the woman in charge. 74.8 (208) 10.8 (30) 2.9 (8) 8.3 (23) 3.2 (9)
The female condom provides women another contraceptive choice. 94.6 (263) 3.6 (10) 1.1 (3) 0 (0) 0.7 (2)
The female condom provides women another choice to protect themselves against HIV and other sexually transmitted diseases. 97.8 (272) 2.2 (6) 0 (0) 0 (0) 0 (0)
Sex doesn’t feel as good when you use a female condom. 5.4 (15) 5.8 (16) 20.1 (56) 48.6 (135) 20.1 (56)
Female condoms make it hard for a woman to have an orgasm (cum). 2.9 (8) 3.6 (10) 22.3 (62) 49.6 (138) 21.6 (60)
Female condoms make it hard for a man to have an orgasm (cum). 1.4 (4) 3.6 (10) 23.7 (66) 50.7 (141) 20.5 (57)
Female condoms take all the fun out of sex. 2.9 (8) 5.4 (15) 22.7 (63) 58.3 (162) 10.8 (30)
You don’t like the idea or thought of putting the female condom inside yourself. 13.3 (36) 9.2 (25) 17.3 (47) 53.5 (145) 6.6 (18)
You don’t like the idea or thought of having to touch yourself to put the female condom in. 13.3 (36) 5.5 (15) 19.6 (53) 56.1 (152) 5.5 (15)
You don’t like the idea or thought of having to use your finger to put the female condom in. 10.3 (28) 6.6 (18) 19.2 (52) 58.7 (159) 5.2 (14)
If a woman wants to use a female condom, her partner might think she was having sex with someone else. 16.9 (47) 16.2 (45) 15.8 (44) 47.8 (133) 3.2 (9)

Providers’ technical knowledge of FCs varied. Most were aware that the FC should not be reused (84.5%) and knew the correct removal technique (84.5%); however, there was some confusion about lubricant use with FCs, with 19% believing that any type of lubricant could be used. Ninety percent of providers reported that clients were informed verbally about the availability of FCs and MCs, and most indicated a lack of IEC materials. Almost all providers (96%) in sites that had experience with more than one FC product thought it was important to increase the choice of FCs for clients.

Client experiences with the FC

Of the 4442 anonymous client surveys, the majority (87%) were completed by women. The age range was 18–57 years, and the mean age was similar for women (29.3 years, SD = 7.8) and men (29.9 years, SD = 8.2). Similar proportions of women (84.5%) and men (78.5%) had ever heard of FCs, but fewer had ever used them (Table 8). There was no difference between male and female clients in reported ever use of the FC. However, among clients with current partners, women were about twice as likely to indicate that they always used a FC with this partner. Ever use varied widely between provinces (Fig 1) and was lowest (5.5%) in women under 20 years.
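The male/female comparisons that follow in Table 8 contrast two independent proportions; a pooled two-proportion z-test is one standard way to test such a difference. The counts below are illustrative, not the table’s exact denominators.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two independent
    proportions, using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value via the standard normal CDF (math.erf is stdlib)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative: 84.5% of 1000 women vs 78.5% of 200 men 'ever heard of FC'
z, p = two_proportion_z(845, 1000, 157, 200)
```

For proportions and sample sizes of this order, the test yields p < 0.05, consistent with the significance levels reported in Table 8.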

Table 8. Clients’ knowledge and usage of FC in South Africa.

Knowledge and use of FC Women (n = 3821)a % n Men (n = 577)a % n p-value
Ever heard of FC 84.5 (3148) 78.5 (453) 0.028
Knew of FC availability at siteb 72.8 (2293) 63.3 (287) <0.001
Ever offered by any providerc 52.0 (1489) 35.9 (139) <0.001
Ever used FC 18.6 (587) 19.4 (88) 0.74
If ever used, use with current partnerd N = 522 N = 83 0.023
 Never 10.5 (55) 13.3 (11)
 Sometimes 45.4 (237) 54.2 (45)
 Often 19.2 (100) 10.8 (9)
 Always 24.9 (130) 12.0 (10)

a A small number (<1.0%) of the 4442 survey participants did not respond to the “male/female” question.

b Asked only of those who had ever heard of the FC.

c Asked only of those who had heard of FC and who knew it was available at the site.

d Asked only of those who responded that they had a current partner.

Fig 1. ‘Ever heard of’ FC, ‘ever offered’ FC and ‘ever used’ FC, by province.

Fig 1

EC = Eastern Cape, FS = Free State, GP = Gauteng Province, LP = Limpopo, MP = Mpumalanga, NC = Northern Cape, NW = North West, WC = Western Cape.

Table 8 shows client knowledge of FCs. Women were more likely than men to know that FCs were available at the clinic (X2 = 19.6, p<0.001) and to have been offered a FC by a provider (X2 = 35.1, p<0.001). Approximately half of female clients but fewer male clients had ever been offered a FC by a provider. Although FCs were in stock in almost all sites evaluated, not all clients were aware of this.

Men and women who had never tried the FC gave the same main reasons, including using another contraceptive, being frightened to try it, and not knowing where to get them (Fig 2).

Fig 2. Female and male clients’ reasons for not using FCs.

Fig 2

Four hundred and twenty-seven women, all current or former FC users, completed an exit interview. The mean age was 31.5 years (range 18–49, SD = 7.5), with only 2.8% under 20 years. Slightly more than two-fifths (42%) were HIV-positive, and one-fifth (20.8%) reported having had an STI in the last year. Main reasons for first FC use, where the first FC was obtained, and other experiences in the programme are shown in Table 9.

Table 9. Female clients’ experiences with FC use.

Experience % (n = 427)
Main reasons for first FC use
 to protect against HIV/STIs 39.4 (166)
 to protect against pregnancy 40.9 (172)
 just wanted to try one 28.0 (117)
Source of obtaining first FC
 Providers 76.3 (326)
 Condom dispenser 17.8 (76)
 Friend 3.3 (14)
 Partner 2.1 (9)
 Can’t remember 0.5 (2)
Mode of receiving the first FC if received from a provider (n = 326)
 Offered by provider 65.6 (214)
 Requested by client 30.9 (101)
 Could not remember 2.8 (9)
Provider explained how to use first FC 98.4 (420)
Had enough information given for first FC use 74.9 (316)
If offered choice what condom would you use
 Choose the FC 67.5 (288)
 Choose the MC 20.8 (89)
 Like both FC/MC equally 10.3 (44)
 Don’t like either MC/FC 0.7 (3)
 Not sure 0.7 (3)

*Question not asked of those who obtained from dispenser or other source.

Of the current FC users, 73.4% reported using male condoms more often than before they started using FCs. Almost all (98.4%) female clients reported that, when they first received a FC from a provider, the provider had explained how to use it. Most clients reported that they were shown a FC, but only a third (32.8%) were given a demonstration of insertion on a model. Two-thirds of women (67.2%) received advice from providers on how to introduce the FC into their relationship. Almost all (96%, n = 51) clients interviewed in sites with more than one FC product available gave positive feedback on the wider selection of FC products.

Discussion

The SA FC programme is well-established and embedded in the healthcare system; systems for MC and FC exist for ordering, reporting, provider training and health education talks. In the early years of the SA programme, FCs were primarily distributed from provider consulting rooms to ensure that new users were given counselling on use and because of concerns about limited stock [13]. This mode of distribution is now shifting as FCs are also being offered in non-clinic settings.

Data highlight the role of providers as gatekeepers to FC access in the public and non-public sectors; however, many clients were unaware of FC availability in their facilities. Providers held positive attitudes about the FC, e.g., believing that it provided better protection against unintended pregnancy and HIV/STIs than male condoms, was stronger than the male condom, and made sex better for women. However, they disliked some product features, e.g., its messiness and the part that hangs outside the vagina. It is not surprising that 15% of providers were unaware that FCs could not be re-used: at the time of this evaluation, the polyurethane FC1 was being replaced by FC2, which is made of synthetic latex. Research on the polyurethane FC1 indicated that it could be reused several times after washing and drying, but this is not possible with the synthetic-latex FC2 [24]. A small number of providers in this evaluation may not have realized that the two FCs differ in terms of re-use.

Provider training was linked to counselling and distribution and a high proportion of providers expressed a need for more training, perhaps linked to the increased availability of FCs at sites since 2014. A need for more provider FC training has been noted in other studies in the region [25, 26].

FC users under the age of 20 years were poorly represented in the FC evaluation in both the anonymous survey and the client interviews, in agreement with national surveys in which users in the 15–19 age group reported the lowest percentage of ever FC use [27–29]. This points to a need to focus FC promotion on younger clients. Client reasons for not using FCs were similar across all data-collection methodologies. A high proportion of FC users were HIV-positive, indicating successful promotion to this group of women.

South Africa’s FC programme is the only example globally of a fully integrated national MC/FC condom programme that does not focus on a target risk group, and a rare example of a programme primarily funded by government [11–13]. FC evaluations in other countries have focused on specific target groups or geographic locations [30–32], and unsuccessful programmes were found to have introduced the FC in an uncoordinated fashion without programmatic support [31]. A global review of FC programmes has called for routine and intensive monitoring to better inform outcome data [7]. The South African programme has adopted this recommendation and commenced monitoring of FC distribution as part of the DHIS in 2013.

Despite availability, distribution levels of the FC relative to the MC are still low in South Africa, reflecting FC uptake worldwide [4, 5], which is attributed to higher cost, lack of male acceptance, and difficulties in use [4, 33–35]. The literature stresses that, even for female-initiated methods, male involvement is key for successful programming [33].

The results of this evaluation indicate that the condom programme has fulfilled all the recommended steps of the UNFPA 10-Step Strategic Approach to scale up Comprehensive Condom Programming (CCP) [36]. This ten-step guide includes “the need to develop a comprehensive and integrated national strategy for male and female condoms”. The male and female condom programmes are integrated in their distribution mechanisms, both condoms are required to be available at all public-sector health facilities, and South Africa has a National Condom Distribution Plan [23]. Another key step is to monitor programme implementation and conduct research: all facilities have distribution targets and are required to report the numbers of male and female condoms distributed, and the evaluation findings presented here have been used to inform the National Department of Health. Two of the steps relate to budget: developing a multi-year operational budget and mobilizing financial resources. Both have been completed; a budget is set aside for condom procurement each year, with a target number required for procurement, as noted in the HIV and AIDS and STI National Strategic Plan for South Africa, 2017–2022 [16]. Some of UNFPA’s recommended steps could be strengthened in South Africa, including FC promotion, advocacy and engaging the media. Key recommendations for the South African and other FC programmes include the following [37]:

  • Trained providers cannot be the only source of FC distribution; there is a need to make FC supplies more accessible, particularly for experienced users.

  • Although FCs are widely available in public health facilities, this level of health care is predominantly utilised by women. Programmes need to ensure that FCs are more widely available to men.

  • FC distribution should be closely monitored to ensure consistency of supplies and preclude stock-outs.

  • Population-based national surveys should include the FC as a method disaggregated from the MC in order to monitor data on users’ knowledge and use.

  • Providers’ negative attitudes and lack of technical counselling skills regarding the FC need to be addressed to improve uptake.

  • The current branding of the FC holds little appeal for men generally, including men who have sex with men, and should be revisited.

  • Awareness-raising and marketing must be a priority to build demand.

Conclusion

Findings from this evaluation provide solid support for further programme expansion in South Africa. As demand for FCs has increased, the NSP 2017–2022 includes increased targets for condom distribution (3 billion MCs and 33 million FCs). Public-sector facilities have now been given distribution targets for both MCs and FCs (personal communication, National Department of Health), and this evaluation can therefore be used to consider and address the realities of system, provider and client concerns. Years of limited distribution during the phased expansion of the programme may have conveyed to both providers and clients that FCs are not available at all sites, and that providers do not need to stock, promote and offer the product. The evaluation was conducted 20 years after the first sites had started distributing FCs, and this was reflected in the range of years each site had participated in the programme. The final phase of expansion, around 2012–2014, meant that some sites had been distributing for 2 years or less, whereas others had more than 10 years of experience. Similarly, the phased introduction of new FC brands was evident during the FC evaluation, with some provinces not yet having started this component. Currently, all public-sector sites are distributing FCs. Sites with less experience in condom programming can learn from this evaluation. With the phased introduction of the donor-funded pre-exposure prophylaxis (PrEP) programme into South African health facilities, there are opportunities to apply lessons learned from the evaluation of the national FC programme.

Supporting information

S1 File

(PDF)

S2 File

(PDF)

S3 File

(PDF)

S4 File

(PDF)

S5 File

(PDF)

Acknowledgments

We thank the South African Department of Health-national, provincial, district and participating facilities for support and assistance with project planning and logistics. We also want to thank all participants for their time and contribution, and the non-public sector sites for their willingness to take part in our evaluation.

Data Availability

All relevant data are within the manuscript and its Supporting Information files.

Funding Statement

This study was funded by the United States Agency for International Development (USAID) under AID-OAA-A-13-00069 (Mags Beksinska, PhD, and Jennifer A. Smit, PhD, Co-Principal Investigators). Joanne Mantell, MS, MSPH, PhD, was also supported by a center grant from the National Institute of Mental Health (P30-MH43520; Principal Investigator: Robert H. Remien, PhD). The content of this article is solely the responsibility of the authors and does not necessarily represent the official views of USAID or the HIV Center for Clinical and Behavioral Studies at the New York State Psychiatric Institute and Columbia University, Department of Psychiatry.

References

  • 1.Warren M. Condoms: the multipurpose prevention technologies that already exist. BJOG 2014; 121 (Suppl. 5): 9–11. [DOI] [PubMed] [Google Scholar]
  • 2.World Health Organization Department of Reproductive Health and Research (WHO/RHR), Johns Hopkins Bloomberg School of Public Health/Center for Communication Programs (CCP). Family Planning: A Global Handbook for Providers (2018 update) [Internet]. Baltimore and Geneva: CCP and WHO; 2018 [cited 2019 Jun 20]. 442 p. http://apps.who.int/iris/bitstream/10665/260156/1/9780999203705-eng.pdf?ua=1
  • 3.Wiyeh AB, Mome RKB, Mahasha PW, Kongnyuy EJ, Wiysonge CS. Effectiveness of the female condom in preventing HIV and sexually transmitted infections: a systematic review and meta-analysis. BMC Public Health. 2020;20(1):319 Published 2020 Mar 12. 10.1186/s12889-020-8384-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 4.Peters A, Jansen W, van Driel F. The female condom: the international denial of a strong potential. Reproductive Health Matters 2010;18 (35):119–128. 10.1016/S0968-8080(10)35499-1 [DOI] [PubMed] [Google Scholar]
  • 5.Reproductive Health Supplies Coalition. Caucus on new and underused reproductive health technologies. Product brief: female condom. Reproductive Health Supplies Coalition; 2013. [Online]. [Cited: 19th Dec 2018]. URL: https://www.rhsupplies.org/fileadmin/uploads/rhsc/Working_Groups/New_Underused_RH_Technologies_Caucus/Documents/Technical_Briefs/rhsc-brief-female-condom_A4.pdf
  • 6.United Nations Population Fund. HIV prevention gains momentum. New York: UNFPA, 2011. 978-0-89714-933-4.
  • 7.Center for Health and Gender Equity. Female condoms and US foreign assistance: an unfinished imperative for women’s health. Washington, DC; 2011. [Google Scholar]
  • 8.Peterson K, Herman L, Marseille E, et al. Smarter Programming of the Female Condom: Increasing its Impact on HIV prevention in the Developing World. FSG Social Impact Advisors. [Online] 2008. [Cited: April 16, 2019.] URL: www.fsg-impact.org/app/content/ideas/item/Female_Condom_Impact.html [Google Scholar]
  • 9.UNFPA Prequalified Female Condom Products. https://www.unfpaprocurement.org/documents/10157/37547/UNFPA+Female+Condom+Prequalification+List/05feba45-4893-474a-81d4-7b61e4f68ae7 accessed 17th January 2019
  • 10.Beksinska M., Wong R., Smit J. Male and female condoms: Their key role in pregnancy and STI/HIV prevention. Best Practice & Research: Clinical Obstetrics & Gynaecology. 2019. December 14 10.1016/j.bpobgyn.2019.12.001 [Epub ahead of print]. [DOI] [PubMed] [Google Scholar]
  • 11.Mqhayi M, Beksinska M, Smit J, et al. Introduction of the female condom in SA: Program activities and performances 1998–2000. Durban: RHRU, FHI, NDoH; 2003.
  • 12.Mantell JE, Scheepers E, Abdool Karim QA. Introducing the female condom through the public health sector: experiences from SA. AIDS Care 2000; 12 (5): 589–601. 10.1080/095401200750003770 [DOI] [PubMed] [Google Scholar]
  • 13.Beksinska M, Smit J, Mantell J. Progress and challenges in male and female condom use in SA. Sexual Health 2012; 9(1):51–58. 10.1071/SH11011 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14.Ashmore J, Henwood R. Choice or no choice? The need for better branded public sector condoms in South Africa. South African Journal of HIV Medicine 2015; 16 (1), Art. #353, 3 pages. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 15.Department of Health. HIV and AIDS and STI National Strategic Plan for South Africa, 2012–2016. Pretoria: South Africa Department of Health; 2011. [Google Scholar]
  • 16.Department of Health. Let Our Actions Count: South Africa’s National Strategic Plan for HIV, TB and STIs 2017–2022. Pretoria: South Africa Department of Health; 2017. https://sanac.org.za/wp-content/uploads/2018/09/NSP_FullDocument_FINAL.pdf
  • 17.Shisana O, Rehle T, Simbayi LC, et al. South African National HIV Prevalence, Incidence, Behaviour and Communication Survey, 2008: a turning tide among teenagers?. Cape Town: HSRC Press; 2009. [Google Scholar]
  • 18.Haffejee F, Maharajh R. Addressing Female Condom Use among Women in South Africa: A Review of the Literature. International Journal of Sexual Health 10.1080/19317611.2019.1643813 [DOI] [PubMed] [Google Scholar]
  • 19.Bamford C, Brink A, Govender N, et al. Part V.GARP: Surveillance activities. SA Medical Journal 2011. July;101(8):579–582. [PubMed] [Google Scholar]
  • 20.Creswell JW, Klassen AC, Plano Clark VL, Smith, KC. Best practices for mixed methods research in the health sciences. Office of Behavioral and Social Sciences Research (OBSSR). URL: https://obssr.od.nih.gov/training/online-training-resources/mixed-methods-research/ Accessed 30 April 2020.
  • 21.Neilands TB and Choi K. A validation and reduced form of the female condom attitudes scale. AIDS Educ Prev 2002;14(2):158–71 10.1521/aeap.14.2.158.23903 [DOI] [PubMed] [Google Scholar]
  • 22.Lince-Deroche N, Zulu B, Roseleur J, et al. The Health Service Cost of Offering Female Condoms in SA’s National Female Condom Program 2015/16. Johannesburg: HERO Policy Brief Number 17, Health Economics and Epidemiology Research Office; 2017. http://www.heroza.org/wp-content/uploads/2017/10/Policy-Brief-17-RSA-female-condom-program-costing-Sept-2017.pdf Accessed 4th June 2018.
  • 23.National Condom Distribution Plan: A Guide for District-Level Managers, Health Care Practitioners and Implementers. 2013–2016. Dept of Health SA.
  • 24.Beksinska ME, Rees VH, Dickson-Tetteh KE, Mqoqi N, Kleinschmidt I, McIntyre JA. Structural integrity of the female condom after multiple uses, washing, drying and re-lubrication. Contraception. 2001;63(1):33–36. 10.1016/s0010-7824(00)00192-x [DOI] [PubMed] [Google Scholar]
  • 25.The Female Condom in Uganda A Situation Analysis. Ministry of Health Kampala Uganda February 2009. library.health.go.ug/download/file/fid/718
  • 26.Holt K, Blanchard K, Chipato T, et al. A nationally representative survey of healthcare provider counselling and provision of the female condom in SA and Zimbabwe. BMJ Open 2013; 3: e002208 10.1136/bmjopen-2012-002208 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27.National Department of Health (NDoH), Statistics SA (Stats SA), South African Medical Research Council (SAMRC), and ICF. 2017. SA Demographic and Health Survey 2016: Key Indicators. Pretoria, SA, and Rockville, Maryland, USA: NDoH, Stats SA, SAMRC, and ICF.
  • 28.Shisana O, Rehle T, Simbayi LC, et al. SA national HIV prevalence, incidence, behaviour and communication survey 2008: a turning tide among teenagers? Cape Town: Human Science Research Council Press; 2009. [Google Scholar]
  • 29.Guerra FM, Simbayi LC. Prevalence of knowledge and use of the female condom in SA. AIDS & Behaviour 2014. January; 18 (1):146–158. [DOI] [PubMed] [Google Scholar]
  • 30.Kalckmann S, Farias N, Carvalheiro J. Evaluation of continuity of use of female condoms among users of the Brazilian National Health System (SUS): longitudinal analysis in units in the metropolitan region of São Paulo, Brazil. Revista Brasileira de Epidemiologia 2009; 12(2), 132–143. URL: 10.1590/S1415-790X2009000200004 [DOI] [Google Scholar]
  • 31.Warren M. & Philpott A. Expanding Safer Sex Options. Introducing the Female Condom into National Programmes. Reproductive Health Matters 2003;11(21):130–9. 10.1016/s0968-8080(03)02178-5 [DOI] [PubMed] [Google Scholar]
  • 32.Bruinderink, MG, Janssens W. A randomized impact evaluation of a female condom programme in Mozambique—Results. Amsterdam Institute for Global Health and Development. March 2017. URL: https://www.aighd.org/wp-content/uploads/2017/09/ResearchBrief2of2_FemaleCondoms_120917.pdf Accessed April 20th 2018.
  • 33.Mantell JE, Dworkin S, Exner et al. The promises and limitations of female-initiated methods of HIV/STI protection. Social Science & Medicine 2006;63(8):1998–2009 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 34.Schuyler AC, Masvawure TB, Smit JA, et al. Building young women’s knowledge and skills in female condom use: lessons learned from a SA intervention. Health Education Research 2016. April; 31(2): 260–272. Published online 2016 Mar 8. 10.1093/her/cyw001 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 35.Moore E, Beksinska M, Festin M, et al. Knowledge, Attitudes, Practices and Behaviors Associated with Female Condoms in Developing Countries: A Scoping Review. Open Access Journal of Contraception 2015:6 125–142 10.2147/OAJC.S55041 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 36.Comprehensive Condom Programming: A guide for resource mobilization and country programming. URL: http://www.unfpa.org/publication/comprehensive-condom-programming
  • 37.Beksinska M, Nkosi P, Mabude Z, et al. Twenty years of the female condom program in SA: past, present and future. SA Health Review, 2017. Chapter 14 In: Padarath A, Barron P, editors. SA Health Review 2017. Durban: Health Systems Trust; 2017. URL: http://www.hst.org.za/publications/south-african-health-review-2017. Pg 147. [Google Scholar]

Decision Letter 0

Collins Iwuji

3 Dec 2019

PONE-D-19-26312

Global lessons from the evaluation of the South African National Female Condom Program

PLOS ONE

Dear Prof Beksinska,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

We would appreciate receiving your revised manuscript by Jan 17 2020 11:59PM. When you are ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter.

To enhance the reproducibility of your results, we recommend that if applicable you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). This letter should be uploaded as a separate file and labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. This file should be uploaded as a separate file and labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. This file should be uploaded as a separate file and labeled 'Manuscript'.

Please note while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out.

We look forward to receiving your revised manuscript.

Kind regards,

Collins Iwuji, MBBS, MSc, MD

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

http://www.journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and http://www.journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

1. Please include a copy of the interview guide used in the study, in both the original language and English, as Supporting Information, or include a citation if it has been published previously.

2. We note that you have included the phrase “data not shown” in your manuscript. Unfortunately, this does not meet our data sharing requirements. PLOS does not permit references to inaccessible data. We require that authors provide all relevant data within the paper, Supporting Information files, or in an acceptable, public repository. Please add a citation to support this phrase or upload the data that corresponds with these findings to a stable repository (such as Figshare or Dryad) and provide any URLs, DOIs, or accession numbers that may be used to access these data. Or, if the data are not a core part of the research being presented in your study, we ask that you remove the phrase that refers to these data.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly

Reviewer #2: Partly

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: No

Reviewer #2: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: No

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Overall, the paper addresses and reports various aspects of the national female condom program in South Africa. The authors have presented comprehensive data and provided valuable insights from a range of government key stakeholders, health providers, and clients from seemingly representative samples from South Africa. However, the paper has many methodological weaknesses, incomplete presentation of the data, and lack of cohesive discussions and writing, which significantly weaken the quality of the manuscript.

Major comments:

1. In the beginning of the Materials and Methods section, I think there should be a section which provides an overview of the FC programs and guidelines in South Africa as background. Specifically, when the national FC program was launched in 2014 and expanded thereafter, were IEC and “demonstration models” supposed to be distributed to all clinic sites (if so, one model for FC demonstration per clinic)? Also, at the clinics, what were the specific guidelines/policies for healthcare providers (HCWs) in terms of FC distribution to clinic clients? Were HCWs supposed to provide FCs to every client who visits the clinics or only to certain clients (i.e. those attending for sexual and reproductive health issues, etc.)? Does the policy specify providing FCs to younger clients (aged 15-19 years) as well? In my opinion, insufficient information was provided by the authors, such that it is difficult to properly assess the paper’s results.

2. Lines 79-85: Although the authors mention that “the on-site assessment sample was selected randomly and proportionally” by the four criteria, the authors do not present any distribution of characteristics of the selected sites regarding these four criteria, and thus readers cannot assess how well this selection was performed or the overall characteristics of these sites. Please add this information in Table 1. Also, given that the data were collected nationally, it seems important to understand the geographical distribution of the sites, at least at the provincial level.

3. Lines 87-91: What was the rationale for choosing tertiary education and social-marketing outlets for the surveys? Can the authors please elaborate on what they mean by “social-marketing outlet”? I think the authors should provide more explanation about distribution of FCs at tertiary education sites, etc., in the introduction (lines 64-67). Also, please provide more explanation about the list of non-public FC-distributing sites, such as the number of available non-public sites: were sites only included if they provided free FCs?

4. Line 101: What is meant by “depending on total staff complement”? I recommend re-writing lines 93 to 98 with more details.

5. Lines 116-118: It is unclear what the authors mean by “the same three months…”. Does this mean that the data were collected for three months at each site? I think there should be a section under Methods describing data collection at the sites in more detail.

6. Table 1: can the authors break down and provide the type of providers (for provider interviews and, if possible, key informants) and the type of organizations included in non-public sector sites (the number of NGOs vs. tertiary education vs. …)? Also please provide information on sites (such as the number and variance) for the sub-sample besides the number of individual participants.

7. Line 133: Can the authors please elaborate in which languages the consents/interviews/self-administered questionnaires were offered? Given that this was done at the national level, I am not sure how many different languages were offered for questionnaires… and there is no explanation about translation of the languages in Methods.

9. Line 147-161: I think this information needs to be separately presented under “overview of national government policies and programs.”, and some of them (i.e. line 156-161) are more like a review of policy implementation process, rather than results. Also, how were these key informant interviews conducted? Were they semi-structured qualitative in-depth interviews? How were these interviews analysed (i.e. using what methods/analysis programs)? In Line 152, please specify and refer to the new FC brands.

10. Lines 163-169: It’s really unclear to which data these results refer. Are these based on the review of DHIS data? If so, please clearly state that.

11. Table 2: Please include p-values to make proper comparison between FC and MC. It’s very unclear whether the percentage in Table 2 presents the availability of FC or MC only OR the availability of any of condom, IEC materials, or demonstration model OR all of them. I strongly suggest to present results for availability of each of condom, IEC materials, and demonstration model separately in Table 2.

-Line 187: “higher” is not clear. What is the overall availability of the FCs vs. MCs?

-Line 188: please report the percentage of “FC leaflet availability”.

12. Line 209: Table 3: Please include p-values as the last column to compare between FCs vs. MCs.

13. Line 203: I think looking at the associations among the different predictors is a completely different research question. To look at the predictors, I strongly believe that adjusted models, such as logistic regression models adjusting for different predictors and characteristics, need to be used. For lines 204-206, if the reasons were asked in the surveys, please include them in Table 3. Also, please state in the Methods whether the provider interviews were done qualitatively or quantitatively (using survey forms). For the question “In last month provider discussed…”, the response options (depending on the client vs. the frequency of providing the one-to-one sessions) are NOT mutually exclusive.

Furthermore, the question on “Demonstrate use to new users” seems confusing- was this referring to any time period or like in last month?

14. Line 212-217: Again, I strongly think that predictors related to provision of FCs need to be investigated in adjusted models including availability of condoms in every consultation room, being ever trained, types of providers, different provinces, experience of stock-outs, etc as potential predictors.

15. Lines 223-228: I would like to see the complete table of providers’ attitudes towards FCs and MCs, at least as a supplement, along with the associated p-values. I think this is really key information, given the authors’ main conclusion that “providers are the gate-keepers” for FC distribution.

16. Lines 226-228: How was the question actually asked of the respondents? I don’t think it would have been asked as “messy” or “weird” in the questionnaire… or was it? How was this concept asked about in the survey? Or is it based on the qualitative interviews/findings? If it’s the latter, I think very explicit methods need to be written regarding how this was conducted and analysed.

17. Lines 237-242: How were these participants included? Was the self-administered survey offered to everyone who was visiting the clinic sites? What were the refusal vs. participation percentages? Also, given that this was done at the national level, I would like to see how these may differ by province, at least. Also, please include more demographic characteristics of clients (at the very least, age groups and HIV status). Please include p-values for differences between women and men. Also, in the discussion, the authors state that “a high proportion of FC users were HIV-positive”… without any demographic characteristics presented in the results, it is not possible for readers to understand these results.

18. Line 250. Table 4. Please rephrase the title of the table, for example, “Clients’ knowledge and usage of FC in South Africa”.

19. The numbers for “Ever used FC” are obviously wrong (currently written as n=88 for women and n=587 for men); please correct this. I also strongly recommend reporting how many were current vs. previous FC users under the variable “Ever used FC”.

So, out of 587+88 = 675 who ever used FCs, 427 (63%) completed the exit surveys? How were these people selected? Were there any demographic or systematic differences between those who completed the exit surveys vs. those who didn’t? What were the reasons for not completing the exit interviews?

20. Lines 257-259: Please include this as part of Table 4 (i.e. “Reasons for not using FCs among non-FC-users*”, then please give a detailed description of this group as a footnote).

21. Line 271: Please carefully go through the table and check whether the percentages match and add up to 100%; for example, for the variable “First FC obtained from”, the percentages only add up to 95%. If other providers or sources were listed/selected, please list them under an “others” category in the table. Also, I would rename “Offered FC or asked for first FC” as “Mode of receiving the first FC at clinics”.

22. Please also include the contents reported in lines 275-280 as part of Table 5 so that readers can understand fully.

23. Lines 315-319: In the first few paragraphs of the Discussion and throughout the Results section, the authors seem to point out gaps in the implementation and delivery of FC programs at national clinics; then, without much supporting data, the authors claim that the evaluation seems to have fulfilled all the steps recommended by the international organization. Could the authors please elaborate in more detail on how the SA program has fulfilled the recommended steps? Also, please add an overall conclusion section at the end (line 331).

Minor comments:

- I think the word “global” lesson can be misleading as the study presents the data from South Africa alone. I suggest to remove the word “global” in this case.

- Line 95: what does IEC stand for? Information, Education, and Communication? Please state at least once before using the acronym.

- Line 284: Please change “Conclusion” to “Discussion”.

- Please check the references for consistency and correct formats.

- There are many grammatical errors and incomplete sentences. The paper needs to be proofread. For example,

Line 49: put -s after “sexually transmitted infection” and introduce acronym here, and thereafter use STI (for example, in Line 79)

Line 57: please spell out WHO/UNFPA when used for the first time

Line 57: need a reference for “others in development”.

Line 67: please put a reference.

Line 79: please replace to “the national STIs sentinel surveillance sites”

Line 80: includes -> include

Line 95: sites/stock outs/expired stock/sub-distribution to other sites; please do not use “/”,

Line 112: Remove “KIs” – I don’t think the authors used this acronym after this

Line 124: detail -> details

Line 110: ever heard of -> being ever heard of or using FC; the sentences need to be re-written.

Line 134: “Results and” -> “Results”

Line 135: Please provide months when the data collection started (in 2014) and ended in 2016.

Line 151: Please remove – after “with”

Table 2: remove “day of telephonic survey” ; remove “condoms” after “Distribution points”; Please put the list of acronym as footnotes under the table. Please put the label as %(n) in the second row.

Line 200: and -> or

Line 304: Please rephrase “… a rare example of one funded primarily by the National Government” to, for example, “a rare example of a program primarily funded by the government.” Please add references.

Line 309-10. “Notable… in 2013” seems out of context or provide insufficient details.

Line 311-313: Difficult to read and awkwardly written. Please revise.

Throughout the paper, the authors keep using the word “variable”. I would recommend diversifying the term.

Reviewer #2: This is an interesting paper evaluating one of the only national female condom provision programmes globally. As such it has the potential to give important programmatic insights. However, there are issues with the way the methods are described, and the current structure of the results means there is a lack of clarity in the findings. This paper requires substantial revision prior to being considered for publication.

1. INTRODUCTION: a short paragraph introducing FC more broadly (efficacy, effectiveness, global uptake, acceptability) would provide important context.

2. METHODS:

i. This is described as a mixed methods evaluation comprising surveys, interviews and on site assessments. However, it appears that the interviews are structured (rather than qualitative interviews). It is important to be aware that mixed methods research pertains to the combination of QUANTITATIVE AND QUALITATIVE data in a study.

ii. Could the authors clarify exactly what types of data (quant and qual) they collected?

iii. If it is just quantitative data collected then this is NOT a mixed methods evaluation and the methods need to be amended to reflect this.

iv. If the interviews (key informant, provider and client) are qualitative then please (a) describe these components in detail (b) give a justification for why a mixed methods approach was used (c) describe the mixed methods model you are using (i.e. convergent parallel design) and (d) discuss how quant and qual data were integrated (which needs to be done in mixed methods analysis). I suggest the authors look at O’Cathain, A., Murphy, E., and Nicholl, J. (2008) ‘The quality of mixed methods studies in health services research’, Journal of Health Services Research and Policy, vol. 13, no. 2, pp. 92-98.

v. Data sources are currently unclear - I suggest listing each sample (e.g. nationwide sites), sub-samples of sites, etc. (and then for each listing exactly what data collection methods were used). How were key informants selected?

vi. More detail is required on: surveys (what was asked, standardisation, any validated tools e.g. attitudinal questions), interviews (were they quantitative or qualitative), on-site assessments (what did this entail, who conducted them), how surveys were administered, and who conducted the interviews.

vii. More details on the DHIS data source

3. Data analysis: if qualitative data were analysed how were these analysed, how were qual and quant data integrated, more details on stats (key variables, key outcomes of interest, you present p values - state you are using chi squared tests - wondering why you did not consider multivariable logistic models)

4. Line 134: incomplete subheading

5. Results need to be restructured for clarity, as it is currently hard to know what the sources of data are - start with a broad overview from DHIS and the national telephone survey, then KI and provider data, and then client data. If you have qualitative data, consider using quotes to support your results.

6. Tables: present p-values for comparisons please

7. You cannot assess predictors as this is a cross sectional design - you are looking at associations.

8. I am struck by the fact that 15% of providers did NOT know that FCs are not reusable - these findings need to be brought out more.

9. Line 240 fewer rather than less

10.Line 259 "where"

11. I was unsure what the exit interviews were

12. Conclusions: strengths and weaknesses of this evaluation, recommendations for further work, highlight the important gaps in knowledge and attitudes towards FC - how can this be addressed.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: Yes: Shema Tariq

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files to be viewed.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email us at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2020 Aug 13;15(8):e0236984. doi: 10.1371/journal.pone.0236984.r002

Author response to Decision Letter 0


1 Jun 2020

Editor

PloS One

We appreciate the thoughtful comments of the two reviewers, and below we respond point-by-point in bold font to each comment or query. Reviewers’ comments were also a catalyst for substantial editing and reorganization. We have uploaded both highlighted and clean versions of the manuscript. References to lines refer to those in the clean version of the manuscript.

1. Please include a copy of the interview guide used in the study, in both the original language and English, as Supporting Information, or include a citation if it has been published previously.

We have included a copy of our Interview Guide for Key Informants as Supplementary Material. This guide has not been published.

2. We note that you have included the phrase “data not shown” in your manuscript. Unfortunately, this does not meet our data sharing requirements. PLOS does not permit references to inaccessible data. We require that authors provide all relevant data within the paper, Supporting Information files, or in an acceptable, public repository. Please add a citation to support this phrase or upload the data that corresponds with these findings to a stable repository (such as Figshare or Dryad) and provide any URLs, DOIs, or accession numbers that may be used to access these data. Or, if the data are not a core part of the research being presented in your study, we ask that you remove the phrase that refers to these data.

We have deleted reference to “data not shown” from the text. We have added a figure showing the data.


Reviewers' comments:


Authors’ comment: We have been asked to add a lot of information; the article has doubled in length, as has the number of tables. There were so many changes that it was almost impossible to view them in track changes, so we have highlighted the additional sections in yellow.

Reviewer #1: Overall, the paper addresses and reports various aspects of the national female condom program in South Africa. The authors have presented comprehensive data and provided valuable insights from a range of government key stakeholders, health providers, and clients from seemingly representative samples from South Africa. However, the paper has many methodological weaknesses, incomplete presentation of the data, and lack of cohesive discussions and writing, which significantly weaken the quality of the manuscript.

Major comments

1. In the beginning of the Materials and Methods section, I think there should be a section which provides an overview of the FC programs and guidelines in South Africa as background. Specifically, when the national FC program was launched in 2014 and expanded thereafter, were IEC and “demonstration models” supposed to be distributed to all clinic sites (if so, one model for FC demonstration per clinic)? Also, at the clinics, what were the specific guidelines/policies for healthcare providers (HCWs) in terms of FC distribution to clinic clients? Were HCWs supposed to provide FCs to every client who visits the clinics or only to certain clients (i.e. those attending for sexual and reproductive health issues, etc.)? Does the policy specify providing FCs to younger clients (aged 15-19 years) as well? In my opinion, insufficient information was provided by the authors, such that it is difficult to properly assess the paper’s results.

Given the vast amount of data to synthesize, we were challenged with how much information and data to include in this manuscript. We have now added additional background information at the end of the Introduction (Lines 70-95) rather than at the beginning of the Methods as suggested, as this section starts with a discussion of the FC globally and then provides FC information about South Africa specifically. There were no written guidelines regarding a focus on a particular type of client or key population. The FC pilot programme commenced in 1998, not in 2014, and was phased into a full-scale national programme in 2014. We have updated the history of the FC programme to improve clarity of its development.

2. Lines 79-85: Although the authors mention that “the on-site assessment sample was selected randomly and proportionally” by the four criteria, the authors do not present any distribution of characteristics of the selected sites regarding these four criteria, and thus readers cannot assess how well this selection was performed or the overall characteristics of these sites. Please add this information in Table 1. Also, given that the data were collected nationally, it seems important to understand the geographical distribution of the sites, at least at the provincial level.

We have revised Table 1 (which is a new table) to include an overview of the data collection methods, whether all sites or a sub-set of sites were involved, the key constructs presented in this paper, and the target population. We developed new tables (Tables 2 & 3) to present characteristics of the sites by province.

3. Lines 87-91: What was the rationale for choosing tertiary education and social-marketing outlets for the surveys? Can the authors please elaborate on what they mean by “social-marketing outlet”? I think the authors should provide more explanation about distribution of FCs at tertiary education sites, etc., in the introduction (lines 64-67). Also, please provide more explanation about the list of non-public FC-distributing sites, such as the number of available non-public sites: were sites only included if they provided free FCs?

We selected a tertiary education site because all tertiary education sites in South Africa are targeted for the FC (and MC) programme. The government of South Africa also launched a socially marketed FC (managed by the local affiliate of Population Services International [PSI]), selling FCs branded under the same name as the socially marketed male condoms known as “Lovers Plus”. A “social-marketing outlet” in our study was defined as a retail outlet (store/petrol station, etc.) that sold female condoms. We added more information about both tertiary institutions and the social marketing introduction in the Introduction Section (Lines 88-96) and also in the Selection of Sites Section (Lines 109-143). All FCs are provided free in South Africa (public sector, NGOs, tertiary institutions). The private sector market rarely stocks FCs.

4. Line 101: What is meant by “depending on total staff complement”? I recommend re-writing lines 93 to 98 with more details.

Total staff complement refers to the total number of staff at the facility. We have edited this sentence and added more detail: we interviewed two staff when the total number of staff employed was less than five, and three staff when it was five or more. (Lines 198-199)

5. Lines 116-118: It is unclear what the authors mean by “the same three months…”. Does this mean that the data were collected for three months at each site? I think there should be a section under Methods describing data collection at the sites in more detail.

We have added more details about data collection methods in the Methods Section. We reviewed the DHIS data for the same three months (February through April 2014). (Lines 173-174)

6. Table 1: can the authors break down and provide the type of providers (for provider interviews and, if possible, key informants) and the type of organizations included in non-public sector sites (the number of NGOs vs. tertiary education vs. …)? Also please provide information on sites (such as the number and variance) for the sub-sample besides the number of individual participants.

The majority of providers were nurses (73%), as stated in the text; the rest comprised one or more of 13 other types of positions, and we have included all categories in the new Table 5. We have added key informants to Table 2. As above, we have provided more detail on sites by province but not by individual site, as there are too many of them.

7. Line 133: Can the authors please elaborate in which languages the consents/interviews/self-administered questionnaires were offered? Given that this was done at the national level, I am not sure how many different languages were offered for questionnaires… and there is no explanation about translation of the languages in Methods.

The client exit interview and client anonymous survey were offered to participants in all of South Africa’s 11 languages; similarly, the consent for the client exit interview was offered in all 11 languages. This is noted in the Methods Section. (Lines 146-148)

9. Lines 147-161: I think this information needs to be separately presented under “overview of national government policies and programs”, and some of it (i.e., lines 156-161) reads more like a review of the policy implementation process, rather than results. Also, how were these key informant interviews conducted? Were they semi-structured qualitative in-depth interviews? How were these interviews analysed (i.e., using what methods/analysis programs)? In Line 152, please specify and refer to the new FC brands.

We have presented this information in a new section, Overview of national government policies and programmes. (Lines 288-346)

We have added more information about how these interviews were conducted in the Methods Section. (Lines 155-167)

Analysis of these interviews is described in the Data Analysis Section. (Lines 224-231)

We have mentioned the new FC brands in the Introduction Section. (Lines 79-80)

10. Lines 163-169: It’s really unclear to which data these results refer. Are these based on the review of DHIS data? If so, please clearly state that.

Lines 163-169 do not refer to the DHIS. The DHIS data were only used to verify the distribution figures given for the 3-month site distribution, to see whether they matched what the clinic had documented in its distribution logs. We have added a sub-header to this section to make clear that it reports data from the site assessment. (Lines 169-174)

11. Table 2: Please include p-values to make a proper comparison between FC and MC. It’s very unclear whether the percentages in Table 2 present the availability of FC or MC only, OR the availability of any of condoms, IEC materials, or demonstration models, OR all of them. I strongly suggest presenting the availability of condoms, IEC materials, and demonstration models separately in Table 2.

We have presented the condom availability, IEC and demonstration models separately for the male and female condom in Table 4.

The lower part of the table was inadvertently deleted. We have added the p-values to the table.

12. Line 187: “higher” is not clear. What is the overall availability of the FCs vs. MCs?

Table 4 presents the proportion of each type of distribution point in each facility where condoms are available to clients. We have edited the text to explain that “higher” means available to clients at that distribution point. (Line 378)

-Line 188: please report the percentage of “FC leaflet availability”.

We have added FC leaflet availability in Table 4.

12. Line 209: Table 3: Please include p-values as the last column to compare between FCs vs. MCs.

We added p values in new Table 4 (previously table 3).

13. Line 203: I think looking at the association among the different predictors is a completely different research question. To look at the predictors, I strongly believe that adjusted models, such as logistic regression models adjusting for different predictors and characteristics, need to be used. For lines 204-206, if the reasons were asked in the surveys, please include them in Table 3. Also, please state in the Methods whether the provider interviews were done qualitatively or quantitatively (using survey forms). For the question “In last month provider discussed…”, the responses (“depended on the client” vs. the frequency of providing one-to-one sessions) are NOT mutually exclusive.

Furthermore, the question on “Demonstrate use to new users” seems confusing- was this referring to any time period or like in last month?

A number of points are raised above; we have answered them in order. We believe that descriptive data are sufficient for this evaluation and that its merit is not diminished by the lack of logistic regression analyses. The protocol did not specify conducting logistic regression, and the sample size was not powered to do so. At the time of receiving the sample, we were not aware of the number of providers at those facilities; we were in each facility only for a short time, and many providers did not have time for an interview. Providers were therefore purposively sampled.

We are not sure what the Reviewer means by “the reasons” in Table 3 (now Table 4); the exact response options for each question are given in the table.

“Demonstrate use to new users” refers to users who have never used an FC before.

We have added the responses (clients can read instructions, no time in consultation and no demonstration model) to Table 6 (previously table 4).

The provider survey was quantitative; we have noted this in the Methods Section. (Line 197)

The response to “discussed in last month” was often “depended on client”; this is a common response, as providers often do not want to discuss condom use with particular types of clients, e.g., older or married women. This verbatim response was reported by nearly a quarter of providers regarding the female condom and nearly a fifth regarding the male condom. We therefore consider it a valid response and have opted not to exclude this response category. We have added an explanatory footnote at the bottom of this table (now Table 6).

The question asking about demonstration to new users was not asked within a time frame (last month, etc.). We assumed that the providers’ responses would refer to what they usually do in terms of demonstrating FC use to new users. Other questions specified a window period, including asking about discussing condom use in the last month and about having personally given condoms to a male or female client in the last week.

14. Line 212-217: Again, I strongly think that predictors related to provision of FCs need to be investigated in adjusted models including availability of condoms in every consultation room, being ever trained, types of providers, different provinces, experience of stock-outs, etc as potential predictors.

This type of analysis was not described in the protocol, and we believe the present descriptive analyses have merit. Such an analysis would be complex, as facilities varied in the number and types of consultation rooms. Some consultation rooms were limited in space, so condoms had to be kept in reception, in toilets, or with security guards. Aside from nurses, the providers comprised about 20 different categories, many with only one provider each (pharmacy assistant, project manager, volunteer, condom champion, etc.), so this type of analysis would not be possible. Staffing also differs by seniority and administrative load, resulting in some staff attending to clients for longer periods than others.

Stock-outs were rare (2.8% of sites), and when they occurred it was often for a range of reasons, which are now described in the Results Section (Lines 351-354). Our sample size was adequate at the national level, but we feel it is inadequate at the provincial level (9 provinces); the provinces differ so much, both in population and geographically (urban vs. rural), that we do not want to speculate about differences between them. We were trying to give a national overview in this paper. Finally, given the different variables and target populations across data collection methods, adjusted analyses and a meta-analysis seem less important than the analyses presented in this manuscript.

15. Line 223-228: I would like to see the complete Table for providers attitudes towards FCs and MCs, at least as a supplement, as well as p-value associated with that. Especially, I think this is really the key information given that the authors’ main conclusion that “providers are the gate-keepers” for FCs distribution.

We have added a new table (Table 7) to the main text. We have not examined the association of these items with having ever been trained in FC use, as many providers did not know the information because they had not been trained.

16. Lines 226-228: How was the question really asked to the respondents? I don’t think it would have been asked as “messy” or “weird” in the questionnaire… or was it? How was this concept asked in the survey? Or is it based on the qualitative interviews/findings? If it’s the latter, I think very explicit methods need to be written regarding how this was conducted and analysed.

The items are ‘female condoms are messy’ and ‘female condoms are weird’. These items were derived from Neilands and Choi (2002) and have been used in other studies in sub-Saharan Africa, although not with providers. Reference is now provided.

17. Lines 237-242: How were these participants included? Was the self-administered survey offered to everyone who was visiting the clinic sites? What were the refusal vs. participation percentages? Also, given that this was done at the national level, I would like to see how these may differ by province, at least. Also, please include more demographic characteristics of clients (definitely, at least age groups, HIV status). Please include p-values for differences between women and men. Also, in the discussion, the authors state that “a high proportion of FC users were HIV-positive”… without any demographic characteristics presented in the results, it is not possible for readers to understand these results.

The survey was offered to everyone in the waiting area on the morning of the visit; staff left copies at reception and on tables and chairs. People did not need to refuse, so we do not have refusal data. The facilities we visited were often very busy, and it would not have been possible to offer the survey individually (Lines 201-209). HIV status was not collected in this anonymous client survey, as we felt it might dissuade participation; the questionnaire focused on FC use, and we collected data only on age and gender (Lines 510 onwards). Table 8 (previously Table 4) presents survey data by participant gender. The purpose of the survey was to look at uptake. We have also added a table showing the distribution of data collection methods by province, and another table displaying site location, facility type, years of FC distribution, and availability of more than one type of FC at the site, for each province (Table 3).

18. Line 250. Table 4. Please rephrase the title of the table, for example, “Clients’ knowledge and usage of FC in South Africa”.

We have renamed the title of Table 8 (previously table 4) as suggested.

19. The number of “Ever used FC” is obviously wrong (currently written as n=88 for women and n=587 for men); please correct this. Also, I strongly recommend reporting how many were current vs. previous FC users under the variable “Ever used FC”.

We agree and realize that we had inadvertently entered incorrect numbers for men and women. This has been corrected in Table 8 (previously table 4). The question in the survey asked about use with current partner. We have added this information to the table.

So, out of 587 + 88 = 675 who ever used FCs, 427 (63%) completed the exit surveys? How were these people selected? Were there any demographic or systematic differences between those who completed the exit surveys vs. those who didn’t? What were the reasons for not completing the exit interviews?

The client exit survey was not linked to the client anonymous survey; the populations for the two surveys were different. Therefore, we cannot say that 63% of those who completed the exit survey had ever used the FC. During the facility visit, research staff informed clients in the different waiting areas during the day that any ever or current users could volunteer for an interview at the end of their consultation. We have added this information in the Client exit interviews section (Lines 211-219). The exit interviews took almost an hour (Line 216) and meant that clients would have had to stay after their consultation; this may not have been possible for all clients.

The anonymous survey took only a minute or two to complete. Since it was anonymous, we did not want to target clients based on demographic or condom use characteristics; in fact, we wanted to identify how many clients had ever or never used FCs. Although some of the anonymous survey participants also completed the exit interview, we did not record whether they had, as linkage would have violated the anonymity of the survey (Lines 201-202). We did not collect data on the number of clients who refused participation in the anonymous client survey or client exit survey. The female clients who participated in the exit interviews self-selected to be interviewed.

20. Line 257-259: Please include this as part of Table 4 (i.e. Reasons for not using FCs among non FC-users* - then please give the detail description for this group as a footnote)

We have added the reasons for non-use in a new Figure 2 (Line 543), but we do not have a detailed description of these participants, as only age and sex data were collected. We would not have been able to collect detailed data on all 4,000+ clients; the anonymous survey was designed to be as brief and simple as possible.

21. Line 271: Please carefully go through the table and see whether the percentage matches and adds up to 100%; for example, for the variable “First FC obtained from”, the percentages only add up to 95%- if other providers or sources were listed/selected, please list them as “others” category in the table. Also, I would rename “Offered FC or asked for first FC” as “Mode of receiving the first FC at clinics”.

Percentages do not necessarily add up to 100% in Table 9 when multiple responses are allowed (e.g., main reasons for first FC use). Percentages for source of first FC, mode of receiving the first FC, and whether a choice of condom was offered total 100%. In Table 9 we have renamed “Offered FC” to “Mode of receiving FC”.

22. Also please include the contents reported in 275-280 as part of Table 5 so the readers can understand fully.

We have added some of the information from the text to the table (previously Table 5), but we did not want to repeat it all in both the text and the table, as most of the text information was simply a yes or no answer.

23. Lines 315-319: In the first few paragraphs of the Discussion and throughout the Results section, the authors seem to point out gaps in the implementation and delivery of FC programmes at national clinics; then, without much supporting data, the authors claim that the evaluation seems to have fulfilled all the steps recommended by the international organization. Could the authors please elaborate in more detail on how the SA programme has fulfilled the recommended steps? Also, please add an overall conclusion section at the end of line 331.

We have added the information into the discussion as suggested. We have mentioned the required steps and how the program has fulfilled them (Lines 611-622).

Minor comments:

- I think the word “global” lesson can be misleading as the study presents the data from South Africa alone. I suggest to remove the word “global” in this case.

We have removed the word ‘global’.

- Line 95: what does IEC stand for? Information, Education, and Communication? Please state at least once before using the acronym.

We have written out IEC as Information, Education, and Communication. (Line 182)

- Line 284: Please change “Conclusion” to “Discussion”.

We have changed the Conclusion header to a Discussion header.

- Please check the references for consistency and correct formats.

We have checked the references for consistency of format and corrected them.

- There are many grammatical errors and incomplete sentences. The paper needs to be proofread. For example,

Line 49: put -s after “sexually transmitted infection” and introduce acronym here, and thereafter use STI (for example, in Line 79) done

Line 57: please spell out WHO/UNFPA when used for the first time done

Line 57: need a reference for “others in development”. added

Line 67: please put a reference. added

Line 79: please replace to “the national STIs sentinel surveillance sites” done

Line 80: includes -> include done

Line 95: sites/stock outs/expired stock/sub-distribution to other sites; please do not use “/”, removed

Line 112: Remove “KIs” – I don’t think the authors used this acronym after this. We have retained ‘KI’ as we have used this term more than once.

Line 124: detail -> details done

Line 110: ever heard of -> being ever heard of or using FC; the sentences need to be re-written.

Line 134: “Results and” → “Results”. We do not understand what the Reviewer is suggesting.

Line 135: Please provide months when the data collection started (in 2014) and ended in 2016. Done

Line 151: Please remove – after “with” done

Table 2: remove “day of telephonic survey”; done. Remove “condoms” after “Distribution points”; done. Please put the list of acronyms as footnotes under the table; these are now in the text. Please put the label as % (n) in the second row; done.

Line 200: and -> or done

Line 304: Please rephrase “… a rare example of one funded primarily by the National Government” to, for example, “a rare example of a programme primarily funded by the government.” Please add references. Done, and reference added.

Lines 309-310: “Notable… in 2013” seems out of context or provides insufficient detail. done

Line 311-313: Difficult to read and awkwardly written. Please revise. reworded

Throughout the paper, the authors keep using the word “variable”. I would recommend diversifying the term. revised

Reviewer #2: This is an interesting paper evaluating one of the only national female condom provision programmes globally. As such it has the potential to give important programmatic insights. However, there are issues with the way the methods are described, and the current structure of the results means there is a lack of clarity in the findings. This paper requires substantial revision prior to being considered for publication.

1. INTRODUCTION: a short paragraph introducing FC more broadly (efficacy, effectiveness, global uptake, acceptability) would provide important context.

We have added more information about the FC in the Introduction Section. (Lines 51-60; 71-96)

2. METHODS:

i. This is described as a mixed methods evaluation comprising surveys, interviews and on site assessments. However, it appears that the interviews are structured (rather than qualitative interviews). It is important to be aware that mixed methods research pertains to the combination of QUANTITATIVE AND QUALITATIVE data in a study.

The Key Informant interviews were qualitative. We have noted this in the Methods Section and in Table 1. In our revisions, we added more information from the key informant interviews and integrated it into sections that also report quantitative data.

ii. Could the authors clarify exactly what types of data (quant and qual) they collected?

We have added more information in the Methods Section. The key informant interview was the only qualitative tool. The other tools were quantitative.

iii. If it is just quantitative data collected then this is NOT a mixed methods evaluation and the methods need to be amended to reflect this.

Please see our responses to 2i and 2ii above.

iv. If the interviews (key informant, provider and client) are qualitative then please (a) describe these components in detail (b) give a justification for why a mixed methods approach was used (c) describe the mixed methods model you are using (i.e. convergent parallel design) and (d) discuss how quant and qual data were integrated (which needs to be done in mixed methods analysis) . I suggest the authors look at O’Cathain, A., Murphy, E., and Nicholl, J. (2008) ‘The quality of mixed methods studies in health services research’, Journal of Health Services Research and Policy, vol. 13, no. 2, pp. 92-98.

We have added more detail about all data collection methods and a new table (Table 1). As stated above, the evaluation used a mixed-methods approach: all data collection used quantitative tools, with the exception of the Key Informant Interview tool. The mixed-methods evaluation used a convergent (concurrent or parallel) design. We used a mixed-methods approach, albeit with limited qualitative data collection, to elicit a better understanding of the context and enrich perspectives on system-level FC issues (Lines 151-152), which are better served by qualitative methods. We limited the use of qualitative methods because of the need for timely completion of the evaluation.

v. Data sources are currently unclear - I suggest listing each sample (e.g. nationwide sites), sub-samples of sites etc (and then for each listing exactly what data collection methods were used). How were key informants selected?

vi. More detail is required on: surveys (what was asked, standardised, any validated tools e.g. attitudinal questions), interviews (were they quantitative or qualitative), on site assessments (what did this entail, who conducted them)., how were surveys administered, who conducted the interviews

vii. More details on DHIS data source

We have added more detail on all data collection methods in the text (Lines 127 onwards) and in a new table, Table 1. We note that all data collection was conducted by a cadre of research interviewers trained in quantitative and qualitative methods. Information on the DHIS has been expanded (Lines 169-173).

3. Data analysis: if qualitative data were analysed how were these analysed, how were qual and quant data integrated, more details on stats (key variables, key outcomes of interest, you present p values - state you are using chi squared tests - wondering why you did not consider multivariable logistic models)

See response to Reviewer 1 comment #14

4. Line 134: incomplete subheading

This incomplete heading has been corrected.

5. Results need to be restructured for clarity, as it is currently hard to know what the sources of data are - start with a broad overview from the DHIS and the national telephone survey, then KI and provider data, and then client data. If you have qualitative data, consider the use of quotes to support your results.

We have made the sources of data clearer with sub-headings and started with the broader overview. Key informant data were discussed first in the Results section because the data are overarching and not province- or site-specific.

6. Tables: present p-values for comparisons please

We have added p-values for comparisons in Tables 4 and 8.

7. You cannot assess predictors as this is a cross sectional design - you are looking at associations.

We agree that any bivariate analyses look at associations of variables and have deleted reference to predictors.

8. I am struck by the fact that 15% of providers did NOT know that FCs are not reusable - these findings need to be brought out more.

Research on the first polyurethane FC (FC1) indicated that it could be reused several times after washing and drying. However, as the material in the FC2 changed from polyurethane to synthetic latex, this was no longer possible. We have added this to the Discussion Section (Lines 577-583).

9. Line 240: “fewer” rather than “less”; changed

10. Line 259: “where”; changed

11. I was unsure what the exit interviews were

Exit interviews were conducted after clients had completed their visit to the clinic. We have explained this (Lines 206-213).

12. Conclusions: strengths and weaknesses of this evaluation, recommendations for further work, highlight the important gaps in knowledge and attitudes towards FC - how can this be addressed.

________________________________________

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: Yes: Shema Tariq

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files to be viewed.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email us at figures@plos.org. Please note that Supporting Information files do not need this step.

________________________________________

In compliance with data protection regulations, you may request that we remove your personal registration details at any time. (Remove my information/details). Please contact the publication office if you have any questions.

Attachment

Submitted filename: Response to reviewers June 01 Plos one.docx

Decision Letter 1

Collins Iwuji

20 Jul 2020

LESSONS FROM THE EVALUATION OF THE SOUTH AFRICAN NATIONAL FEMALE CONDOM PROGRAMME

PONE-D-19-26312R1

Dear Dr. Beksinska,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Collins Iwuji, M.B;B.S, MSc, MD

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Title of Table 1 is missing. Please include.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Thank you for the opportunity to re-review the manuscript by Beksinska et al. The authors have addressed my comments well and strengthened the manuscript significantly by including more details about the methodology and study results. Overall, I appreciate the clarity gained by including more detailed information, but some of the information presented in the main text could be considered for the supplement. Especially, with clarity in the methods for collecting the different data sources, I think readers can now fully appreciate the comprehensive work done by the authors. I would therefore recommend that the results section and tables in the main text focus on the comprehensive findings across the different data sources from the study, and that any additional tables (for example, Tables 2 and 3) be presented in the supplement as per the authors’ and the editor’s judgement.

Minor comments:

1. Heading for Table 1 seems to be missing on page 7.

2. Tables 2 and 3 are very helpful for readers to understand the characteristics of the selected sites, but if the number of tables needs to be reduced, I believe these can be presented in the supplement and briefly summarized in the main text (as the authors have done, but perhaps in an even more summarized fashion).

3. I think the quote in lines 304-310 can be deleted, as the authors summarize it well in lines 301-302. Also, I am not sure that much detail regarding the NDoH’s policy reviews needs to be included in lines 299-318 under the Results section.

4. For the quotes from the key informant interviews, it would be more informative if the acronyms were explained a little more. For example, “FC-NGO-KI-GP01” could be “a general practitioner in an NGO, KI”, or “DIS KZN01” could be “a district manager, KZN”.

5. Table 5 is helpful, but the job roles could be grouped into fewer categories to highlight meaningful differences among them. This table could also be included as a supplement if needed.

6. Likewise, Table 7 is helpful for understanding the results, but given the length of the questionnaire, I wonder whether these results should be included as a supplement, or perhaps sub-grouped or presented in a simpler format? I defer that decision to the authors and the editor.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Acceptance letter

Collins Iwuji

29 Jul 2020

PONE-D-19-26312R1

LESSONS FROM THE EVALUATION OF THE SOUTH AFRICAN NATIONAL FEMALE CONDOM PROGRAMME

Dear Dr. Beksinska:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Collins Iwuji

Academic Editor

PLOS ONE

Associated Data

This section collects any data citations, data availability statements, or supplementary materials included in this article.

Supplementary Materials

S1 File (PDF)

S2 File (PDF)

S3 File (PDF)

S4 File (PDF)

S5 File (PDF)

Attachment

Submitted filename: Response to reviewers June 01 Plos one.docx

Data Availability Statement

All relevant data are within the manuscript and its Supporting Information files.
