PLOS ONE
. 2020 Jul 17;15(7):e0235823. doi: 10.1371/journal.pone.0235823

Health management information system (HMIS) data verification: A case study in four districts in Rwanda

Alphonse Nshimyiryo 1,*, Catherine M Kirk 1, Sara M Sauer 2, Emmanuel Ntawuyirusha 3, Andrew Muhire 3, Felix Sayinzoga 4, Bethany Hedt-Gauthier 1,2,5
Editor: Dejing Dou
PMCID: PMC7367468  PMID: 32678851

Abstract

Introduction

Reliable Health Management Information System (HMIS) data can be used with minimal cost to identify areas for improvement and to measure the impact of healthcare delivery. However, variable HMIS data quality in low- and middle-income countries limits its value in monitoring, evaluation and research. We aimed to review the quality of Rwandan HMIS data for maternal and newborn health (MNH) based on the consistency of HMIS reports with facility source documents.

Methods

We conducted a cross-sectional study in 76 health facilities (HFs) in four Rwandan districts. For 14 MNH data elements, we compared HMIS data to facility register data recounted by study staff for a three-month period in 2017. A HF was excluded from a specific comparison if the service was not offered, source documents were unavailable or at least one HMIS report was missing for the study period. World Health Organization guidelines on HMIS data verification were used: a verification factor (VF) was defined as the ratio of register over HMIS data. A VF<0.90 or VF>1.10 indicated over- and under-reporting in HMIS, respectively.

Results

High proportions of HFs achieved acceptable VFs for data on the number of deliveries (98.7%;75/76), antenatal care (ANC1) new registrants (95.7%;66/69), live births (94.7%;72/76), and newborns who received first postnatal care within 24 hours (81.5%;53/65). This was slightly lower for the number of women who received iron/folic acid (78.3%;47/60) and tested for syphilis at ANC1 (67.6%;46/68), and was the lowest for the number of women with an ANC1 standard visit (25.0%;17/68) and a fourth standard visit (ANC4) (17.4%;12/69). The majority of HFs over-reported on ANC4 (76.8%;53/69) and ANC1 (64.7%;44/68) standard visits.

Conclusion

HMIS data quality varied by data element, with some indicators showing high quality, and reporting trends were consistent across districts. Over-reporting was observed for ANC-related data requiring more complex calculations, i.e., knowledge of gestational age and scheduling to determine ANC standard visits, as well as for quality indicators in ANC. Ongoing data quality assessments and training to address gaps could help improve HMIS data quality.

Introduction

National health management information systems (HMIS) have been established in many low- and middle-income countries (LMICs) for the routine collection and management of facility-based data on health care service delivery [1,2]. When the data are of good quality, they can be used, with little-to-no cost, to identify areas that need improvement, to evaluate health interventions, to inform evidence-based health policies, and to design programs and allocate resources at all levels of the health system [3–5]. However, there is evidence of variable quality of HMIS data [6–13], limiting its value in monitoring, evaluation and research in LMICs.

Regular data quality assessment is one of the strategies that can be used to improve the quality of HMIS data in LMICs [14,15]. The World Health Organization (WHO) provides guidelines on data quality review (DQR) through a desk review [16] of data previously reported to the HMIS and through verification of HMIS data quality via a facility survey [17]. The desk review assesses HMIS data quality in four dimensions: 1) completeness and timeliness of data, 2) internal consistency of reported data, 3) external consistency, and 4) external comparisons of population data. Detailed information on definitions and requirements for the calculation of these dimensions, as well as the application of HMIS data quality assessment through the desk review, can be found elsewhere [13,16,18]. The WHO toolkit for data quality review defines a verification factor (VF) as the ratio of recounted data from facility source documents over HMIS data [17]. The application of the VF has been more limited and has relied on variable definitions. However, there is evidence that the level of agreement between HMIS data and records in facility source documents can vary depending on the type of data being collected, and that discrepancies can be rooted in the early stages of collecting those data from facility source documents [11,19–24]. Over- or under-reporting in HMIS data can result from human errors when counting events from source documents, or from simply not including all events for the reporting period, whether by omitting some of the necessary source documents or by not covering the entire reporting period [11,24]. In addition, intentional over- or under-reporting at the facility level can be motivated by pressure to meet national targets, while inaccuracies in transferring data from facility source documents to the electronic database can be associated with excessive staff workload combined with pressure to meet reporting deadlines [23].

The Rwanda HMIS was established in 1998. In 2012, with the goal of improving the quality of routinely collected health data from community health workers (CHWs) and all HFs across the country, the Rwanda Ministry of Health (MoH) upgraded the HMIS to a web-based system known as the District Health Information System version 2 (DHIS2) [25]. Recent assessments of Rwanda HMIS data quality using WHO guidelines have been limited to a desk review of available data in the HMIS [13,26]. These assessments found that the quality of Rwanda HMIS data is high with regard to completeness and internal consistency of reported data for the studied HMIS indicators [26]. However, findings from two small-scale studies that compared HMIS data with records in source documents, using a reporting accuracy definition different from the WHO data verification definition, suggest a variable level of agreement between HMIS data and records in source documents [27,28]. One of these studies, which defined accuracy of reporting in HMIS data as a deviation of 5% or less between HMIS and facility source document data on family planning, antenatal care and delivery, found an overall reporting accuracy of 70.6% for 37 HFs sampled in three districts in the Eastern Province of Rwanda [28].

Good quality HMIS data play an important role in identifying areas that need improvement, monitoring progress, and evaluating interventions designed to address those gaps [29–31]. Rwanda has made remarkable progress in reducing mortality among children under five in the past two decades; however, the reduction in neonatal deaths (those occurring within 28 days of birth, mainly in health facilities) has been slow [32]. Good quality HMIS data on maternal and newborn healthcare are needed to identify gaps in existing facility-based care, to design appropriate interventions and to monitor progress [33]. Since 2013, the “All Babies Count” (ABC) intervention has been implemented to accelerate the reduction in preventable neonatal deaths in Rwanda [34]. The ABC intervention focuses on improving the coverage and quality of antenatal, maternity and postnatal care services. ABC is implemented through a joint collaboration of Partners In Health/Inshuti Mu Buzima (PIH/IMB), an international non-profit organization, and the Rwanda MoH. However, the evaluation of the impact of these programs has been costly, requiring parallel collection of data on program indicators through the HMIS and through review of facility source documents, given concerns about poor HMIS data quality [34]. Therefore, this study uses the WHO guidelines for HMIS data verification to provide evidence on the level of agreement between Rwanda HMIS data and records in the source documents of reporting HFs. We calculate VFs for fourteen HMIS data elements that were identified jointly by PIH/IMB and the MoH as priority indicators for quality improvement in maternal and newborn health care, for 76 HFs (7 hospitals and 69 health centers) that received the ABC intervention between 2017 and 2019 in four districts of Rwanda. The criteria for indicator selection included clinical relevance to neonatal survival, being government priority indicators, and/or being indicators within the WHO standards for improving quality of maternal and newborn care in health facilities [35].

Methods

Study design

This was a cross-sectional study conducted to assess the quality of Rwanda HMIS data, measured as agreement between HMIS and facility source document data, for 76 HFs on 14 HMIS data elements related to maternal and newborn health that were used to monitor quality and progress and to inform quality improvement efforts through the ABC intervention (Table 1). The antenatal care (ANC) register was the source document for 5 data elements that were only reported by health centers (HCs), while the maternity and postnatal care (PNC) registers were the source documents for 7 data elements that were reported by both HCs and hospitals. Two data elements related to neonatal admissions and deaths were recounted from the neonatology care unit (NCU) register and were only reported by hospitals.

Table 1. Description of data elements and source of data.

Data element Description Source of data
Data elements only reported by health centers
ANC1 Number of antenatal care (ANC) new registrants ANC register
ANC1_standard Number of women with ANC1 standard visit (defined in Rwanda as the first ANC visit at <16 weeks’ gestation) ANC register
Syphilis_test Number of women tested for syphilis at ANC1 ANC register
Iron/FA Number of women received iron/folic acid at ANC1 ANC register
ANC4_standard Number of women with a 4th ANC standard visit ANC register
Data elements reported by both health centers and hospitals
Deliveries Number of deliveries in health facilities Maternity register
Live_Births Number of live births Maternity register
PNC1_Newborn Number of newborns who received postnatal care (PNC) visit 1 within 24 hours of birth PNC register
Data elements with rare events reported by both health centers and hospitals
Stillbirths Number of stillbirths (fresh and macerated) Maternity register
Didn’t_Cry Number of live newborns who didn’t cry at birth Maternity register
Resusc_Succ Number of live newborns who didn’t cry at birth and were resuscitated successfully Maternity register
LBW Number of low birth weight babies Maternity register
Data elements only reported by hospitals
Neo_Admissions Number of neonatal admissions in the hospital neonatology care unit (NCU) NCU register
Neo_Deaths Number of neonatal deaths in the hospital neonatology care unit (NCU) NCU register

Study setting

This study included 48 HFs in Gakenke and Rulindo districts in Northern Rwanda and 28 HFs in Gisagara and Rusizi districts in Southern and Western Rwanda, respectively. The 76 HFs were grouped into seven hospital catchment areas (HCAs): Nemba District Hospital (15 HFs), Ruli District Hospital (10 HFs), Kinihira Provincial Hospital (9 HFs), Rutongo District Hospital (14 HFs), Gakoma District Hospital (6 HFs), Kibilizi District Hospital (10 HFs) and Mibilizi District Hospital (12 HFs). These MoH-operated facilities were included because they received the ABC intervention, and represented 14% (69/499) of health centers and 15% (7/48) of hospitals in all 30 districts in Rwanda [36].

The ABC project was originally implemented in 2013 by Partners in Health/Inshuti Mu Buzima (PIH/IMB) in Kayonza and Kirehe districts in Eastern Rwanda, in partnership with the Rwanda MoH [34]. ABC was later scaled up to Gakenke and Rulindo (July 2017) and then to Gisagara and Rusizi (October 2017). The ABC scale-up project used health facility-based data, collected monthly through the Rwanda HMIS, to monitor indicator progress from baseline to the end of the project and to evaluate its impact. In addition, the project underwent the HMIS data verification process by recounting the same data in standardized HF registers for the same data elements and reporting periods. Five data elements related to antenatal care (ANC) services (ANC new registrants, syphilis testing and iron/folic acid distribution at the first ANC visit, and the number of women with first and fourth ANC standard visits) were only reported by health centers, whereas two data elements on the number of neonatal admissions and neonatal deaths in the hospital neonatology care unit were specific to hospitals [37]. Data are recorded using standardized registers developed by the MoH and provided to all HFs (see Table 1 for data sources); women attending ANC are recorded at their first ANC visit and provided an ANC card and an ANC number that facilitates continuity of data recording at the individual level for ANC. This study reports data from the baseline period prior to the ABC intervention in the 7 HCAs.

Sources of data

HMIS data

The ABC team worked with the MoH national HMIS team to extract HMIS report data for the study periods. HMIS data collection starts at the reporting facility, with clinical staff in each care service registering patients/clients and the care provided to them in standardized registers and/or medical files [38,39]. Then, for monthly reporting to the HMIS, the facility data manager ensures the distribution of paper HMIS reporting forms to heads of services by the 25th of each month. Each head of service collects the data relevant to their specific service and submits a completed HMIS report for the previous month to the facility data manager by the 3rd day of the following month. For timely reporting, the facility data manager should upload all facility data into DHIS2 by the 5th day of every month. Data verification by the facility team and corrections in the system are only allowed between the 5th and 15th of each month. Any request for changes to the data in the system after the 15th of the month must be submitted to the central MoH, and access is only granted upon strong justification of the request.

Recounted data from facility source documents

A specialized team of two trained ABC data collectors went to all HFs under study and recounted the same data in the standardized HF source documents for the same reporting periods. ABC baseline data—April-June 2017 for 48 HFs in Gakenke and Rulindo districts and July-September 2017 for 28 HFs in Gisagara and Rusizi—were collected during the periods July 31-September 19, 2017 and November 14, 2017-January 11, 2018, respectively. In particular, due to observed variable unit of recording of gestational age (GA) in weeks or months in the ANC register by facility and care provider, ABC data collectors worked with midwives or nurses responsible for providing ANC to standardize the calculation of GA in weeks before recounting data on ANC1 and ANC4 standard visits, as the reporting to HMIS on these data elements is based on GA calculated in weeks. The data collection team used a pregnancy wheel and the data recorded on date of last menstrual period and dates of ANC visits for individual women who attended ANC to determine GA at each visit. For all data elements, data collection was done in consultation with the health facility staff responsible for routine reporting of data into HMIS.
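The gestational-age arithmetic that the data collectors standardized can be sketched as follows. This is a minimal illustration in Python, not the project's actual tooling; the function names are ours, and the 16-week ANC1 cutoff follows the Rwandan definition given in Table 1.

```python
from datetime import date

def gestational_age_weeks(lmp: date, visit: date) -> int:
    """Completed weeks of gestation at a visit, counted from the last
    menstrual period (LMP), as read off a pregnancy wheel."""
    return (visit - lmp).days // 7

def is_anc1_standard(lmp: date, anc1_visit: date) -> bool:
    """Rwanda defines the ANC1 standard visit as a first ANC visit
    at <16 weeks' gestation (Table 1)."""
    return gestational_age_weeks(lmp, anc1_visit) < 16

# A first visit 98 days (14 completed weeks) after the LMP qualifies.
print(is_anc1_standard(date(2017, 1, 2), date(2017, 4, 10)))  # True
```

The same arithmetic applied to the recorded dates of later visits determines whether ANC2–ANC4 fall within their scheduled windows, which is what the ANC4 standard data element requires.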

Data analysis and presentation

We used the WHO DQR guidelines on data verification and system assessment to calculate verification factors (VFs) for each data element [17]. A VF was defined as the ratio of register data to HMIS data (Eq 1). ABC baseline data were aggregated for the three-month reporting period. HMIS and facility source documents data were compared by data element and HF. At the HCA level, a VF was calculated by summing all the non-missing values for each data element and all the reporting HFs under that HCA during the study period. Then, a HCA-level VF was calculated as a ratio of the aggregated recounted data to HMIS data. In addition, a VF for data elements with rare events was only calculated at the HCA level, where aggregated data were compared to avoid denominators with true zero values that would be expected if these data were compared at the HF level. For each data element, we excluded from our analyses any HFs that were not eligible for reporting on that data element or that had either incomplete HMIS data or missing source documents’ data for any month during the reporting period.
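The HCA-level aggregation described above differs from averaging facility-level VFs: the non-missing counts are summed across a catchment's reporting facilities first, and a single ratio is then taken. A minimal sketch in Python (function names are ours; `None` marks a facility excluded for missing data):

```python
def facility_vf(recounted, reported):
    """Verification factor for one facility and data element (Eq 1)."""
    return recounted / reported

def hca_vf(pairs):
    """HCA-level VF: sum the non-missing recounted and HMIS counts across
    the catchment's reporting facilities, then take a single ratio
    (not the mean of the facility-level VFs)."""
    valid = [(r, h) for r, h in pairs if r is not None and h is not None]
    total_recounted = sum(r for r, _ in valid)
    total_reported = sum(h for _, h in valid)
    return total_recounted / total_reported

# Three facilities; the second is excluded (missing register data).
pairs = [(95, 100), (None, 40), (110, 100)]
print(hca_vf(pairs))  # (95 + 110) / (100 + 100) = 1.025
```

Summing before dividing is what makes the HCA-level VF robust for rare events, where a facility-level denominator could legitimately be zero.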

VF = (recounted number of events from facility source documents) / (reported number of events from the HMIS)   Eq (1)

A VF of 1.00 indicated a perfect match between recounted data and HMIS data. The acceptable margin of error for the discrepancy between HMIS reports and recounted data in facility registers was 0.90≤VF≤1.10, based on the WHO DQR guidelines. A VF<0.90 or VF>1.10 indicated over-reporting and under-reporting in HMIS data, respectively. We used Stata v.15.1 (StataCorp, College Station, TX, USA) and the R Language and Environment for Statistical Computing for analysis and visual presentation of data [40].
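The classification against the WHO DQR margins can be expressed as a small helper (an illustrative sketch; the category labels are ours):

```python
def classify_vf(vf):
    """Classify a verification factor against the WHO DQR margins:
    VF < 0.90 -> over-reporting in HMIS (the HMIS count exceeds the
    register recount), VF > 1.10 -> under-reporting, and anything in
    the closed interval [0.90, 1.10] is within the acceptable margin."""
    if vf < 0.90:
        return "over-reported"
    if vf > 1.10:
        return "under-reported"
    return "acceptable"

# E.g. the median ANC4 standard VF of 0.75 found in this study:
print(classify_vf(0.75))  # over-reported
```

Note that the boundaries 0.90 and 1.10 themselves count as acceptable, matching the 0.90≤VF≤1.10 margin above.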

Ethics

The ABC scale-up project received approval from the Rwanda MoH to access HMIS data for the project’s indicators for all HFs that received the intervention. This study was approved by the Rwanda National Ethics Committee approval (Kigali, Rwanda, protocol #0067/RNEC/2017). As this study was completed using de-identified routinely-collected aggregate data, informed consent was not required.

Results

The proportion of HFs with all data sources by data element for the three-month period was the lowest for iron/folic acid (87.0%; 60/69), while the proportion of facilities with complete reporting in HMIS was between 90.8% and 100% and was the lowest for first postnatal care (PNC1) data (90.8%; 69/76) (Table 2). A facility-level HMIS data VF was calculated for eight data elements (Table 2 and Fig 1). A high proportion of HFs achieved acceptable VFs for the following data elements: the number of deliveries (98.7%; 75/76), antenatal care (ANC1) new registrants (95.7%; 66/69), live births (94.7%; 72/76), and newborns who received a PNC1 visit within 24 hours (81.5%; 53/65). The median VF was 1.00 (interquartile range [IQR]: 0.99–1.00) for HMIS data on deliveries, 1.00 (IQR: 1.00–1.00) for ANC1 data, 1.00 (IQR: 0.99–1.00) for HMIS data on live births and 1.00 (IQR: 0.97–1.02) for PNC1 data.

Table 2. Facility-level verification factors (VF) by data element.

Data element name (n)a Facilities that provide each service and with complete source documents’ data, n (%) Facilities that provide each service and with complete HMIS data, n (%) Facilities for which source data exactly match HMIS data (VF = 1) n (%) Facilities with an acceptable margin of error for discrepancies between HMIS and facility source documents data 0.90≤VF≤1.10 n (%) Facilities that over-reported by more than 10% (VF <0.90) n (%) Facilities that under-reported by more than 10% (VF >1.10) n (%) Median facility-level VF Median [IQR]
ANC1b (n = 69) 69 (100.0) 69 (100.0) 49 (71.0) 66 (95.7) 3 (4.3) 0 (0.0) 1.00 [1.00–1.00]
ANC1_standardc (n = 68) 69 (100.0) 68 (98.6) 5 (7.3) 17 (25.0) 44 (64.7) 7 (10.3) 0.85 [0.67–0.99]
Syphilis_testd (n = 68) 68 (98.6) 69 (100.0) 25 (36.8) 46 (67.6) 18 (26.5) 4 (5.9) 1.00 [0.89–1.00]
Iron/FAe (n = 60) 60 (87.0) 69 (100.0) 29 (48.3) 47 (78.3) 9 (15.0) 4 (6.7) 1.00 [0.99–1.00]
ANC4_standardf (n = 69) 69 (100.0) 69 (100.0) 2 (2.9) 12 (17.4) 53 (76.8) 4 (5.8) 0.75 [0.56–0.88]
Deliveriesg (n = 76) 76 (100.0) 76 (100.0) 42 (55.3) 75 (98.7) 1 (1.3) 0 (0.0) 1.00 [0.99–1.00]
Live_Birthsh (n = 76) 76 (100.0) 76 (100.0) 39 (51.3) 72 (94.7) 4 (5.3) 0 (0.0) 1.00 [0.99–1.00]
PNC1_Newborni (n = 65) 71 (93.4) 69 (90.8) 10 (15.4) 53 (81.5) 5 (7.7) 7 (10.8) 1.00 [0.97–1.02]

a Final number of reporting facilities with both complete source documents and HMIS data; antenatal care (ANC)-related services are only provided at the health center, so reports were complete if n = 69 health centers reported and n = 76 for services which are provided at both health centers and hospitals

b Number of antenatal care (ANC) new registrants

c Number of women with first ANC standard visit

d Number of women tested for syphilis at first ANC visit

e Number of women who received iron/folic acid at first ANC visit

f Number of women with a fourth ANC standard visit

g Number of deliveries in health facilities

h Number of live births; and

i Number of newborns who received postnatal care (PNC) visit 1 within 24 hours of birth

Fig 1. Facility-level verification factors (VF) by data element.


The median is not visible when it is very close to or equal to 1.

The proportion of HFs with acceptable VFs was lower for the number of women who received iron/folic acid (78.3%; 47/60) and the number of women tested for syphilis at ANC1 (67.6%; 46/68). The median VF was 1.00 (IQR: 0.99–1.00) for iron/folic acid distribution and 1.00 (IQR: 0.89–1.00) for syphilis testing, respectively. Among HFs with a VF outside the acceptable range, 9 of 13 (69.2%) over-reported on iron/folic acid distribution, while 18 of 22 (81.8%) over-reported on syphilis testing at ANC1.

The indicators for which the lowest proportion of HFs obtained acceptable VFs were the number of women with an ANC1 standard visit (at less than 16 weeks' gestational age (GA)) (25.0%; 17/68) and the number of women who completed 4 standard ANC visits (ANC4 standard: ANC1 at less than 16 weeks GA, ANC2 at 24–28 weeks, ANC3 at 28–32 weeks and ANC4 at 36–38 weeks) (17.4%; 12/69). The median VF was 0.85 (IQR: 0.67–0.99) for ANC1 standard visit data and 0.75 (IQR: 0.56–0.88) for ANC4 standard visit data. Among the majority of HFs that over- or under-reported on ANC1 standard (75.0%; 51/68) and ANC4 standard (82.6%; 57/69) visits, over-reporting accounted for 86.3% (44/51) and 93.0% (53/57), respectively.

When data were aggregated at the HCA level (Table 3), all 7 HCAs achieved acceptable VFs for HMIS data on the number of ANC1 registrants, deliveries, live births and newborns who received PNC1 within 24 hours of birth. Five and four of the seven HCAs achieved acceptable VFs for HMIS data on the number of pregnant women who received iron/folic acid and who were tested for syphilis at ANC1, respectively. Two of the seven HCAs achieved acceptable VFs for HMIS data on the number of women with an ANC1 standard visit. With VFs ranging between 0.59 and 0.86, none of the HCAs obtained an acceptable VF for the number of women with an ANC4 standard visit; all HCAs over-reported on this data element.

Table 3. Hospital catchment-level verification factor (VF) for aggregated data by data element, a VF>1.10 or <0.90 is highlighted.

Data element Nemba DH* Ruli DH Kinihira PH** Rutongo DH Gakoma DH Kibilizi DH Mibilizi DH
Number of antenatal care (ANC) new registrants 0.97 1.01 0.99 1.00 1.00 0.93 0.98
Number of women with first ANC standard visit 0.82 0.82 0.90 0.93 0.80 0.51 0.72
Number of women tested for syphilis at first ANC visit 0.79 0.65 1.03 0.96 0.95 0.86 1.10
Number of women who received iron/folic acid at first ANC visit 0.95 1.01 0.99 0.81 0.99 0.95 0.89
Number of women with a fourth ANC standard visit 0.68 0.83 0.73 0.86 0.82 0.59 0.69
Number of deliveries in health facilities 1.00 1.00 1.00 1.00 1.00 0.95 1.00
Number of live births 1.00 0.99 1.00 1.00 0.95 0.95 0.99
Number of newborns who received postnatal care (PNC) visit 1 within 24 hours of birth 1.00 0.99 1.00 0.98 1.03 0.91 0.99

*DH: District Hospital

**PH: Provincial Hospital

A VF was calculated at the HCA level only for the six HMIS data elements with rare events or concerning services provided only at the hospital level (Table 4). Six of the seven hospitals achieved acceptable VFs for the number of admissions in the hospital NCU, and five of the seven obtained an acceptable VF for the number of neonatal deaths in the NCU. The majority (5/7) of sites also achieved acceptable VFs for the number of stillbirths and the number of low birth weight babies. Less than half (3/7) of sites obtained acceptable VFs for the number of live newborns who needed resuscitation and were resuscitated successfully. Only two of the seven sites achieved an acceptable VF for the number of live newborns who didn't cry at birth, and the majority (4/7) of sites under-reported on this data element.

Table 4. Hospital catchment-level verification factors (VF) for data elements with rare events, a VF>1.10 or <0.90 is highlighted.

Data element Nemba DH* Ruli DH Kinihira PH** Rutongo DH Gakoma DH Kibilizi DH Mibilizi DH
Number of stillbirths (fresh and macerated) 0.89 1.00 0.88 1.00 1.04 1.06 1.00
Number of live newborns who didn’t cry at birth 1.19 1.49 1.17 1.16 0.93 1.10 0.77
Number of live newborns who didn’t cry at birth and were resuscitated successfully 0.96 1.48 1.04 1.20 1.19 1.06 0.67
Number of low birth weight babies 1.10 1.00 1.23 1.07 1.10 1.20 0.93
Number of neonatal admissions in the NCU 1.00 1.00 1.00 0.92 1.00 1.00 0.88
Number of neonatal deaths in the NCU 1.22 1.00 1.07 1.08 1.00 0.96 1.38

*DH: District Hospital

**PH: Provincial Hospital

Discussion

In this study, we assessed the quality of Rwanda HMIS data, measured as the level of agreement between HMIS data and records in facility source documents, using data from 76 public HFs in Northern, Southern and Western Rwanda. Fourteen HMIS data elements were selected for this study, considering their importance in identifying gaps and monitoring progress towards the improvement of maternal and newborn health for reducing preventable neonatal deaths in Rwanda. Our findings suggest several strengths while also revealing variation in the quality of HMIS data by type of data element, which is consistent with other HMIS DQA studies in Rwanda and other Sub-Saharan African countries [27,41].

Notably, this verification showed a high level of agreement between data reported to the HMIS and records in facility source documents for the number of ANC1 registrants, deliveries and live births. These data elements are among the WHO recommended core indicators for DQR in HMIS data on maternal health [17]. The quality of Rwanda HMIS data on these data elements is higher than what was found in HMIS data verifications for the same data in Ethiopia [41] and Nigeria [42], and similar for the ANC1 indicator in Malawi [22]. This verification of the Rwanda HMIS data also showed similar patterns of data quality at the facility level and when HFs were grouped by HCA. This finding may indicate that there were common challenges for accurate reporting to the HMIS, regardless of an HF's geographical location. This finding differs from other studies, which found that the quality of routinely collected health data in Africa varied by the geographical location of reporting HFs [18,43]. The consistent level of HMIS data quality across Rwanda may result from HFs' adherence to the Rwanda MoH's existing standard operating procedures for high-quality HMIS data [25] and from a performance-based financing system that includes regular review of facility records with a specific focus on maternal and child health [44].

However, there was poor quality of HMIS data on the number of ANC1 and ANC4 standard visits, with a general trend of over-reporting. A systematic review of immunization data quality identified insufficient human resources and limited healthcare worker capacity for reporting and using data as key issues contributing to poor data quality [23]. The accuracy of reporting on these data elements may depend on the health care provider's knowledge of how to calculate gestational age in weeks and how to correctly schedule the ANC standard visits, as well as on the availability of tools, mainly pregnancy wheels, that facilitate these calculations. The study data collection team observed that data in registers were recorded in ways that were not compatible with HMIS reporting; for example, gestational age at first visit was recorded in the register in months but reported into the HMIS based on cutoffs in weeks, which may contribute to challenges in transferring information from the source documents into the HMIS. In addition, an analysis of ABC baseline data on the availability of essential medical equipment and supplies at facilities that received the ABC intervention revealed that only 44.9% (31/69) of health centers had a pregnancy wheel in the pre-intervention period. The low quality of HMIS data on these data elements may be partly explained by the findings of a recent study on the quality of ANC service provision in HFs conducted in 13 sub-Saharan African countries, including Rwanda [45]. Using data from Service Provision Assessments (SPA) and Service Availability and Readiness Assessments (SARA) surveys, that study found that the proportion of HFs providing ANC services with ANC-trained staff was less than half in 6 of the 13 countries, and that the median proportion of HFs with ANC guidelines was 62.3%. In Rwanda in particular, only 31.2% of 432 facilities providing ANC services had ANC guidelines and only 79.8% of these facilities had staff trained in ANC [45]. This may particularly affect the observed over-reporting on the number of women who completed four ANC standard visits, which also requires knowing the standard visit schedule for ANC to determine whether women completed the four visits at the correct time, rather than just reporting the number of women who came for four visits at any time.

Further, the over-reporting of key quality-of-care elements, such as iron/folic acid supplementation and syphilis testing in ANC, is concerning. In addition to the challenges of correctly reporting ANC coverage identified in this study, these data elements [46] are included to help understand the content and quality of the ANC visits that women receive. These are critical interventions for the prevention of stillbirths, genetic abnormalities, and poor neonatal outcomes [47,48]. Over-reporting masks an important problem with major implications for the health of women and newborns.

Study limitations

First, this study included only 14 Rwanda HMIS data elements related to maternal and newborn care and only HFs that were selected to receive the ABC scale-up intervention; this non-random sample might not be representative of all facilities in Rwanda. However, given that these facilities were located in different parts of the country, and that the results show similar variations in data quality by HMIS data element across all geographical locations, we are confident that the findings can help with understanding the level of agreement between HMIS data and facility source documents' records for the considered data elements. We also believe that these findings will be useful to future studies verifying more HMIS data elements or operating on a larger scale in Rwanda. Second, this study only assessed the quality of HMIS data by focusing on the concordance between HMIS and facility source documents data. The true accuracy of the source documents is not known and is a critical component of data quality that requires further evaluation.

Conclusions

Findings of this study suggest variation in HMIS data quality by data element and similar patterns of reporting accuracy irrespective of the geographical location of a health facility. Reporting to the HMIS was less accurate for some data elements, particularly those that are more complex to generate. This challenge to accurate reporting by HFs has implications for decision-making on key interventions affecting maternal and newborn outcomes. Ongoing regular data quality assessments, promoting the use of HMIS data for quality improvement in health care delivery at the facility level, and training to address gaps could help improve HMIS data for use in program evaluations.

Supporting information

S1 Dataset

(DTA)

Acknowledgments

We acknowledge the contributions of Robert M. Gatsinzi and Ibrahim Hakizimana from the monitoring and evaluation team, who collected data from health facility source documents. We are also grateful to the leadership, nurses and midwives of the health facilities involved in this study for their critical support in facilitating data collection.

Data Availability

All relevant data are within the manuscript and its Supporting Information files.

Funding Statement

The author(s) received no specific funding for this work. This study was implemented as part of the All Babies Count (ABC) intervention scale-up in Rwanda between 2017 and 2019, and data collection activities were funded by Grand Challenges Canada Saving Lives at Birth.

References

  • 1. MEASURE Evaluation. Using DHIS 2 to Strengthen Health Systems. Chapel Hill, NC: MEASURE Evaluation, University of North Carolina; 2017. Available: https://www.measureevaluation.org/resources/publications/fs-17-212
  • 2. University of Oslo. DHIS2 factsheet. 2018. Available: https://s3-eu-west-1.amazonaws.com/content.dhis2.org/general/dhis-factsheet.pdf
  • 3. AbouZahr C, Boerma T. Health information systems: the foundations of public health. Bulletin of the World Health Organization. 2005;83: 578–583.
  • 4. Nyamtema AS. Bridging the gaps in the Health Management Information System in the context of a changing health sector. BMC Medical Informatics & Decision Making. 2010;10.
  • 5. Wickremasinghe D, Hashmi IE, Schellenberg J, Avan BI. District decision-making for health in low-income settings: a systematic literature review. Health Policy and Planning. 2016;31. doi:10.1093/heapol/czv124
  • 6. Ahanhanzo YG, Ouedraogo LT, Kpozehouen A, Coppieters Y, Makoutode M, Wilmet-Dramaix M. Factors associated with data quality in the routine health information system of Benin. Archives of Public Health. 2014;72.
  • 7. Foster M, Bailey C, Brinkhof MW, Graber C, Boulle A, Spohr M, et al. Electronic medical record systems, data quality and loss to follow-up: survey of antiretroviral therapy programs in resource-limited settings. Bulletin of the World Health Organization. 2008;86: 939–947. doi:10.2471/blt.07.049908
  • 8. Garrib A, Stoops N, McKenzie A, Dlamini L, Govender T, Rohde J, et al. An evaluation of the District Health Information System in rural South Africa. South African Medical Journal. 2008;98: 549–552.
  • 9. Makombe SD, Hochgesang M, Jahn A, Tweya H, Hedt B, Chuka S, et al. Assessing the quality of data aggregated by antiretroviral treatment clinics in Malawi. Bulletin of the World Health Organization. 2008;86: 310–314. doi:10.2471/blt.07.044685
  • 10. Maokola W, Willey BA, Shirima K, Chemba M, Armstrong Schellenberg JRM, Mshinda H, et al. Enhancing the routine health information system in rural southern Tanzania: successes, challenges and lessons learned. Tropical Medicine and International Health. 2011;16: 721–730. doi:10.1111/j.1365-3156.2011.02751.x
  • 11. Mate KS, Bennett B, Mphatswe W, Barker P, Rollins N. Challenges for Routine Health System Data Management in a Large Public Programme to Prevent Mother-to-Child HIV Transmission in South Africa. PLoS ONE. 2009;4. doi:10.1371/journal.pone.0005483
  • 12. Mavimbe JC, Braa J, Bjune G. Assessing immunization data quality from routine reports in Mozambique. BMC Public Health. 2005;5. doi:10.1186/1471-2458-5-108
  • 13. Maïga A, Jiwani SS, Mutua MK, Porth TA, Taylor CM, Asiki G, et al. Generating statistics from health facility data: the state of routine health information systems in Eastern and Southern Africa. BMJ Global Health. 2019;4: e001849. doi:10.1136/bmjgh-2019-001849
  • 14. Mutale W, Chintu N, Amoroso C, Awoonor-Williams K, Phillips J, Baynes C, et al. Improving health information systems for decision making across five sub-Saharan African countries: Implementation strategies from the African Health Initiative. BMC Health Services Research. 2013;13. doi:10.1186/1472-6963-13-S2-S9
  • 15. Mphatswe W, Mate K, Bennett B, Ngidi H, Reddy J, Barker P, et al. Improving public health information: a data quality intervention in KwaZulu-Natal, South Africa. Bulletin of the World Health Organization. 2012;90: 176–182. doi:10.2471/BLT.11.092759
  • 16. World Health Organization. Data quality review: a toolkit for facility data quality assessment. Module 2. Desk review of data quality. 2017.
  • 17. World Health Organization. Data quality review: a toolkit for facility data quality assessment. Module 3. Data verification and system assessment. Geneva: World Health Organization; 2017.
  • 18. Ouedraogo M, Kurji J, Abebe L, Labonté R, Morankar S, Bedru KH, et al. A quality assessment of Health Management Information System (HMIS) data for maternal and child health in Jimma Zone, Ethiopia. Ginsberg SD, editor. PLOS ONE. 2019;14: e0213600. doi:10.1371/journal.pone.0213600
  • 19. Bosch-Capblanch X, Ronveaux O, Doyle V, Remedios V, Bchir A. Accuracy and quality of immunization information systems in forty-one low income countries. Tropical Medicine and International Health. 2009;14: 2–10. doi:10.1111/j.1365-3156.2008.02181.x
  • 20. Ronveaux O, Rickert D, Hadler S, Groom H, Lloyd J, Bchir A, et al. The immunization data quality audit: verifying the quality and consistency of immunization monitoring systems. Bulletin of the World Health Organization. 2005;83: 503–510.
  • 21. Cambodia. Assessment of health facility data quality: Data quality report card. Cambodia; 2012. Available: http://www.who.int/healthinfo/KH_DataQualityReportCard_2012.pdf
  • 22. O’Hagan R, Marx MA, Finnegan KE, Naphini P, Ng’ambi K, Laija K, et al. National Assessment of Data Quality and Associated Systems-Level Factors in Malawi. Global Health: Science and Practice. 2017;5.
  • 23. Wetherill O, Lee C, Dietz V. Root Causes of Poor Immunisation Data Quality and Proven Interventions: A Systematic Literature Review. 2017;2: 7.
  • 24. Venkateswaran M, Mørkrid K, Abu Khader K, Awwad T, Friberg IK, Ghanem B, et al. Comparing individual-level clinical data from antenatal records with routine health information systems indicators for antenatal care in the West Bank: A cross-sectional study. Agyepong I, editor. PLOS ONE. 2018;13: e0207813. doi:10.1371/journal.pone.0207813
  • 25. Republic of Rwanda. Data Quality Assessment Procedures Manual. Ministry of Health; 2016.
  • 26. Nisingizwe MP, Iyer HS, Gashayija M, Hirschhorn LR, Amoroso C, Wilson R, et al. Toward utilization of data for program management and evaluation: quality assessment of five years of health management information system data in Rwanda. Global Health Action. 2014;7. doi:10.3402/gha.v7.25829
  • 27. Mitsunaga T, Hedt-Gauthier BL, Ngizwenayo E, Bertrand Farmer D, Gaju E, Drobac P, et al. Data for Program Management: An Accuracy Assessment of Data Collected in Household Registers by Community Health Workers in Southern Kayonza, Rwanda. Journal of Community Health. 2015;40: 625–632. doi:10.1007/s10900-014-9977-9
  • 28. Karengera I, Onzima RAD, Katongole S-P, Govule P. Quality and Use of Routine Healthcare Data in Selected Districts of Eastern Province of Rwanda. International Journal of Public Health Research. 2016;4: 5–13.
  • 29. Brandrud AS, Schreiner A, Hjortdahl P, Helljesen GS, Nyen B, Nelson EC. Three success factors for continual improvement in healthcare: an analysis of the reports of improvement team members. BMJ Quality and Safety. 2011;20: 251–259. doi:10.1136/bmjqs.2009.038604
  • 30. Tunçalp Ö, Were W, MacLennan C, Oladapo O, Gülmezoglu A, Bahl R, et al. Quality of care for pregnant women and newborns-the WHO vision. BJOG. 2015. doi:10.1111/1471-0528.13451
  • 31. Iyer HS, Hirschhorn LR, Nisingizwe MP, Kamanzi E, Drobac PC, Rwabukwisi FC, et al. Impact of a district-wide health center strengthening intervention on healthcare utilization in rural Rwanda: Use of interrupted time series analysis. PLoS ONE. 2017;12. doi:10.1371/journal.pone.0182418
  • 32. National Institute of Statistics of Rwanda (NISR) [Rwanda], Ministry of Health (MOH) [Rwanda], ICF International. Rwanda Demographic and Health Survey 2014–15. Rockville, Maryland, USA: NISR, MOH and ICF International; 2015.
  • 33. Day LT, Ruysen H, Gordeev VS, Gore-Langton GR, Boggs D, Cousens S, et al. “Every Newborn-BIRTH” protocol: observational study validating indicators for coverage and quality of maternal and newborn health care in Bangladesh, Nepal and Tanzania. Journal of Global Health. 2019;9. doi:10.7189/jogh.09.010902
  • 34. Magge H, Chilengi R, Jackson EF, Wagenaar BH, Kante AM, AHI PHIT Partnership Collaborative. Tackling the hard problems: implementation experience and lessons learned in newborn health from African Health Initiative. BMC Health Services Research. 2017;17. doi:10.1186/s12913-017-2659-4
  • 35. World Health Organization. Standards for improving quality of maternal and newborn care in health facilities. Geneva: World Health Organization; 2016.
  • 36. Rwanda Ministry of Health. Fourth Health Sector Strategic Plan. Kigali, Rwanda: Republic of Rwanda; 2018.
  • 37. Rwanda Ministry of Health. Health Service Packages for Public Health Facilities. 2017.
  • 38. Rwanda Ministry of Health. Standard Operating Procedures for Management of Routine Health Information at Referral/Provincial and District Hospitals (Public and Privates). Rwanda Ministry of Health; 2019.
  • 39. Rwanda Ministry of Health. Standard Operating Procedures for Management of Routine Health Information at Health Centers/Posts/Private Health Facilities. Rwanda Ministry of Health; 2019.
  • 40. R Core Team. R: A Language and Environment for Statistical Computing. Vienna, Austria: R Foundation for Statistical Computing; 2017. Available: https://www.R-project.org/
  • 41. Ethiopian Public Health Institute, Federal Ministry of Health, World Health Organization. Ethiopia Health Data Quality Review: System Assessment and Data Verification for Selected Indicators. Addis Ababa, Ethiopia: Ethiopian Public Health Institute; 2016.
  • 42. Bhattacharya AA, Umar N, Audu A, Felix H, Allen E, Schellenberg JRM, et al. Quality of routine facility data for monitoring priority maternal and newborn indicators in DHIS2: A case study from Gombe State, Nigeria. Bazzano AN, editor. PLOS ONE. 2019;14: e0211265. doi:10.1371/journal.pone.0211265
  • 43. Nicol E, Dudley L, Bradshaw D. Assessing the quality of routine data for the prevention of mother-to-child transmission of HIV: An analytical observational study in two health districts with high HIV prevalence in South Africa. International Journal of Medical Informatics. 2016;95: 60–70. doi:10.1016/j.ijmedinf.2016.09.006
  • 44. Gergen J, Josephson E, Coe M, Ski S, Madhavan S, Bauhoff S. Quality of Care in Performance-Based Financing: How It Is Incorporated in 32 Programs Across 28 Countries. Global Health: Science and Practice. 2017;5: 90–107. doi:10.9745/GHSP-D-16-00239
  • 45. Kanyangarara M, Munos MK, Walker N. Quality of antenatal care service provision in health facilities across sub-Saharan Africa: Evidence from nationally representative health facility assessments. Journal of Global Health. 2017;7. doi:10.7189/jogh.07.021101
  • 46. Hodgins S, D’Agostino A. The quality-coverage gap in antenatal care: toward better measurement of effective coverage. Global Health: Science and Practice. 2014;2. doi:10.9745/GHSP-D-13-00176
  • 47. Newman L, Kamb M, Hawkes S, Gomez G, Say L, Seuc A, et al. Global Estimates of Syphilis in pregnancy and Associated Adverse Outcomes: Analysis of Multinational Antenatal Surveillance Data. PLoS Medicine. 2013;10. doi:10.1371/journal.pmed.1001396
  • 48. Black RE, Victora CG, Walker SP, Bhutta ZA, Christian P, de Onis M, et al. Maternal and child undernutrition and overweight in low-income and middle-income countries. Lancet. 2013; 427–51.

Decision Letter 0

Rakhi Dandona

3 Feb 2020

PONE-D-19-35132

Rwanda Health Management Information System (HMIS) data verification: A case of seventy-six health facilities in four districts of Rwanda

PLOS ONE

Dear Mr. Nshimyiryo,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.  Please address the methodology and analysis related concerns raised by the two reviewers in much detail.

We would appreciate receiving your revised manuscript by Mar 19 2020 11:59PM. When you are ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter.

To enhance the reproducibility of your results, we recommend that if applicable you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). This letter should be uploaded as separate file and labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. This file should be uploaded as separate file and labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. This file should be uploaded as separate file and labeled 'Manuscript'.

Please note while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out.

We look forward to receiving your revised manuscript.

Kind regards,

Rakhi Dandona

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

http://www.journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and http://www.journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

Additional Editor Comments (if provided):

1. The methods section needs more detail as highlighted by both the reviewers. Please also provide information on the level and extent of completeness and missingness of the HMIS variables and the data sources.

2. Please provide details of how the verification factor was calculated at the HCA level. It refers to the “hospital catchment area”, but no details are provided about from where the catchment data were sought.

3. Internal consistency between the HMIS variables and data sources can be assessed, and it will add value to this paper.

4. Overall, the study shows reasonable quality of HMIS other than for ANC, which seems a bit unexpected as the assumption was for this to be of poor quality. Discussion does not highlight this positive aspect of the study findings, and focusses mainly on the two ANC variables which were found of a lesser quality.

5. It will also be useful for the authors to comment on the extent of identifying unique pregnant women across the continuum of care within the current HMIS.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly

Reviewer #2: Partly

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Overall comments:

This study presents an analysis of the routine data quality dimension of internal consistency for a set of maternal and newborn care data elements, using the WHO verification factor metric to demonstrate the accuracy of facility reporting. This paper is a helpful contribution to the literature for understanding an important aspect of the quality of routine data in HMIS to monitor maternal and neonatal care and extends the evidence of previous studies on completeness and internal consistency for Rwanda. I hope the authors will find the comments helpful as they revise the paper.

Major comments:

Introduction: Overall comment that the rationale and literature review could be strengthened.

Introduction, paragraph 2: Aside from introducing the four data quality dimensions of the WHO data quality review toolkit for facility data, the purpose of the remainder of the paragraph is unclear:

• There is an assertion that the WHO guidelines, including the VF metric, are limited in implementation given that the guidelines have been out for over a decade. Then the next sentence noted 5 studies which used the VF. In the literature, there are more than the five studies cited that used the WHO VF by name, and even more studies that use a percentage or ratio to relate the counts of data in facility records to the reported facility data. Accuracy of reporting along with completeness are the most analysed dimensions of data quality (please refer to systematic reviews on health data quality for both high and low-middle income countries).

• In this paragraph and the next paragraph, there is an emphasis on the WHO verification factor metric definition, 0.90 < VF < 1.10.

Introduction, paragraph 2, first sentence “The World Health Organization…coverage rates”: Please note that the four dimensions named here are from the ‘WHO data quality health facility data quality report card’ and not from the ‘WHO data quality review: a toolkit for facility data quality assessment’ which frames the dimensions slightly differently. Please update the dimensions according to your preferred reference.

Methods: Please include a brief description for why these indicators were selected (a sentence or two).

Methods: As this metric assesses the accuracy of facility reporting, please describe how the data is captured, summarized, reported, and subsequently entered into DHIS 2.

Results, paragraph 1, first sentence “The proportion of HFs … iron/folic acid (87%; 60/69)”: This sentence provides important information on completeness of data that is difficult to readily calculate from Table 2. It puts the VF results in context. Consider including a column in Table 2, after the data element, that notes the completeness of the data as a proportion of the facilities that are providing that service.

Results, figure 1: Excellent figure. Consider adding a legend which reminds the reader which directions represent under/overreporting.

Table 4, “rare data elements”: I wouldn’t call these rare data elements as they are regularly reported into HMIS. Perhaps “rare events” or “rare outcomes”?

Discussion: Overall comment that the literature review should be updated to reflect where the current study fits in.

Discussion, paragraph 2, 3rd sentence “These quality of Rwanda HMIS data… same data in other Sub-Sahara African countries”: The sentence notes “countries” but references only one study in one country.

Discussion, paragraph 2, 6th sentence “This is a different finding to that found by other studies…by geographical location of reporting HFs”: Again, the sentence notes “other studies” but references only one study.

Discussion, paragraph 3: Please reflect more on the poor level of agreement, as there are notable directions in the level of agreements based on data element which are not elaborated – which indicators under/overreport and potential reasons.

Discussion, paragraph 3, 2nd sentence “The accuracy of reporting on these data elements might be dependent on the health workers knowledge of how to calculate the pregnancy’s gestational age in weeks and how to correctly schedule the ANC standard visits, as well as the availability of tools, mainly pregnancy wheels, that facilitate these calculations.” While this may be true in terms of the validity of the documentation, the data verification exercise assesses the ability of the health care worker to report as expected based on the source documents. The external research team, whose data collection is being used as the reference standard for the recounted data, does not also measure the women for gestational age, schedule the appointments, etc. They are looking at the same data source, as the health facility staff would, for determining which numbers would be counted as ANC1_standard versus ANC new registrants.

Reviewer #2: This is a relevant paper, as it addresses a crucial aspect of Health Management Information System (HMIS) data, namely data quality assessment. HMIS are a promising data source, but data quality remains challenging in Sub-Saharan African countries, including Rwanda. Unlike most studies, which focus on desk review of routine health information system data, this study verifies facility source documents and the level of agreement with national records.

However, it is a pity that the study is limited to only 76 facilities within 4 districts that are non-representative of the entire country. In this regard, it may be valuable to provide a brief description of the Rwandan health system and health information system, including the number of health facilities and districts in the country, the public and private sectors, and the HMIS data collection and processing, as these may impact the level of agreement.

Suggestion to revise the title, which is a bit long, and to make it more attractive (e.g. "Health Management Information System (HMIS) data verification: A case study in four districts in Rwanda").

In the abstract, it would be good to clearly present the objective of the study, as it is not obvious, and to improve the results and conclusion sections.

My main comments are related to the objective of the study, analysis performed and the discussions of the results. I agree that the verification of level of agreement between HMIS data and facility source documents data is a relevant objective. However, I do think that you may be able to go beyond this objective and tackle additional analysis. You may for instance carry out additional analysis like completeness of reporting and internal consistency for both HMIS data and facility source documents in order to highlight the impact of the lack of agreement between both sources on data quality. I wonder whether that is possible based on available data or at this stage, but it may be worthwhile to assess the impact of data accuracy (level of agreement) on data quality (e.g. completeness of reporting or internal consistency). A question can be to know whether districts with good level of agreement report data with better quality. Since you stated that (p.18) "this verification showed high level of agreement between data reported to HMIS and records in facility source documents for the number of ANC1, deliveries and live births", it may be interesting to assess data quality in districts with good agreement versus districts with low agreement.

To assess whether the source of data matter, it may be interesting to provide, even as an appendix, a table presenting the verification factor by source of data (ANC register, maternity register, PNC register, NCU register), and address a bit that in the analysis.

The discussion section needs to be deeply revisited to better reflect the expectations of a classical discussion section: discuss more the discrepancies between the two sources, relevant factors (e.g. unmotivated, poorly trained or overworked health personnel, disinterest in health data, problems of equipment, potential issues in the transfer of data, data entry errors, etc.), and the implications of the results.

As an explanation, you stated (p. 18-19): "However, there was poor quality of HMIS data on the number of ANC1 and ANC4 standard visits. The accuracy of reporting on these data elements might be dependent on the health care provider’s knowledge of how to calculate the pregnancy’s gestational age in weeks and how to correctly schedule the ANC standard visits, as well as the availability of tools, mainly pregnancy wheels, that facilitate these calculations". Through this explanation, you are trying to address reasons related to the true accuracy of the source documents, instead of providing relevant reasons explaining the lack of agreement between HMIS data and facility source documents. Suggestion to provide relevant reasons/explanations in line with the findings.

Row 233: Check the median VF for syphilis, as it does not match the data in Table 2.

Table 3: Write the labels of the data elements properly.


**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files to be viewed.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email us at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2020 Jul 17;15(7):e0235823. doi: 10.1371/journal.pone.0235823.r002

Author response to Decision Letter 0


19 Mar 2020

Dear Academic Editor of PLOS ONE,

Thank you for the opportunity to revise our manuscript to meet PLOS ONE’s publication criteria. We have addressed each reviewer’s comments and the points raised by the Editor in this revised version of the manuscript.

Editor’s comments:

1)The methods section needs more detail as highlighted by both the reviewers. Please also provide information on the level and extent of completeness and missingness of the HMIS variables and the data sources.

Thank you. We have now added two columns to Table 2 to report the proportion of health facilities that provide each service and had completed source documents’ data and HMIS data for the reporting period. We also added a note indicating that all reporting health facilities had complete HMIS data.

2) Please provide details of how the verification factor was calculated at the HCA level. It refers to the “hospital catchment area”, but no details are provided about from where the catchment data were sought.

We have now provided more details on the process of calculating a verification factor at the HCA level, where for each data element, all non-missing values were summed for all the health facilities under that HCA during the study period. See page 11, lines 211-217, which read as: “At the HCA level, a VF was calculated by summing all the non-missing values for each data element and all the reporting HFs under that HCA during the study period. Then, a HCA-level VF was calculated as a ratio of the aggregated recounted data to HMIS data. In addition, a VF for data elements with rare events was only calculated at the HCA level, where aggregated data were compared to avoid denominators with true zero values that would be expected if these data were compared at the HF level”.
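For illustration, the HCA-level aggregation described above can be sketched as follows. This is a minimal sketch, not the study's actual implementation (the analysis used R): the data layout, function name, and example numbers are all hypothetical; the VF ratio and the 0.90/1.10 thresholds follow the WHO definition used in the paper.

```python
from collections import defaultdict

def hca_verification_factors(rows):
    """Aggregate facility rows into HCA-level verification factors (VFs).

    Each row is a dict with keys 'hca', 'element', 'recounted' (register
    recount) and 'hmis' (reported value). Rows with a missing value are
    dropped, mirroring the non-missing-values rule described above.
    VF = recounted / HMIS; VF < 0.90 flags over-reporting in HMIS and
    VF > 1.10 flags under-reporting.
    """
    totals = defaultdict(lambda: [0, 0])  # (hca, element) -> [recounted, hmis]
    for r in rows:
        if r["recounted"] is None or r["hmis"] is None:
            continue  # exclude facilities with a missing value
        totals[(r["hca"], r["element"])][0] += r["recounted"]
        totals[(r["hca"], r["element"])][1] += r["hmis"]

    results = {}
    for key, (recounted, hmis) in totals.items():
        vf = recounted / hmis if hmis else None
        if vf is None:
            status = "undefined"
        elif vf < 0.90:
            status = "over-reporting"
        elif vf > 1.10:
            status = "under-reporting"
        else:
            status = "acceptable"
        results[key] = (vf, status)
    return results

# Hypothetical example: two facilities in one HCA reporting deliveries,
# plus one facility with a missing register value that is excluded.
rows = [
    {"hca": "HCA-1", "element": "deliveries", "recounted": 100, "hmis": 120},
    {"hca": "HCA-1", "element": "deliveries", "recounted": 80, "hmis": 100},
    {"hca": "HCA-1", "element": "deliveries", "recounted": None, "hmis": 50},
]
vf, status = hca_verification_factors(rows)[("HCA-1", "deliveries")]
# VF = 180/220, below 0.90, so the HCA is flagged as over-reporting in HMIS
```

Aggregating before dividing (rather than averaging facility-level VFs) matches the description above and keeps small facilities from dominating the ratio.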

3) Internal consistency between the HMIS variables and data sources can be assessed, and it will add value to this paper.

Thank you for this comment. We might have assessed internal consistency using the methods described in the World Health Organization (WHO) data quality assessment guidelines; however, the short study period (3 months) was a limitation to doing so. The WHO guidelines on HMIS data quality assessment (DQA) recommend a minimum of 12 months of data in order to assess the internal consistency between related indicators. The number of women with an ANC4 standard visit would be expected to be a subset of those who had an ANC1 standard visit in the past, and the ratio between them to be 1 or below; however, ANC4 is a cumulative variable, as women who completed the four standard visits during the study period had their first standard ANC visit up to six months earlier. Therefore, as our study period was just 3 months, we might falsely claim inconsistency or consistency between the ANC1 standard and ANC4 standard data elements. Using the 3 months of data, the level of consistency between ANC1 standard and ANC4 standard was quite similar for both HMIS and recounted data from source documents. We have shared this here, but not included it in the paper due to the limitations noted.

| Hospital catchment area (HCA, n*) | Health facilities with internal consistency between indicators (ANC4/ANC1 ≤ 1) for recounted data from source documents, n (%) | Health facilities with internal consistency between indicators (ANC4/ANC1 ≤ 1) for HMIS data, n (%) |
|---|---|---|
| Nemba DH (n=13) | 11 (84.6) | 8 (61.5) |
| Ruli DH (n=9) | 5 (55.6) | 7 (77.8) |
| Kinihira PH (n=8) | 5 (62.5) | 5 (62.5) |
| Rutongo DH (n=13) | 12 (92.3) | 12 (92.3) |
| Gakoma DH (n=5) | 4 (80.0) | 4 (80.0) |
| Kibilizi DH (n=9) | 4 (44.4) | 5 (55.6) |
| Mibilizi DH (n=11) | 8 (72.7) | 7 (63.6) |
| Overall (n=68) | 49 (72.1) | 48 (70.6) |

Note: n* is the number of reporting health facilities in each hospital catchment area (HCA).
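The consistency check summarized above — flagging a facility as internally consistent when its ANC4/ANC1 ratio is at most 1 — can be sketched as follows (the counts below are hypothetical, not study data):

```python
# Sketch of the internal-consistency check used in the table above:
# a facility is consistent when reported ANC4 visits do not exceed
# ANC1 visits (ratio ANC4/ANC1 <= 1). Counts are hypothetical.

def consistent(anc1, anc4):
    # A true zero ANC1 count with nonzero ANC4 is inconsistent by definition.
    if anc1 == 0:
        return anc4 == 0
    return anc4 / anc1 <= 1

def summarize(facilities):
    """facilities: list of (anc1, anc4) counts; returns (n_consistent, pct)."""
    n = sum(1 for a1, a4 in facilities if consistent(a1, a4))
    return n, round(100 * n / len(facilities), 1)

# Hypothetical HCA with 5 reporting facilities
data = [(40, 30), (25, 28), (18, 18), (0, 0), (33, 12)]
print(summarize(data))  # (4, 80.0)
```

As the response notes, with only 3 months of data this ratio can legitimately exceed 1, so a flagged facility is not necessarily reporting incorrectly.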

4) Overall, the study shows reasonable quality of HMIS other than for ANC, which seems a bit unexpected as the assumption was for this to be of poor quality. Discussion does not highlight this positive aspect of the study findings, and focusses mainly on the two ANC variables which were found of a lesser quality.

→ Thank you for this comment. We have now highlighted the high quality of the Rwanda HMIS data for some data elements related to maternal and newborn health by comparing our findings to what was found by other HMIS data verification studies. This now reads as: “Notably, this verification showed high level of agreement between data reported to HMIS and records in facility source documents for the number of ANC1, deliveries and live births. These data elements are among the WHO recommended core indicators for DQR in HMIS data on maternal health17. The quality of Rwanda HMIS data on these data elements is higher than what was found in HMIS data verifications for the same data in Ethiopia40 and Nigeria41 and similar for the ANC1 indicator in Malawi22.” (see page 21, lines 341-346).

5) It will also be useful for the authors to comment on the extent of identifying unique pregnant women across the continuum of care within the current HMIS.

→ Thank you. We have added to the “Methods section/Study setting” the information on the extent of identifying unique pregnant women across the continuum of care within the Rwanda HMIS (see page 9, lines 172-175). The text reads as follows: “Data are recorded using standardized registers developed by the MoH and provided to all HFs (see Table 1 for data sources); women attending ANC are recorded at their first ANC visit and provided an ANC card and an ANC number that facilitates continuity of data recording at the individual level for ANC.”

Reviewer #1’s comments:

Overall comments: This study presents an analysis of the routine data quality dimension of internal consistency for a set of maternal and newborn care data elements, using the WHO verification factor metric to demonstrate the accuracy of facility reporting. This paper is a helpful contribution to the literature for understanding an important aspect of the quality of routine data in HMIS to monitor maternal and neonatal care and extends the evidence of previous studies on completeness and internal consistency for Rwanda. I hope the authors will find the comments helpful as they revise the paper.

→ Thank you.

Major comments:

1) Introduction: Overall comment that the rationale and literature review could be strengthened.

• Introduction, paragraph 2: Aside from introducing the four data quality dimensions of the WHO data quality review toolkit for facility data, the purpose of the remainder of the paragraph is unclear:

o There is an assertion that the WHO guidelines, including the VF metric, are limited in implementation given that the guidelines have been out for over a decade. Then the next sentence noted 5 studies which used the VF. In the literature, there are more than the five studies cited that used the WHO VF by name, and even more studies that use a percentage or ratio to relate the counts of data in facility records to the reported facility data. Accuracy of reporting along with completeness are the most analysed dimensions of data quality (please refer to systematic reviews on health data quality for both high and low-middle income countries). In this paragraph and the next paragraph, there is an emphasis on the WHO verification factor metric definition, 0.90

→ Thank you very much for this comment. We have revised the paragraph on pages 4-5 (lines 85-97), which now reads as follows: “The WHO toolkit for data quality review defines a verification factor (VF) as the ratio of recounted data from facility source documents over HMIS data17. The application of the VF has been more limited and using variable definitions. However, there is evidence that the level of agreement between HMIS data and records in facility source documents can vary depending on the type of data being collected and be rooted in early stages of collecting those data from facility source documents11,19–24. Over- or under-reporting in HMIS data can result from human errors that occur when counting events from source documents or simply not including all events for the reporting period – by omitting some of the necessary source documents or not covering the entire reporting period11,24. In addition, intentional over- or under-reporting in HMIS data at the facility level can be motivated by the pressure to meet national targets, whereas, inaccuracies in transferring data from facility source documents to the electronic database can also be associated with excessive workload of staff combined with the pressure to meet reporting deadlines23”. We also included a systematic review paper on HMIS data verification for additional notes on barriers to data quality (Wetherill et al., 2017).

• Introduction, paragraph 2, first sentence “The World Health Organization…coverage rates”: Please note that the four dimensions named here are from the ‘WHO data quality health facility data quality report card’ and not from the ‘WHO data quality review: a toolkit for facility data quality assessment’ which frames the dimensions slightly differently. Please update the dimensions according to your preferred reference.

→ Thank you. We have now updated the four dimensions for HMIS data quality review through a desk review according to the 2017 World Health Organization guidelines on HMIS data quality review (see page 4, lines 80-83). This reads as: “The desk review assesses HMIS data quality in four dimensions: 1) completeness and timeliness of data, 2) internal consistency of reported data, 3) external consistency; and 4) external comparisons of population data”.

2. Methods:

• Please include a brief description for why these indicators were selected (a sentence or two).

→ Thank you. We have clarified in the introduction the reason why this study included 14 HMIS data elements related to maternal and newborn health care. It reads as follows: “We calculate VFs for fourteen HMIS data elements that were identified jointly by PIH/IMB and MoH as priority indicators for quality improvement in maternal and newborn health care for 76 HFs (7 hospitals and 69 health centers) that received the ABC intervention between 2017 and 2019 in four districts of Rwanda. The criteria for indicator selection included having clinical relevance to neonatal survival, government priority indicators, and/or being indicators within the World Health Organization standards for improving quality in maternal and newborn care in health facilities” (see page 6, lines 129-135).

• As this metric assesses the accuracy of facility reporting, please describe how the data is captured, summarized, reported, and subsequently entered into DHIS 2.

→ Thank you. We have now described the sources of data in detail under the methods section (see page 10, lines 180-190), and it reads as follows: “The HMIS data collection starts from the reporting facility, with clinical staff in each care service registering patients/clients and the care provided to them in standardized registers and/or medical files37,38. Then, for monthly reporting to HMIS, the facility data manager ensures the distribution of paper HMIS reporting forms to heads of services by the 25th of each month. The head of service collects those data that are relevant to their specific service and submits a completed HMIS report for the previous month to the facility data manager by the 3rd day of the month following the month of reporting. For timely reporting, the facility data manager should upload all facility data into DHIS 2 by the 5th day of every month. Data verification by facility team and corrections in the system are only allowed between the 5th and 15th of each month. Any request for changes on the data in the system beyond the 15th of each month should be submitted to the central MoH, and access is only granted upon strong justification of the request”.

3. Results:

• Results, paragraph 1, first sentence “The proportion of HFs … iron/folic acid (87%; 60/69)”: This sentence provides important information on completeness of data that is difficult to readily calculate from Table 2. It puts the VF results in context. Consider including a column in Table 2, after the data element, that notes the completeness of the data as a proportion of the facilities that are providing that service.

→ Thank you for this comment. We have now added two columns to Table 2 to report the proportions of health facilities that provide each service and had complete source document data and HMIS data for the reporting period.

• Results, figure 1: Excellent figure. Consider adding a legend which reminds the reader which directions represent under/overreporting.

→ We have now added to Figure 1 a legend for over-reporting and under-reporting, as well as labels for data elements (see Fig 1).

• Table 4, “rare data elements”: I wouldn’t call these rare data elements as they are regularly reported into HMIS. Perhaps “rare events” or “rare outcomes”?

→ We have changed the name to “Data elements with rare events” (see Tables 1 and 4).

4) Discussion: Overall comment that the literature review should be updated to reflect where the current study fits in.

• Discussion, paragraph 2, 3rd sentence “These quality of Rwanda HMIS data… same data in other Sub-Sahara African countries”: The sentence notes “countries” but references only one study in one country.

→ Thank you for your comment. We have now rephrased the sentence and cited 3 studies in Ethiopia, Nigeria and Malawi. The text reads as follows: “The quality of Rwanda HMIS data on these data elements is higher than what was found in HMIS data verifications for the same data in Ethiopia40 and Nigeria41 and similar for the ANC1 indicator in Malawi22” (see page 21, lines 344-346).

• Discussion, paragraph 2, 6th sentence “This is a different finding to that found by other studies…by geographical location of reporting HFs”: Again, the sentence notes “other studies” but references only one study.

→ We have now cited two studies in South Africa and Ethiopia (see page 21, line 351).

• Discussion, paragraph 3: Please reflect more on the poor level of agreement, as there are notable directions in the level of agreements based on data element which are not elaborated – which indicators under/overreport and potential reasons. Discussion, paragraph 3, 2nd sentence “The accuracy of reporting on these data elements might be dependent on the health workers knowledge of how to calculate the pregnancy’s gestational age in weeks and how to correctly schedule the ANC standard visits, as well as the availability of tools, mainly pregnancy wheels, that facilitate these calculations.” While this may be true in terms of the validity of the documentation, the data verification exercise assesses the ability of the health care worker to report as expected based on the source documents. The external research team, whose data collection is being used as the reference standard for the recounted data, does not also measure the women for gestational age, schedule the appointments, etc. They are looking at the same data source, as the health facility staff would, for determining which numbers would be counted as ANC1_standard versus ANC new registrants.

→ Thank you so much for your comments. We have now clarified the link between the quality of HMIS data on these data elements and the knowledge of calculating pregnancy’s gestational age and availability of tools like pregnancy wheels that facilitate the calculations. The added text reads as follows: “It was observed by the study data collection team, that data in registers were recorded in ways that were not compatible with HMIS reporting, for example – gestational age at first visit recorded in the register in months and then reported into HMIS based on cutoffs in weeks which may contribute to challenges in reporting incompatible information out of the source documents into HMIS. In addition, an analysis of the ABC baseline data on the availability of essential medical equipment and supplies at facilities that received the ABC intervention, revealed that only 44.9% (31/69) of health centers were having a pregnancy wheel in the pre-intervention period” (see page 22, lines 362-369). We also highlighted in the “Methods/recounted data from facility source documents” that the recording of gestational age in units that were not compatible with the HMIS reporting on ANC1 and ANC4 standard visits data elements required the data collection team to recalculate the gestational age at each ANC visit. The text reads as follows: “In particular, due to observed variable unit of recording of gestational age (GA) in weeks or months in the ANC register by facility and care provider, ABC data collectors worked with midwives or nurses responsible for providing ANC to standardize the calculation of GA in weeks before recounting data on ANC1 and ANC4 standard visits, as the reporting to HMIS on these data elements is based on GA calculated in weeks. The data collection team used a pregnancy wheel and the data recorded on date of last menstrual period and dates of ANC visits for individual pregnant women who attended ANC to determine GA at each visit” (see page 10-11, lines 197-204).

Reviewer #2’s comments:

This is a relevant paper as it addresses a crucial aspect of Health Management Information System (HMIS) data, namely data quality assessment. HMIS are a promising data source, but data quality remains challenging in Sub-Saharan African countries including Rwanda. Unlike most studies focusing on desk review of routine health information system data, this study verifies facility source documents and the level of agreement with national records.

→ Thank you.

1) However, it is a pity that the study is limited to only 76 health facilities within 4 districts that are non-representative of the entire Rwanda. In this regard, it may be valuable to provide a brief context description of the Rwanda health system and health information system, including the number of health facilities and districts in the country, the public and private sector, and the HMIS data collection process and data processing, as these may impact the level of agreement.

→ Thank you. We have now added information on the number of districts and health facilities in Rwanda (see page 9, lines 157-159): “These MoH-operated facilities were included because they received the ABC intervention, and represented 14% (69/499) of health centers and 15% (7/48) of the hospitals in all 30 districts in Rwanda35”. We have also added detailed information on the HMIS data collection process under the methods section (see page 10, lines 180-190), which was also raised as a comment by Reviewer 1.

2) Suggestion to revise the title that is a bit long and to make it more attractive (e.g. "Health Management Information System (HMIS) data verification: A case study in four districts in Rwanda"

→ The title now reads as follows: “Health management information system (HMIS) data verification: a case study in four districts in Rwanda”.

3) In the abstract, good to clearly present the objective of the study, as this is not obvious, as well as improving the results and conclusion sections.

→ Thank you. We have now clearly stated the objective of our study in the introductory paragraph of the abstract, which reads as follows: “We aimed to review the quality of Rwandan HMIS data for maternal and newborn health (MNH) based on consistency of HMIS reports with facility source documents” (see page 2, lines 26-28).

We have also revised the abstract’s conclusion, which now reads as follows: “There was variable HMIS data quality by data element, with some indicators with high quality and also consistency in reporting trends across districts. Over-reporting was observed for ANC-related data requiring more complex calculations, i.e., knowledge of gestational age, scheduling to determine ANC standard visits, as well as quality indicators in ANC. Ongoing data quality assessments and training to address gaps could help improve HMIS data quality” (see page 3, lines 46-50).

4) My main comments are related to the objective of the study, analysis performed and the discussions of the results.

• I agree that the verification of level of agreement between HMIS data and facility source documents data is a relevant objective. However, I do think that you may be able to go beyond this objective and tackle additional analysis. You may for instance carry out additional analysis like completeness of reporting and internal consistency for both HMIS data and facility source documents in order to highlight the impact of the lack of agreement between both sources on data quality. I wonder whether that is possible based on available data or at this stage, but it may be worthwhile to assess the impact of data accuracy (level of agreement) on data quality (e.g. completeness of reporting or internal consistency). A question can be to know whether districts with good level of agreement report data with better quality. Since you stated that (p.18) "this verification showed high level of agreement between data reported to HMIS and records in facility source documents for the number of ANC1, deliveries and live births", it may be interesting to assess data quality in districts with good agreement versus districts with low agreement.

→ Thank you. It would have been valuable to assess the effect of data completeness and consistency between related indicators on the level of agreement between HMIS data and records in facility source documents; however, this is not possible for this study, as our HMIS data verification analysis only included facilities with complete HMIS and source document data for the assessed data elements and the study period. In addition, as explained earlier in this reply letter, we are also limited in assessing consistency between related data elements, because a consistency assessment based on just a three-month reporting period could lead to wrong conclusions.

• To assess whether the source of data matter, it may be interesting to provide, even as an appendix, a table presenting the verification factor by source of data (ANC register, maternity register, PNC register, NCU register), and address a bit that in the analysis.

→ Thank you. We agree that the format of a source document (register) itself matters in reporting. However, we were not able to calculate a verification factor by source document, as a single register was the source for multiple data elements that we assessed, and it was not possible to aggregate them in a way that would allow this. With the calculation of a verification factor by data element, we hope this was indirectly captured. We have referenced in Table 1 the data source along with the description of each data element extracted. We have described the data source for each data element (see page 7, lines 142-147) and added a note in the methods (page 9, lines 172-173), when discussing the HMIS data sources, to refer the reader to this table as follows: “Data are recorded using standardized registers developed by the MoH and provided to all HFs (see Table 1 for data sources)”.

5) Discussion:

• The discussion section needs to be deeply revisited to reflect more the expectations from a classical discussion section, discuss more the discrepancies between both sources, relevant factors (e.g. unmotivated, poor trained or overworked health personnel, disinterest for health data, etc., problem of equipment, potential issues regarding the transfer of data, data entry errors, etc.), implications of the results.

→ Thank you for your comment. We have included additional context in the discussion to better contextualize the results. We have added additional literature review that highlights some of the strengths of the data and compares our results to what has been seen in other countries, as raised by reviewer 1. In addition, we added a reference to a systematic review on data quality of immunization data, which identified key factors contributing to poor data quality. This revised paragraph on page 22, lines 356-359, now reads: “However, there was poor quality of HMIS data on the number of ANC1 and ANC4 standard visits with a general trend of over-reporting. A systematic review of immunization data quality identified insufficient human resources and limited healthcare worker capacity for reporting and using data as key issues that contribute to poor data quality23.”

• As an explication you stated (p. 18-19) "However, there was poor quality of HMIS data on the number of ANC1 and ANC4 standard visits. The accuracy of reporting on these data elements might be dependent on the health care provider’s knowledge of how to calculate the pregnancy’s gestational age in weeks and how to correctly schedule the ANC standard visits, as well as the availability of tools, mainly pregnancy wheels, that facilitate these calculations". Through this explanation, you are trying to address the reasons related to the true accuracy of the source documents, instead of providing relevant reasons explaining the lack of agreement between HMIS data and facility source documents. Suggestions to provide relevant reasons/explanations and in line with findings.

→ Thank you for your comment. We have now clarified the link between the quality of HMIS data on these data elements and the knowledge of calculating pregnancy’s gestational age and availability of tools like pregnancy wheels that facilitate the calculations. The added text reads as follows: “It was observed by the study data collection team, that data in registers were recorded in ways that were not compatible with HMIS reporting, for example – gestational age at first visit recorded in the register in months and then reported into HMIS based on cutoffs in weeks which may contribute to challenges in reporting incompatible information out of the source documents into HMIS. In addition, an analysis of the ABC baseline data on the availability of essential medical equipment and supplies at facilities that received the ABC intervention, revealed that only 44.9% (31/69) of health centers were having a pregnancy wheel in the pre-intervention period” (see page 22, lines 362-369). We also highlighted in the “Methods/recounted data from facility source documents” that the recording of gestational age in units that were not compatible with the HMIS reporting on ANC1 and ANC4 standard visits data elements required the data collection team to recalculate the gestational age at each ANC visit. The text reads as follows: “In particular, due to observed variable unit of recording of gestational age (GA) in weeks or months in the ANC register by facility and care provider, ABC data collectors worked with midwives or nurses responsible for providing ANC to standardize the calculation of GA in weeks before recounting data on ANC1 and ANC4 standard visits, as the reporting to HMIS on these data elements is based on GA calculated in weeks. The data collection team used a pregnancy wheel and the data recorded on date of last menstrual period and dates of ANC visits for individual pregnant women who attended ANC to determine GA at each visit” (see page 10-11, lines 197-204).

6) Row 233: Check the median VF for syphilis as it does not match the data in Table 2.

→ Thank you. We have now corrected the numbers for the median and interquartile range in the text (see page 16, lines 274-275).

7) Table 3: Write properly the label of data elements.

→ Thank you. We have now updated labels of data elements in all tables to avoid abbreviations (see Tables 1, 2, 3 and 4).

Attachment

Submitted filename: Response to Reviewers.docx

Decision Letter 1

Dejing Dou

24 Jun 2020

Health management information system (HMIS) data verification: a case study in four districts in Rwanda

PONE-D-19-35132R1

Dear Dr. Nshimyiryo,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Dejing Dou, Ph.D.

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #2: All comments have been addressed

Reviewer #3: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #2: Yes

Reviewer #3: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #2: Yes

Reviewer #3: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #2: Yes

Reviewer #3: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #2: Yes

Reviewer #3: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #2: (No Response)

Reviewer #3: This paper addresses the data quality assessment problem of HMIS by studying the level of agreement between HMIS data and records in facility source documents. The revised version addresses the comments and is well written.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #2: Yes: Abdoulaye Maïga, PhD

Reviewer #3: No

Acceptance letter

Dejing Dou

6 Jul 2020

PONE-D-19-35132R1

Health management information system (HMIS) data verification: a case study in four districts in Rwanda

Dear Dr. Nshimyiryo:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Professor Dejing Dou

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 Dataset

    (DTA)

    Attachment

    Submitted filename: Response to Reviewers.docx

    Data Availability Statement

    All relevant data are within the manuscript and its Supporting Information files.

