Abstract
Objectives: Routine health information systems (RHISs) provide data that are vital for planning and monitoring individual health. Data from RHISs could also be used for purposes for which they were not originally intended, provided that the data are of sufficient quality. For example, morbidity data could be used to inform burden of disease estimations, which serve as important evidence to prioritize interventions and promote health. The objective of this study was to identify and assess published quantitative assessments of data quality related to patient morbidity in RHISs in use in South Africa.
Materials and Methods: We conducted a review of literature published between 1994 and 2014 that assessed the quality of data in RHISs in South Africa. World Health Organization (WHO) data quality components were used as the assessment criteria.
Results: Of 420 references identified, 11 studies met the inclusion criteria. The studies were limited to tuberculosis and HIV. No study reported more than 3 WHO data quality components or provided a quantitative assessment of quality that could be used for burden of disease estimation.
Discussion: The included studies had limited geographical focus and evaluated different source data at different levels of the information system. All studies reported poor data quality.
Conclusion: This review confirmed concerns about the quality of data in RHISs, and highlighted the need for a comprehensive evaluation of the quality of patient-level morbidity data in RHISs in South Africa.
Keywords: routine health information system, data quality, surveillance, morbidity, South Africa
BACKGROUND AND SIGNIFICANCE
South Africa is an upper-middle-income developing country1 with a fragmented health care system.2 The country is moving toward a national health insurance system to improve access to health care services for all South Africans.3 The national health insurance system will be dependent on effective information systems to guide its operations and expenditure.3 Improvements in the quality, coverage, and standardization of public and private health information systems are pivotal to their successful implementation.3
Enhancing the use of health information and improving data quality are essential to scale up health service delivery.4–7 Governments and nongovernmental organizations spend significant resources on routine health information systems,4 and it is important to determine whether these investments result in improved data systems, enhanced program management, and better data quality.
An RHIS is defined as “a system that provides information at regular intervals of a year or less through mechanisms designed to meet predictable information needs. This includes paper-based or electronic health records, and facility- and district-level management information systems.”4 Data that are reliable, accurate, complete, and accessible in a timely manner make up one of the building blocks of a good RHIS.8
The public sector in South Africa uses several well-established RHISs. Systems such as the District Health Information System (DHIS), the Electronic TB Register (ETR.Net), the Electronic Register for Drug-Resistant TB (EDRWeb.Net), and the Electronic Register for ARV treatment (TIER.Net) have been implemented nationally.9 A pathology-based national cancer register was established in 1986,10 but the last published incidence report was for 2004.11 Several other RHISs are in use but are often implemented in only a single province, such as eKapa, an ART and TB monitoring system (Western Cape), and Nootroclin (Northern Cape).9
Data quality issues of the DHIS at facility, district, and provincial levels have been raised in information audits by the Auditor-General of South Africa.12 The need for quantitative studies of data quality was identified because it is often reported that “the quality of data in RHISs in South Africa is poor,”3,13–16 but the extent of the problem cannot be quantified on the basis of qualitative data. Qualitative data can describe the nature of the problem and can also suggest possible reasons why the problem exists. A quantitative assessment of the quality of data in RHISs indicates the extent of the problem and would ideally provide a numerical factor that could be used to adjust routine data to inform planning or burden of disease analysis.
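To illustrate how such a factor could be applied (the values below are hypothetical and are not drawn from any South African RHIS or from the studies reviewed here), a completeness estimate can act as a simple correction when estimating the true number of cases from routinely reported counts:

```python
# Minimal sketch with hypothetical values: adjusting a routinely reported case
# count by an empirically estimated completeness factor.

reported_cases = 800      # cases captured in a routine register (hypothetical)
completeness = 0.80       # estimated fraction of true cases that reach the register

# If only 80% of cases are recorded, the adjusted (estimated true) count is:
adjusted_cases = reported_cases / completeness
print(f"Estimated true case count: {adjusted_cases:.0f}")  # 1000
```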
In South Africa, routine mortality data from civil registration and injury surveillance are used to inform burden of disease studies17,18 and contribute to global burden of disease studies,19 but it is unclear whether routine morbidity data could be used in a similar way. Although RHISs are designed to support national and decentralized decision-making and health service management,4,5,20 they could also provide morbidity data to help estimate the national burden of disease, provided they are of sufficient quality.
OBJECTIVE
In the absence of comprehensive information on the quality of data reported in RHISs in South Africa, a literature review was conducted to identify and assess published studies that report quantitative assessments of the quality of data in RHISs, with a focus on morbidity data.
MATERIALS AND METHODS
Data quality
Multiple frameworks for analyzing RHISs have been developed.8,21–26 Many of these frameworks overlap with regard to components that make up data quality. As the focus of this study is to review reported RHIS data quality in already published studies,7,27 the WHO framework8 was found to be appropriate.
The WHO has described the components of data quality in a guide aimed at improving data quality in developing countries (Table 1).8 These commonly used criteria for the evaluation of data quality28–32 have been adopted as the framework for this review. Furthermore, they are largely consistent with the components stated in the standard operating procedure for the District Health Management Information System by the National Department of Health, which include accuracy, reliability, completeness, timeliness, and accessibility.33 This standard operating procedure also includes integrity, coherence, and compatibility. Of the 8 data quality components identified by WHO, “accuracy and validity,” “reliability,” and “completeness” were found to be the most relevant for this study.
Table 1.
Description of WHO data quality components
Components | Description |
---|---|
Accuracy and validity | The original data must be accurate in order to be useful. If data are not accurate, then wrong impressions and information are being conveyed to the user. Documentation should reflect the event as it actually happened. |
Reliability | Data should yield the same results on repeated collection, processing, storing, and display of information. |
Completeness | All required data should be present and the medical/health record should contain all pertinent documents with complete and appropriate documentation. |
Legibility | All data, whether written, transcribed, or printed, should be readable. |
Timeliness | Information, especially clinical information, should be documented as an event occurs, treatment is performed, or results are noted. Delaying documentation could cause omission of information or recording of errors. |
Accessibility | All necessary data are available when needed for patient care and for all other official purposes. The value of accurately recorded data is lost if they are not accessible. |
Meaning or usefulness/relevance | Information is pertinent and useful. |
Confidentiality and data security | Public health registries must be governed by the strictest rules of data protection and confidentiality. The data should be used only for purposes related to public health (and not for discrimination or criminalization). |
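As an illustration of how the two components most frequently assessed in the reviewed studies could be quantified, the sketch below expresses completeness and accuracy/validity for a single data element as simple proportions; the records, values, and gold-standard source are hypothetical and are not taken from any of the included studies:

```python
# Illustrative sketch (hypothetical records): quantifying completeness and
# accuracy/validity for one data element. "register" mimics values captured in
# a routine register; "source" mimics the corresponding gold-standard source
# documents (eg, patient folders).

register = ["positive", None, "negative", "positive", "positive", None]
source   = ["positive", "negative", "negative", "positive", "negative", "positive"]

# Completeness: proportion of records in which the element was captured at all.
captured = [v for v in register if v is not None]
completeness = len(captured) / len(register)

# Accuracy/validity: among captured records, proportion agreeing with the source.
pairs = [(r, s) for r, s in zip(register, source) if r is not None]
accuracy = sum(r == s for r, s in pairs) / len(pairs)

print(f"Completeness: {completeness:.0%}")  # 67%
print(f"Accuracy:     {accuracy:.0%}")      # 75%
```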
Review of publications
This study identified English-language publications, published between January 1994 and December 2014, that evaluated South African RHISs. Keywords from the systematic review of health data quality management and best practices by Ndabarora et al.35 guided the basic search terms, which were adapted for searching the data sources. Basic terms were (“routine health” OR “district health” OR “health management” OR “public health” OR “national health”) AND (“information system*” OR “informat*” OR “data”) AND (“data quality” OR “data accuracy” OR “data quality improvement” OR “data audit”) AND “South Africa.” Table 2 describes the search terms used for the different data sources.
Table 2.
Search terms for review of publications
Data source | Search terms |
---|---|
PubMed | (((((“Routine health” OR “district health” OR “health management” OR “public health” OR “national health”)) AND (“information system*” OR “informat*” OR “data”)) AND (“data quality” OR “data accuracy” OR “data quality improvement” OR “data audit”)) AND “South Africa”); Restricted: 1994–2014 |
Scopus | ALL (“routine health” OR “district health” OR “health management” OR “public health” OR “national health”) AND ALL (“information system*” OR “informat*” OR “data”) AND ALL (“data quality” OR “data accuracy” OR “data quality improvement” OR “data audit”) AND ALL (South Africa) AND PUBYEAR > 1993 AND PUBYEAR < 2015; Restricted: South Africa |
Web of Science | TOPIC: (“Routine health” OR “district health” OR “health management” OR “public health” OR “national health”) AND TOPIC: (“information system*” OR “informat*” OR “data”) AND TOPIC: (“data quality” OR “data accuracy” OR “data quality improvement” OR “data audit”) AND TOPIC: (South Africa); Restricted: 1994–2014 |
Science Direct | pub-date > 1993 and pub-date < 2015 and (((((“routine health” OR “district health” OR “health management” OR “public health” OR “national health”)) AND (“information system*” OR “informat*” OR “data”)) AND (“data quality” OR “data accuracy” OR “data quality improvement” OR “data audit”)) AND “South Africa”) |
Websites were also searched for relevant gray literature, including peer-reviewed conference proceedings. Titles and abstracts of publications were screened by the first author. At least 2 reviewers (the first author and 1 or more co-authors) reviewed each potentially eligible full-text publication for inclusion. Full-text publications that reported on RHIS in South Africa were then assessed by at least 2 reviewers (the first author and 1 or more co-authors). In addition, references of included articles were checked and experts were contacted to identify additional publications.
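Although the searches were run through each database's own interface, the PubMed string in Table 2 could, for example, be executed programmatically against the public NCBI E-utilities esearch endpoint; the sketch below is illustrative only and was not part of the review methodology:

```python
# Sketch (not the authors' actual retrieval procedure): running the PubMed
# search string from Table 2 via the NCBI E-utilities esearch endpoint,
# restricted to the 1994-2014 publication window.

import requests

query = (
    '("Routine health" OR "district health" OR "health management" '
    'OR "public health" OR "national health") '
    'AND ("information system*" OR "informat*" OR "data") '
    'AND ("data quality" OR "data accuracy" OR "data quality improvement" '
    'OR "data audit") AND "South Africa"'
)

resp = requests.get(
    "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi",
    params={
        "db": "pubmed",
        "term": query,
        "datetype": "pdat",   # restrict by publication date
        "mindate": "1994",
        "maxdate": "2014",
        "retmax": 500,        # return up to 500 PMIDs
        "retmode": "json",
    },
    timeout=30,
)
result = resp.json()["esearchresult"]
print("Records found:", result["count"])
print("First PMIDs:", result["idlist"][:5])
```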
Inclusion and exclusion criteria
A quantitative assessment of data quality would provide a numerical factor that could be used to adjust routine data to inform planning or burden of disease analysis. Therefore, even though qualitative assessments can provide clues to the reasons for poor data quality, they were not the focus of this review; studies that reported only qualitative assessments were excluded, as were commentaries and methodological papers. Studies that did not include any South African RHIS or that focused only on mortality data were also excluded.
A team of researchers, including RHIS experts, reviewed studies that addressed the identified WHO data quality components.8,34 All included publications were analyzed to confirm whether they contained quantitative information relating to the WHO data quality components. A publication was included if it quantitatively addressed any of the following components: accuracy and validity, reliability, completeness, legibility, timeliness, accessibility, meaning or usefulness, and confidentiality and data security.8,34 If this information was present, the data system, data flow, actual data elements, and any other related data were noted.
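The extraction step can be thought of as building a publication-by-component matrix of the kind summarized in Table 3; a minimal sketch of such a structure, using hypothetical study names and flags rather than the actual extraction records, is shown below:

```python
# Sketch (hypothetical entries): an extraction matrix recording, per
# publication, which WHO data quality components were quantitatively
# addressed, mirroring the X marks in Table 3.

WHO_COMPONENTS = [
    "accuracy_validity", "reliability", "completeness", "legibility",
    "timeliness", "accessibility", "meaning_usefulness", "confidentiality_security",
]

# One record per screened publication; True only where a quantitative
# assessment of that component was reported.
extraction = {
    "Example study A": {"accuracy_validity": True, "reliability": True, "completeness": True},
    "Example study B": {"completeness": True},
}

def addressed(record):
    """Return the WHO components a publication quantitatively addressed."""
    return [c for c in WHO_COMPONENTS if record.get(c, False)]

for pub, record in extraction.items():
    components = addressed(record)
    include = len(components) > 0   # inclusion criterion: at least one component
    print(f"{pub}: include={include}, components={components}")
```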
RESULTS
As shown in Figure 1, 464 publications were retrieved and 44 duplicates were removed, leaving 420 publications to be assessed. After applying all the exclusion criteria, 11 publications were identified that reported on a quantitative assessment of data quality in an RHIS in South Africa. The study characteristics are summarized in Table 3 by study period, setting, population, information system(s), morbidity and related data elements, and WHO data quality components addressed.
Figure 1.
Study inclusion flow diagram.
Table 3.
Summary of included studies
Publication | Study period | Setting (province) | Study population | System | Morbidity and related data evaluated | Accuracy/Validity | Reliability | Completeness |
---|---|---|---|---|---|---|---|---|
A. PMTCT | ||||||||
Mate et al. (2009)36 | Jan to Dec 2007 | eThekwini, Umgungundlovu and Ugu Districts in KwaZulu-Natal | 316 facilities (fixed clinics, mobile clinics, community health centers, hospitals) |
|
Antenatal care client tested HIV positive | X | X | X |
Mphatswe et al. (2012)37 | May to Nov 2008 | eThekwini, Umgungundlovu and Ugu Districts in KwaZulu-Natal | 78 facilities (58 antenatal clinics and 20 delivery wards) |
|
Pregnant women who tested positive for HIVa | X | X | X |
B. TB | ||||||||
Marais et al. (2006)38b | Feb 2003 to Oct 2004 | Cape Town, Western Cape |
|
|
|
X | X | |
Heunis et al. (2011)39c | Nov to Dec 2007 | All 5 districts in the Free State | 20 facilities (mobile clinic, fixed clinic, community health center, or large clinic and district hospital) |
|
|
X | ||
Du Preez et al. (2011)40 | Jul 2007 to Jun 2009 | Tygerberg Children’s Hospital, Cape Town, Western Cape | A retrospective cohort of children (≤ 13 years) with culture-confirmed TBd in Tygerberg Children’s Hospital (n = 291) |
|
|
X | ||
Dunbar et al. (2011)41 | Oct 1, 2006 to Mar 31, 2008 | Cape Town, Western Cape | All cases of bacteriologically confirmed TB in 2 communities (2 community clinics and 1 hospital) |
|
|
X | X | |
Dilraj et al. (2013)42 | Fourth quarter 2009 | eThekwini District in KwaZulu-Natal | 46 high TB burden facilities (type of facility not stated), n = 1036 smear positive patients in ETR.Net |
|
|
X | ||
Rose et al. (2013)43 | Jan 1 to Dec 31, 2012 | Cape Town, Western Cape |
|
Clinical cohort data were compared to the electronic register data. Systems: (1) pediatric DR-TB ward register; (2) inpatient folders; (3) discharge summaries; (4) NHLS LIS (where available), (5) EDRWeb.Net |
|
X | X | |
Ebonwu et al. (2013)44 | 2011 | Sizwe Hospital,e Gauteng | All MDR-TB cases in Gauteng at government facilities with laboratory confirmed results in period | A list of all new diagnoses in Gauteng from the NHLS LIS was compared to case files from the MDR-treatment register and patient MDR-TB treatment card. This study reports on diagnosis to treatment initiation. Systems: (1) NHLS LIS, (2) MDR-TB treatment register and MDR-TB patient card |
|
X | ||
Botha et al. (2008)45 | Apr 2004 to Mar 2005 | Stellenbosch Sub-district, Western Cape | 13 primary health care facilities (clinics) |
|
|
X | ||
C. HIV/TB co-infection | | | | | | | |
Auld et al. (2013)46 | Jan to Aug 2011 | Bitou, Knysna, and George sub-districts in Eden District, Western Cape |
|
|
|
X | X |
aMphatswe et al. (2012) is a smaller, follow-up study of the Mate et al. (2009) study. The same data elements were examined in both studies, although they are named differently. Mate et al. (2009) conducted the study in 3 districts in KwaZulu-Natal that are not stated in the article. They are most likely the same districts used in the Mphatswe et al. (2012) study.
bThe title refers to accuracy. However, the study reports on accuracy of diagnosis and not of data.
cThis study does not give a measure of whether cases were missing at the provincial level.
dTB cases were classified as pulmonary TB (PTB), including hilar and mediastinal lymphadenopathy; extrapulmonary TB (EPTB); or both PTB and EPTB.
eDesignated MDR-TB treatment hospital in Gauteng Province.
ART: antiretroviral treatment; DHIS: District Health Information System; DR-TB: drug-resistant tuberculosis; EDRWeb.Net: Electronic Register for Drug-Resistant Tuberculosis; ETR.Net: Electronic Register for Tuberculosis; MDR-TB: multidrug-resistant tuberculosis; NHLS: National Health Laboratory Service; PMTCT: prevention of mother-to-child transmission of HIV; TB: tuberculosis; THAT’SIT: HIV/AIDS Treatment Support and Integrated Therapy information system.
None of the included publications reported on more than 3 of the WHO data quality components.8 Definitions of the data quality components used in the different studies were rarely given, and few data quality definitions were referenced from another source, indicating a lack of standardized definitions for assessing data quality. In the absence of specific definitions, the WHO data quality components’ definitions were assumed.8 Most studies had a relatively limited focus (eg, facility/community level [n = 4], district level [n = 5], provincial level [n = 2]), and thus were not representative of the whole country. The 11 included studies focused on prevention of mother-to-child transmission of HIV (PMTCT) (n = 2), tuberculosis (TB) (n = 8), and HIV and TB co-infection (n = 1). Although several studies evaluated TB data, they involved different sources at different levels of the health system (Figure 2). Drug-resistant TB studies were not included in Figure 2.
Figure 2.
TB source data flows indicating level of care evaluated.
The findings of the data quality evaluations are summarized in Table 4. For studies reporting on recording of HIV in pregnant women, there was a problem with the transfer of aggregated information from facility registers to the DHIS.36,37 These studies also highlighted challenges with completeness between clinic registers and the DHIS, and challenges with reliability between clinic registers and monthly summary sheets. Furthermore, the data item “antenatal clients tested HIV positive” was reported as highly inaccurate by Mate et al.36 in their study setting when compared to the facility (fixed clinic, mobile clinic, community health center, hospital) registers. Mphatswe et al.37 reported on an intervention to improve the quality of PMTCT data between facility registers and the DHIS, and significant improvements were reported as a result of the intervention.
Table 4.
Main findings of quality evaluations
Publication | RHIS | Morbidity and related data evaluated | Findings |
---|---|---|---|
A. PMTCT | | | |
Mate et al. (2009)36 | | Antenatal care client tested HIV positive | |
Mphatswe et al. (2012)37 | Facility PMTCT registers, DHIS | Pregnant women who tested positive for HIV | |
B. TB | | | |
Marais et al. (2006)38 | | | |
Heunis et al. (2011)39 | TB patient files, electronic registers, provincial-level patient data | | |
Du Preez et al. (2011)40 | Electronic laboratory-based hospital surveillance database, administrative department data, hospital folders, notification records, TB meningitis home-based care program records, discharge summaries, ETR.Net | | |
Dunbar et al. (2011)41 | TB treatment registers, laboratory information system (LIS) (ie, NHLS), nearest central hospital, referral hospital | | |
Dilraj et al. (2013)42 | ETR.Net, LIS (NHLS) | | |
Rose et al. (2013)43 | Pediatric DR-TB register, EDRWeb.Net, inpatient folders, discharge summaries, LIS (NHLS) | | |
Ebonwu et al. (2013)44 | LIS (NHLS), MDR-TB treatment register, MDR-TB patient card | | |
Botha et al. (2008)45 | Facility TB register, sputum register | | |
C. HIV/TB co-infection | | | |
Auld et al. (2013)46 | | | |
Some of the studies indicated gaps in the completeness of TB data across multiple information systems, ie, TB cases that were not recorded.38,40,43,45 These studies found that between 12% and 38% of cases were not recorded in the primary data source (either patient records or laboratory records).35,40–43
Auld et al.46 focused on TB and HIV co-infection, specifically the HIV/AIDS Treatment Support and Integrated Therapy (THAT’SIT) information system, which serves as the primary source for integrated TB and HIV patient-level information in the study facilities. Data in the standard RHISs for TB and ART programs were compared to the THAT’SIT data for each case: a paper-based TB patient file (TB Blue Card), a paper-based TB register, an electronic TB register (ETR.Net), a paper-based ART patient file, and an electronic ART register (eRegister) all feed into THAT’SIT. Reliability and completeness across these 6 RHISs were assessed. Auld et al.46 found that 85% of the cases in THAT’SIT had records in all 6 systems. TB systems were almost complete for reporting on HIV co-infection (≥97%), while only 78% of the cases had TB status reported in the ART file. TB status was recorded in 97% of cases in the ART eRegister. A kappa score > 0.9 was reported for age and sex variables across systems. For clinical indicators, agreement on TB site of disease was high across the 3 TB information sources (k = 0.90) but low across all the systems (k = 0.53).
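The kappa statistics reported by Auld et al.46 correct raw agreement between systems for the agreement expected by chance. A minimal sketch of the calculation, using hypothetical paired records rather than data from that study, is shown below:

```python
# Sketch (hypothetical paired records): Cohen's kappa for agreement between two
# information systems recording the same categorical element, eg, TB site of
# disease captured in a paper register vs an electronic register.

from collections import Counter

def cohens_kappa(a, b):
    """Chance-corrected agreement between two equal-length label sequences."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    freq_a, freq_b = Counter(a), Counter(b)
    categories = set(a) | set(b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

register_a = ["PTB", "PTB", "EPTB", "PTB", "EPTB", "PTB", "PTB", "EPTB"]
register_b = ["PTB", "PTB", "EPTB", "PTB", "PTB",  "PTB", "PTB", "EPTB"]

print(f"kappa = {cohens_kappa(register_a, register_b):.2f}")  # 0.71
```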
DISCUSSION
Although many studies reported poor data quality, this differed by data item, setting, and system.36–46 Assessments were performed at different points in the RHISs,38–42,45 and there was a lack of standard data quality criteria, making it impossible to pool the findings from different studies. The WHO definitions of the components of data quality provided a practical framework for our review.
The results of this analysis of 11 diverse studies give weight to the comments of multiple authors that the quality of data in RHISs in South Africa is a cause for concern.36–46 The studies cover several major programs: PMTCT, the national TB program, and the combined HIV/TB treatment program. They reflect problems at primary, secondary, and tertiary care levels of the health care system and among multiple RHISs, most notably registers of patients receiving treatment, laboratory records, and aggregated reporting systems. Our findings are echoed by a recently published comprehensive evaluation of TB surveillance data in 3 provinces by Podewils et al.47 showing that completeness and reliability were inconsistent across data sources.
Studies focusing on PMTCT highlighted problems with the accuracy of data transfer, which may be related to human and organizational factors rather than technical issues. Factors such as a lack of core competencies for data collection, staff attitudes toward RHIS tasks, the multiplicity of registers used to monitor programs, and the absence of processes for feedback and for data quality checks and analysis at the facility level were highlighted by Nicol et al.16,48 Heunis et al. reported that gaps in the completeness of HIV-TB systems also relate to human and organizational factors such as staff shortages, lack of training, multiplicity of forms, and difficulty keeping track of patients.39 System issues may also hinder data quality. For example, systems like THAT’SIT (only rolled out in some provinces), which have been developed to track treatment of patients with HIV-TB co-infection, depend on data from the current TB and HIV information systems and thus inherit all of their challenges.46 The same authors noted that where mechanisms had been put in place to improve data quality, their impact was not reported.46
Data quality can also be affected by policy. One possible cause of the incomplete recording reported in the TB studies is that the South African National Tuberculosis Programme follows a decentralized model and requires that TB patients be registered and recorded at the primary health care level, regardless of where they are diagnosed.35 A formal process does not appear to be in place to ensure that cases diagnosed at hospitals are captured in the system. For example, some children, especially those diagnosed with severe TB, are diagnosed and treated only at referral hospitals and thus were not counted in the routine TB surveillance systems.40,43
While the studies cannot be assumed to be representative of the whole South African health care environment, and some were conducted more than a decade ago, they do reflect problems in a wide range of settings and give an indication of how to address them. An underlying problem (identified in multiple studies, including those in this analysis) is that the RHISs in use in South Africa are not generally regarded as important tools to be used correctly and effectively to inform decision making, from the individual patient care level to national-level treatment and/or prevention programs.
Furthermore, the results of this review indicate that the National Department of Health should strengthen the current RHISs using the infrastructure that is already available and identify the gaps that need to be addressed, particularly in data quality. Improved data quality in RHISs has the potential to improve the accuracy of routine reporting of the morbidity profile of patients attending health facilities, which is required to support decision-making on resource allocation. Although the RHISs are in place, this study confirms the assessment by the Health Data Advisory and Co-ordination Committee that there is room for improvement.12 In addition to the RHISs discussed in the reviewed studies, other existing RHISs, such as the national cancer register,10,11 reporting on notifiable conditions,49 and injury morbidity surveillance,50 need attention. Unless data from these RHISs can be improved, it will not be possible to use them to inform burden of disease studies in the future. The National Health Normative Standards Framework was established to direct efforts to improve poor data quality (a result of RHIS fragmentation and systems that are not interoperable) by setting the foundations for interoperability, as stated in the eHealth Strategy South Africa 2012–2016.9
One local study by Mphatswe et al.37 and 2 studies from similar low-resource settings51,52 report on interventions to improve data quality, which have resulted in sustained improvement. The potential benefits of wide implementation of good practices in record-keeping, thus making the best possible use of existing RHISs, include improved patient care (since all patients requiring care would be identified and therefore monitored), improved program reporting from clinic to district and provincial levels (essential for program monitoring and resourcing), and accurate and consistent reporting at the national and international levels. A national effort by all stakeholders is required to reach this goal, which should be achievable with existing resources for health services in South Africa. Effective RHISs could have the additional benefit of providing data to inform burden of disease assessments, which are essential to integrated longer-term planning for health in South Africa.
CONCLUSION
This review has highlighted the need for a comprehensive evaluation of the quality of patient-level morbidity data in RHISs in South Africa. Even though RHISs are primarily established to produce data for public health decision making,52 these data could also be extremely valuable for routine morbidity surveillance and empirical estimation of the burden of disease at the national, provincial, and district levels, if data quality could be relied upon.
The piecemeal nature of the identified work emphasizes that different levels within routine health information systems require quality assessment. A situational analysis of the availability and quality of patient-level morbidity data at public hospitals is being planned by the Burden of Disease Research Unit of the South African Medical Research Council. In the longer term, it will be important to institute a standardized data quality assurance system to ensure sustainable improvement of RHIS data quality.
ACKNOWLEDGMENTS
We thank Candy Day and Naomi Massyn for providing expert input on RHISs in South Africa, and Rory Dunbar, Richard Matzopoulos, Pren Naidoo, Megan Prinsloo, and Graham Wright for proposing additional studies for review.
FUNDING
This work was supported by the South African Medical Research Council in terms of the MRC’s Flagship Awards Project, grant number SAMRC-RFA-IFSP-01-2013/SA CRA 2.
COMPETING INTERESTS
The authors declare that there is no conflict of interest.
CONTRIBUTORS
All the authors contributed to the study design, selection, and analysis of the reviews as well as revisions and approval of the article. R.R. conceived the study, conceptualized the article, and was responsible for searching, sourcing, and reviewing the literature and data and the study analysis. She led the interpretation of the results, wrote the first draft of the article, and revised the drafts according to inputs from the co-authors. V.PvW contributed to the conceptualization of the article, the selection and critical review of articles, the reporting structure of the review, and the write-up of all sections of the article. O.A. worked on data extraction and analysis of the article. J.J. critically reviewed the article drafts and contributed conceptually to technical discussions around terminology and definitions of quality criteria. E.N. worked on the data quality definitions framework, data extraction, and analysis of the article. D.B. contributed to the conceptualization of the article and the reporting structure of the review, and critically reviewed the drafts. L.H. contributed to the conceptualization of the article, the data quality definitions framework, the selection of articles, and the write-up of all sections of the article. All authors approved the final draft.
REFERENCES
1. The World Bank. Data: South Africa, 2015. http://data.worldbank.org/country/south-africa. Accessed August 22, 2015.
2. McIntyre D, Garshong B, Mtei G, et al. Beyond fragmentation and towards universal coverage: insights from Ghana, South Africa and the United Republic of Tanzania. Bull World Health Organ. 2008;86:871–876.
3. Matsoso MP, Fryatt R. National Health Insurance: the first 16 months. S Afr Med J. 2013;103:156–158.
4. Hotchkiss DR, Diana ML, Foreit KG. How can routine health information systems improve health systems functioning in low- and middle-income countries? Assessing the evidence base. Measure Evaluation Special Report. Chapel Hill, NC: Carolina Population Center; 2012. http://www.cpc.unc.edu/measure/publications/sr-11-65. Accessed August 22, 2015.
5. AbouZahr C, Boerma T. Health information systems: the foundations of public health. Bull World Health Organ. 2005;83:578–583.
6. Evans T, Stansfield S. Health information in the new millennium: a gathering storm? Bull World Health Organ. 2003;81:856.
7. Lippeveld T, Sauerborn R, Bodart C. Design and Implementation of Health Information Systems. Geneva: World Health Organization; 2000. http://rhinonet.org/files/2013/06/hmis_chapter10_resources.pdf. Accessed January 10, 2016.
8. World Health Organization. Improving Data Quality: A Guide for Developing Countries. Manila: World Health Organization Regional Office for the Western Pacific; 2003. http://www.wpro.who.int/publications/docs/Improving_Data_Quality.pdf. Accessed June 2, 2015.
9. National Department of Health, Council for Scientific and Industrial Research. National Health Normative Standards Framework for Interoperability in eHealth in South Africa. Pretoria: Department of Health; 2013. http://hufee.meraka.org.za/Hufeesite/staff/the-hufee-group/paula-kotze-1/hnsf-complete-version. Accessed June 20, 2015.
10. Norman R, Mqoqi N, Sitas F. Lifestyle-induced cancer in South Africa. In: Steyn K, Fourie J, Temple N, eds. Chronic Diseases of Lifestyle in South Africa: 1995–2005. Cape Town: Medical Research Council; 2006. http://www.mrc.ac.za/chronic/cdl1995-2005.pdf. Accessed August 2, 2015.
11. Somdyala N, Bradshaw D, Gelderblom W. Cancer incidence in selected municipalities of the Eastern Cape Province, 2003–2007. Eastern Cape Province Cancer Registry Technical Report. Cape Town: South African Medical Research Council; 2013. http://www.mrc.ac.za/bod/CancerIncidenceReport2013.pdf. Accessed August 10, 2015.
12. National Department of Health. Health Data Advisory and Co-ordination Committee (HDACC) Report 2. Pretoria: Department of Health; 2014.
13. Garrib A, Herbst K, Dlamini L, et al. An evaluation of the district health information system in rural South Africa. S Afr Med J. 2008;98(7):549–552.
14. Day C, Gray A. Health and related indicators. In: Padarath A, King J, English R, eds. South African Health Review 2014/15. Durban: Health Systems Trust; 2015. http://www.hst.org.za/publications/south-african-health-review-2014/15. Accessed January 9, 2016.
15. Massyn N, Peer N, Padarath A, Barron P, Day C, eds. District Health Barometer 2014/15. Durban: Health Systems Trust; October 2015. http://www.hst.org.za/publications/district-health-barometer-201415-1. Accessed January 3, 2016.
16. Nicol E, Bradshaw D, Phillips T, et al. Human factors affecting the quality of routinely collected data in South Africa. Stud Health Technol Inform. 2013;192:788–792.
17. Pillay-van Wyk V, Msemburi W, Laubscher R, et al. Second National Burden of Disease Study South Africa: national and subnational mortality trends, 1997–2009. Lancet. 2013;381:S113.
18. Bradshaw D, Groenewald P, Laubscher R, et al. Initial burden of disease estimates for South Africa, 2000. S Afr Med J. 2003;93:682–688. http://www.mrc.ac.za/bod/initialbodestimates.pdf. Accessed August 31, 2015.
19. Lim SS, Vos T, Flaxman AD, et al. A comparative risk assessment of burden of disease and injury attributable to 67 risk factors and risk factor clusters in 21 regions, 1990–2010: a systematic analysis for the Global Burden of Disease Study 2010. Lancet. 2012;380:2224–2260.
20. Heywood A, Rohde J. Using Information for Action: A Manual for Health Workers at Facility Level. Arcadia: Equity Project; 2001. http://www.uio.no/studier/emner/matnat/ifi/INF5761/v12/undervisningsmateriale/Heywood%20and%20Rohde%20-%20Using%20information%20for%20action%20a%20manual%20for%20health%20.pdf. Accessed April 17, 2015.
21. Aqil A, Lippeveld T, Hozumi D. PRISM framework: a paradigm shift for designing, strengthening and evaluating routine health information systems. Health Policy Plan. 2009;24(3):217–228.
22. Heywood A, Boone D. Guidelines for Data Management Standards in Routine Health Information Systems. Measure Evaluation; 2015. http://www.cpc.unc.edu/measure/resources/publications/ms-15-99. Accessed December 23, 2015.
23. Yusof MM, Kuljis J, Papazafeiropoulou A, et al. An evaluation framework for health information systems: human, organization and technology-fit factors (HOT-fit). Int J Med Inform. 2008;77(6):386–398.
24. Health Metrics Network. Framework and Standards for Country Health Information Systems. 2008. http://www.who.int/healthmetrics/documents/hmn_framework200802.pdf. Accessed January 9, 2016.
25. De Savigny D, Binka F. Monitoring future impact on malaria burden in sub-Saharan Africa. Am J Trop Med Hyg. 2004;71(Suppl 2):S224–S231.
26. Lau F, Hagens S, Muttitt S. A proposed benefits evaluation framework for health information systems in Canada. Healthc Q. 2007;10(1):112–116, 118.
27. Divorski S, Scheirer MA. Improving data quality for performance measures: results from a GAO study of verification and validation. Eval Program Plann. 2001;24(1):83–94.
28. Adeleke IT, Adekanye AO, Onawola KA, et al. Data quality assessment in healthcare: a 365-day chart review of inpatients’ health records at a Nigerian tertiary hospital. J Am Med Inform Assoc. 2012;19:1039–1042.
29. Adane K, Muluye D, Abebe M. Processing medical data: a systematic review. Arch Public Health. 2013;71:27–33.
30. Banke-Thomas AO, Madaj B, Charles A, et al. Social Return on Investment (SROI) methodology to account for value for money of public health interventions: a systematic review. BMC Public Health. 2015;15:582–596.
31. Botha M, Botha A, Herselman M. The benefits and challenges of e-health applications: a content analysis of the South African context. International Conference on Computer Science, Computer Engineering, and Social Media; Thessaloniki, Greece; 2014.
32. Royal College of Physicians, Health Informatics Unit. Hospital activity data: a guide for clinicians. London: Royal College of Physicians; 2007. http://www.hscic.gov.uk/media/1595/Hospital-Activity-Data-A-guide-for-clinicians/pdf/Hospital_Activity_Data_-_A_guide_for_clinicians.pdf. Accessed August 22, 2015.
33. National Department of Health. District Health Management Information System (DHMIS) standard operating procedures: facility level. Pretoria: Department of Health; 2012. http://www.hst.org.za/publications/district-health-management-information-system-dhmis-standard-operating-procedures-facil. Accessed August 22, 2015.
34. World Health Organization. Guiding Principles on Ethical Issues in HIV Surveillance, 2013. http://www.who.int/hiv/pub/surveillance/2013package/module2/en/. Accessed August 22, 2015.
35. Ndabarora E, Chipps JA, Uys L. Systematic review of health data quality management and best practices at community and district levels in LMIC. Inform Dev. 2014;30:103–120.
36. Mate KS, Bennett B, Mphatswe W, et al. Challenges for routine health system data management in a large public programme to prevent mother-to-child HIV transmission in South Africa. PLoS One. 2009;4:e5483.
37. Mphatswe W, Mate KS, Bennett B, et al. Improving public health information: a data quality intervention in KwaZulu-Natal, South Africa. Bull World Health Organ. 2012;90:176–182.
38. Marais B, Hesseling A, Gie R, et al. The burden of childhood tuberculosis and the accuracy of community-based surveillance data. Int J Tuberc Lung Dis. 2006;10:259–263.
39. Heunis C, Wouters E, Kigozi G, et al. Accuracy of tuberculosis routine data and nurses’ views of the TB-HIV information system in the Free State, South Africa. J Assoc Nurses AIDS Care. 2011;22:67–73.
40. Du Preez K, Schaaf H, Dunbar R, et al. Incomplete registration and reporting of culture-confirmed childhood tuberculosis diagnosed in hospital. Public Health Action. 2011;1:19–24.
41. Dunbar R, Lawrence K, Verver S, et al. Accuracy and completeness of recording of confirmed tuberculosis in two South African communities. Int J Tuberc Lung Dis. 2011;15:337–343.
42. Dilraj A, Bristow CC, Connolly C, et al. Validation of sputum smear results in the Electronic TB Register for the management of tuberculosis, South Africa. Int J Tuberc Lung Dis. 2013;17:1317–1321.
43. Rose P, Schaaf H, Du Preez K, et al. Completeness and accuracy of electronic recording of paediatric drug-resistant tuberculosis in Cape Town, South Africa. Public Health Action. 2013;3:214–219.
44. Ebonwu J, Tint K, Ihekweazu C. Low treatment initiation rates among multidrug-resistant tuberculosis patients in Gauteng, South Africa, 2011. Int J Tuberc Lung Dis. 2013;17:1043–1048.
45. Botha E, Den Boon S, Lawrence K, et al. From suspect to patient: tuberculosis diagnosis and treatment initiation in health facilities in South Africa. Int J Tuberc Lung Dis. 2008;12:936–941.
46. Auld S, Kim L, Webb E, et al. Completeness and concordance of TB and HIV surveillance systems for TB-HIV co-infected patients in South Africa. Int J Tuberc Lung Dis. 2013;17:186–191.
47. Podewils LJ, Bantubani N, Bristow C, et al. Completeness and reliability of the Republic of South Africa national tuberculosis (TB) surveillance system. BMC Public Health. 2015;15:765–776.
48. Nicol E. Evaluating the process and output indicators for maternal, newborn, and child survival in South Africa: a comparative study of PMTCT information systems in KwaZulu-Natal and the Western Cape. PhD thesis. Stellenbosch University; 2015.
49. Weber IB, Matjila M, Harris B. Evaluation of the notifiable disease surveillance system in Gauteng Province, South Africa. Master’s thesis. University of Pretoria; 2007. http://repository.up.ac.za/bitstream/handle/2263/26850/dissertation.pdf?sequence=1. Accessed June 17, 2015.
50. The Violence and Injury Surveillance Consortium. National Non-Fatal Injury Surveillance System: pilot study report. Cape Town: South African Medical Research Council; 2000.
51. Wagenaar BH, Gimbel S, Hoek R, et al. Effects of a health information system data quality intervention on concordance in Mozambique: time-series analyses from 2009–2012. Popul Health Metr. 2015;13:9–14.
52. Mutale W, Chintu N, Amoroso C, et al. Improving health information systems for decision making across five sub-Saharan African countries: implementation strategies from the African Health Initiative. BMC Health Serv Res. 2013;13(Suppl 2):S9–S21.