BMJ Open. 2021 Aug 26;11(8):e045704. doi: 10.1136/bmjopen-2020-045704

Table 4. Summary of evidence related to methodological issues for linking analyses and related needs for future research

Suitability of household and provider data for linking analyses
  • Need valid data on target population for the intervention, and suitable data on service contact/care-seeking

  • Need provider data reflective of select aspects of QoC, standardised indices and clear interpretation of measures

How valid are data on target population for interventions?
  Evidence:
  • Symptom/diagnosis-based conditions may be biased.
  • Rare conditions are not captured with a sufficient sample.
  Action:
  • Explore alternative methods for defining the population in need (eg, biomarkers, Bayesian modelling of disease probability); a minimal illustration follows this row.
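
As an illustration only (this exact formulation is not prescribed by the studies reviewed), Bayesian modelling of disease probability can start from Bayes' rule: given an assumed prevalence and the assumed sensitivity and specificity of a symptom-based screen, the posterior probability that a respondent reporting symptoms truly has the condition is:

```latex
% Posterior probability of disease given a positive symptom report.
% All inputs are hypothetical: prevalence pi = 0.10, sensitivity Se = 0.80,
% specificity Sp = 0.90.
\[
P(D \mid S^{+}) = \frac{Se \cdot \pi}{Se \cdot \pi + (1 - Sp)(1 - \pi)}
                = \frac{0.80 \times 0.10}{0.80 \times 0.10 + 0.10 \times 0.90}
                \approx 0.47
\]
```

In practice these inputs would be estimated from data (eg, biomarker validation studies) rather than fixed, and fuller models can incorporate multiple symptoms and covariates; the values above are purely hypothetical.
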
How valid are data on care-seeking?
  Evidence:
  • Limited data suggest respondents can identify the type of provider but not the type of health worker.
  • Inconsistent and sometimes poorly defined provider categories.
  Action:
  • Validate care-seeking in more settings/health areas.
  • Align categories of care across data collection tools.

How are QoC data being collected and what are the limitations of these methods?
  Evidence:
  • Mostly through health facility surveys.
  • HMIS data not widely used; limited QoC data collected.
  • Alternative methods (record review, provider or client report, etc) correlate poorly with provision of services/process quality.
  Action:
  • Assess validity of existing QoC measurement methods.
  • Assess availability/usability of HMIS data for EC estimation.
  • Develop and test new methods for assessing provision of care and experience of care.

How are quality measures being constructed and what do we know about the performance of these indices?
  Evidence:
  • Mostly SPA/SARA structural data, limited indicators on provision or experience of care, EmONC signal functions.
  • Variable set of indicators used, based on guidelines and standards.
  • Many methods for combining indicators have been tried.
  • A handful of studies comparing methods produced conflicting results.
  Action:
  • Develop standardised and validated summary QoC measures; a simple illustrative index is sketched after this row.
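
As a hedged sketch only (the indicator names, values and the equal-weight additive form are hypothetical, not a recommended standard), one of the many combination methods mentioned above can be illustrated as follows:

```python
import pandas as pd

# Hypothetical facility-level readiness indicators (1 = present, 0 = absent);
# the item names and values are illustrative, not drawn from a real SPA/SARA dataset.
facilities = pd.DataFrame(
    {
        "facility_id": [101, 102, 103],
        "has_functional_scale": [1, 1, 0],
        "has_ors_in_stock": [1, 0, 1],
        "staff_trained_imci": [0, 1, 1],
        "guidelines_available": [1, 1, 1],
    }
).set_index("facility_id")

# Simple additive (equal-weight) index: the share of items present per facility.
facilities["readiness_index"] = facilities.mean(axis=1)

print(facilities["readiness_index"])
```

Weighted indices, principal-components scores or "all items present" definitions can rank the same facilities quite differently, which is part of the motivation for standardised, validated summary measures.
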
How well do measures of quality track with each other, clinical quality and/or health benefit?
  Evidence:
  • Limited evidence of weak or no association between (1) structural and process quality, (2) measured quality and clinical care/health outcomes.
  Action:
  • Standardise methods and terminology for defining and interpreting QoC measures to more accurately reflect their role in the coverage cascade.

Implications of design of existing household and health provider data sources commonly used in linking analyses
  • DHS/MICS household location unknown and cluster location displaced, which may introduce imprecision into ecological linking analyses.

  • SPA/SARA often use sample of facilities and subsample of client–staff interactions that may not be representative of true service environment.

  • Household and provider surveys are sampled and conducted independently → data are typically temporally and geographically discordant.

Does imprecise DHS/MICS household location data affect ecological linking results?
  Evidence:
  • A handful of studies suggest minimal effect on results produced by linking on geographical proximity.
  Action:
  • Assess impact of household location vs cluster centroid vs displaced centroid in ecological linking analyses in multiple settings.

How does SPA/SARA sampling design affect estimates?
  Evidence:
  • Two studies suggest the impact of excluding non-facility providers is context specific.
  • Client-staff interactions are sampled to be representative at the same level as the overall survey, not at the facility level.
  • One study showed that sampling of facilities resulted in moderate misclassification of the service environment across linking methods.
  • A joint sampling method was proposed in 2001: oversample providers around sampled household clusters.
  Action:
  • Assess effect of provider sampling (vs census) on linked estimates.
  • Assess effect of within-facility sampling of health workers and client-health worker observations.
  • Triangulate with other sources of facility data (eg, HMIS) to combine the greater detail of the SPA assessment with the larger sample of facility records.
  • Account for uncertainty in estimates based on the facility-level data (eg, multilevel structure).
  • Test alternative sampling methods to improve representativeness of provider survey sampling for clients and health workers.
  • Test joint sampling methods for EC estimation.

How stable are indicators over time?
  Evidence:
  • Studies demonstrate moderate indicator variability over months/years.
  • No studies directly assess the effect on linking analyses.
  Action:
  • Assess stability of key provider and household indicators.
  • Develop and test methods to account for unstable estimates, including more frequent data collection (eg, through HMIS) if needed.

Impact of choice of method for combining household and provider data
  • Multiple approaches for combining data sets, each with strengths and limitations.

  • Exact-match linking based on specific source of care is most precise, but ecological linking based on geographical proximity or administrative unit is more feasible.

How do exact-match and ecological linking approaches compare?
  Evidence:
  • Three studies found ecological methods produced estimates similar to exact match under certain conditions in settings with high use of public providers.
  • Restricting analyses by source-of-care category and/or weighting by utilisation volume improved agreement with exact match.
  Action:
  • Assess performance of ecological methods in settings with greater variation in provider landscape and provider quality.
  • Define guidance, such as provider quality variation thresholds, for selection of linking method; a minimal contrast of the two approaches is sketched after this row.
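
A minimal sketch, using made-up household and facility records (all identifiers and quality values are hypothetical), of how exact-match and ecological assignment differ in practice:

```python
# Hypothetical records; identifiers and quality scores are illustrative only.
households = [
    {"hh_id": 1, "cluster": "cluster_01", "reported_facility": "facility_A"},
    {"hh_id": 2, "cluster": "cluster_01", "reported_facility": "facility_B"},
]
facility_quality = {"facility_A": 0.82, "facility_B": 0.55}

# Exact-match linking: each household receives the quality score of the
# specific facility it reported using.
exact = [facility_quality[h["reported_facility"]] for h in households]

# Ecological linking (here by administrative unit/cluster): every household in
# a cluster receives the same summary quality, eg, the mean over the facilities
# assumed to serve that cluster.
cluster_quality = {"cluster_01": sum(facility_quality.values()) / len(facility_quality)}
ecological = [cluster_quality[h["cluster"]] for h in households]

print(exact)       # [0.82, 0.55]
print(ecological)  # [0.685, 0.685]
```
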

How do different ecological linking methods and measures of geographical proximity perform?
  Evidence:
  • Similar results using straight-line distance, road distance and travel time.
  • Variable performance of ecological methods in identifying the true source of care/reported category of care.
  Action:
  • Identify preferred measures of geographical proximity to use in linking analyses.
  • Create standard, accessible tools for conducting ecological linking; a minimal distance-based sketch follows this row.
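
A minimal sketch, assuming hypothetical coordinates, of ecological linking by straight-line (great-circle) proximity; road distance or travel time, also mentioned above, would require additional inputs such as a road network or travel-time surface:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle (straight-line) distance in kilometres between two points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical displaced cluster centroids and facility coordinates (lat, lon).
clusters = {"cluster_01": (-1.95, 30.06), "cluster_02": (-2.10, 30.20)}
facilities = {"facility_A": (-1.96, 30.05), "facility_B": (-2.08, 30.25)}

# Assign each household cluster to its nearest facility by straight-line distance.
for cid, (clat, clon) in clusters.items():
    nearest = min(
        facilities.items(),
        key=lambda item: haversine_km(clat, clon, item[1][0], item[1][1]),
    )
    print(cid, "->", nearest[0])
```
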

What are the statistical challenges in combining data for effective coverage estimation?
  Evidence:
  • Most analyses derive estimate variance from household sampling error.
  • Two papers used the delta method, but without comparison to other methods.
  • A simulation found variance estimation using the delta method performed better than household error alone or parametric bootstrapping.
  Action:
  • Continue developing tools and approaches for estimating uncertainty around linked estimates; one common delta-method form is shown after this row.
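
As an illustration of the delta method referenced above, for a linked estimate constructed as crude coverage multiplied by quality, and assuming the household and facility estimates are independent (an assumption made here for simplicity, not a description of the reviewed papers' exact models), a first-order approximation is:

```latex
% Delta-method variance for EC = c * q (crude coverage times quality),
% assuming independent estimates of c and q; illustrative form only.
\[
\widehat{EC} = \hat{c}\,\hat{q},
\qquad
\operatorname{Var}\bigl(\widehat{EC}\bigr)
\approx \hat{q}^{2}\,\operatorname{Var}(\hat{c}) + \hat{c}^{2}\,\operatorname{Var}(\hat{q})
\]
```

Covariance terms are added when independence does not hold, which is one reason standard tools for uncertainty estimation around linked estimates remain an open need.
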

DHS, Demographic and Health Survey; EC, effective coverage; EmONC, emergency obstetric and newborn care; HMIS, Health Management Information Systems; MICS, Multiple Indicator Cluster Survey; QoC, quality of care; SARA, Service Availability and Readiness Assessment; SPA, Service Provision Assessment.