Int J Popul Data Sci. 2023 May 11;8(1):2113. doi: 10.23889/ijpds.v8i1.2113

Table 3: Challenges, lessons learned, and suggestions for improvement for observational studies using administrative data to evaluate complex interventions.

Challenges and lessons learned | Suggestions for improvement
  • Challenge: Evaluation of complex interventions requires detailed national and local data on programme implementation, covering who is eligible for, approached about, and enrolled in the intervention, with similar information for usual care. This information is crucial to minimise biases, enable fair and robust comparisons, and increase confidence that differences in outcomes can be attributed to the intervention rather than to the characteristics of the people selected for it.

  • Suggestion: Researchers should work in partnership with practitioners, commissioners and communities to ensure that evaluations are integrated into the design and implementation of interventions.

  • Suggestion: Programme managers and care leads should document detailed information about programme delivery and usual care (including activity dates and catchment area), across local areas and over time.

  • Suggestion: Programme managers should ensure detailed information is recorded on the characteristics of those who are approached and offered an intervention, and of those who decline.

  • Suggestion: Programme managers should provide consistent guidelines about programme targeting and prioritisation where resources are insufficient for a universal offer. Targeting should be documented in detail, including where guidelines change over time or differ across local areas.

  • Challenge: Information on linkage data quality can be limited, making it challenging to define accurate denominators and comparator groups.

  • Suggestion: Linkage organisations should provide detailed data on linkage quality (see the GUILD reporting tool [41]).

  • Challenge: Constructing a comparable control group is limited to measured characteristics, introducing the possibility of unmeasured confounding.

  • Suggestion: Researchers should assess the likelihood of unmeasured confounding, for example through a sensitivity analysis (a sketch follows this table).

  • Challenge: Interpreting outcomes reported in administrative data, particularly regarding health or social services contact, is challenging without accurate and complete measures of need.

  • Suggestion: Researchers should conduct, and funders should fund, process evaluations and qualitative studies alongside quantitative impact analyses.

  • Challenge: Data approval and access delays can substantially reduce the time available for data analysis, even when applications are submitted several years before the planned grant start date.

  • Suggestion: Data providers should streamline processes to minimise data access delays and provide timely information for evidence-based policy-making.
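
To make the unmeasured-confounding suggestion above concrete, one common sensitivity analysis is the E-value of VanderWeele and Ding: the minimum strength of association, on the risk ratio scale, that an unmeasured confounder would need with both the intervention and the outcome to fully explain away an observed effect. The Python sketch below is illustrative only; it is not taken from the study, and the risk ratio and confidence interval in the usage example are hypothetical.

```python
import math


def e_value(rr: float) -> float:
    """E-value for an observed risk ratio (VanderWeele & Ding, 2017)."""
    if rr <= 0:
        raise ValueError("risk ratio must be positive")
    # Protective effects (RR < 1) are handled by taking the reciprocal.
    rr = rr if rr >= 1 else 1.0 / rr
    return rr + math.sqrt(rr * (rr - 1.0))


def e_value_for_ci(lo: float, hi: float) -> float:
    """E-value for the confidence limit closest to the null (RR = 1).

    If the interval already crosses 1, no unmeasured confounding is
    needed to shift the estimate to the null, so the E-value is 1.
    """
    if lo <= 1.0 <= hi:
        return 1.0
    limit = lo if lo > 1.0 else hi  # the limit closest to 1
    return e_value(limit)


if __name__ == "__main__":
    # Hypothetical example: observed risk ratio 0.80 (95% CI 0.70 to 0.92)
    # for an outcome such as emergency hospital admission.
    print(round(e_value(0.80), 2))               # point estimate: about 1.81
    print(round(e_value_for_ci(0.70, 0.92), 2))  # CI limit: about 1.39
```

An E-value close to 1 means that even weak unmeasured confounding could account for the observed difference, whereas a larger E-value gives more confidence that the effect is not an artefact of how the comparison group was selected.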