The British Journal of General Practice

Editorial

Br J Gen Pract. 2014 Feb;64(619):65–66. doi: 10.3399/bjgp14X676979

Ecological studies: use with caution

Catherine Saunders 1, Gary Abel 2
PMCID: PMC3905433  PMID: 24567587

BACKGROUND

Ongoing debates about the quality of NHS organisations have made ecological studies fashionable. One such study in the UK considered the association between the average clinical quality of primary care provided by primary care trusts and the trusts’ rate of admission for coronary heart disease.1 An alternative to this ecological approach, which used data aggregated to the level of a primary care trust, would have been to use data for individual patients and to ask whether there was an association between the clinical quality of care an individual receives and their own chance of being hospitalised for coronary heart disease. Because ecological studies use aggregated data, the relationship for individual patients is not directly explored, although individual relationships may often (correctly or incorrectly) be inferred from population-based analyses. Ecological studies can be descriptive, for example exploring variation between populations, or can consider associations such as the example above. In health services research, where healthcare organisations rather than individual patients are often the focus of inquiry, ecological studies are an appealing tool. For example, a recent study looking at the features of general practices associated with lower coronary heart disease mortality was more concerned with the practice at an organisational level than with individuals.2

STRENGTHS OF THIS APPROACH: OPEN DATA AND ORGANISATION ANALYSES

The availability of data describing NHS organisations has never been greater. A large volume of UK healthcare process and outcome data is becoming publicly available from the Health and Social Care Information Centre (http://www.hscic.gov.uk/) and via the government open data website (http://www.data.gov.uk/), with GP practice, hospital, and clinical commissioning group (CCG) indicators available. Indicators cover measures including population, service, clinical outcome, prescribing and patient experience. The UK Data Archive (http://data-archive.ac.uk/) is another source of publicly available data, including individual level data from health surveys. With the UK white paper from 2012 setting out the government’s strategy to make more data public,3 the amount of available data is only going to increase.

One strong advantage of using publicly available data in research is that there are no problems with data confidentiality. When looking at associations between risk factors and outcomes, linking information about individual patients often requires extensive ethical and governance approval. Linking data at the organisational level, however, does not, as the data have usually been published and are already in the public domain. Ecological studies also allow us to look nationwide, providing evidence that is potentially more generalisable than that from studies of individuals in only a small geographical area. A further strength of ecological studies is that, where data are available, potential trends over time can be explored with relative ease.

The strength of an ecological study for looking at associations at the institutional level (hospital, CCG or GP practice) is balanced by the fact that we cannot draw conclusions about individual patients from population data. We can tell whether GP practices where patients report a better experience of care are those that achieve higher Quality and Outcomes Framework (QOF) targets,4 but this tells us nothing about the association between patient experience and the quality of clinical care at the patient level. Ecological studies are at best hypothesis-generating when considering individual level associations, and care is needed to avoid the risk of ecological fallacy: assuming that associations that exist at the population level persist at the individual level. Ecological analyses that consider within-institution trends over time are less vulnerable to these problems, but they are not immune.
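
To make the ecological fallacy concrete, the following small simulation (a hypothetical sketch in Python; all numbers are invented and no real NHS data are used) constructs practices in which the patient-level association between two measures is negative within every practice, while the correlation between practice-level averages is strongly positive.

```python
# Simulated illustration of the ecological fallacy; all quantities are invented.
import numpy as np

rng = np.random.default_rng(0)
n_practices, patients_per_practice = 50, 100

x_groups, y_groups = [], []
for practice_mean in rng.normal(0, 2, n_practices):   # between-practice variation
    # x: e.g. a care-quality score; y: an outcome with a NEGATIVE within-practice slope.
    x = practice_mean + rng.normal(0, 1, patients_per_practice)
    y = practice_mean - 0.5 * (x - practice_mean) + rng.normal(0, 1, patients_per_practice)
    x_groups.append(x)
    y_groups.append(y)

# Within-practice (patient-level) association: centre each practice at its own mean.
x_within = np.concatenate([x - x.mean() for x in x_groups])
y_within = np.concatenate([y - y.mean() for y in y_groups])

# Between-practice (aggregate) association: correlate the practice means.
x_between = np.array([x.mean() for x in x_groups])
y_between = np.array([y.mean() for y in y_groups])

print("patient-level correlation within practices:",
      round(np.corrcoef(x_within, y_within)[0, 1], 2))    # approximately -0.45
print("practice-level correlation of averages:    ",
      round(np.corrcoef(x_between, y_between)[0, 1], 2))  # approximately +0.99
```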

CAUTIONS: CONFOUNDING, BIAS, AND ECOLOGICAL FALLACY

Considerations applicable to any type of epidemiological research also apply to ecological studies, for example in relation to potential confounding (where two domains of care appear associated, but in fact only because both are associated with a third, confounding, variable). Where there is confounding by individual level variables (such as case mix in clinical diagnosis or disease severity, or sociodemographic variation), and individual level data are available for one of the measures of interest, accounting for potential confounding at the ecological level is possible.5 Information may also be available at the organisational level about possible confounders, but individual level data are needed if individual level confounders are the concern.
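
As a hypothetical illustration of this kind of adjustment, the sketch below simulates an organisational-level analysis in which two measures are associated only through a shared confounder (here labelled 'deprivation'); the crude regression slope is clearly non-zero, but it largely disappears once the confounder enters the model. Variable names and effect sizes are illustrative assumptions, not taken from any of the cited studies.

```python
# Hypothetical organisational-level confounding: two measures linked only through deprivation.
import numpy as np

rng = np.random.default_rng(1)
n = 200                                          # number of organisations (illustrative)
deprivation = rng.normal(0, 1, n)                # shared confounder
quality = -0.8 * deprivation + rng.normal(0, 1, n)
admissions = 1.0 * deprivation + rng.normal(0, 1, n)   # no direct effect of quality

def ols_coefficients(y, predictors):
    """Least-squares coefficients (intercept first) for y on the given predictor columns."""
    design = np.column_stack([np.ones(len(y))] + list(predictors))
    return np.linalg.lstsq(design, y, rcond=None)[0]

print("crude slope of admissions on quality:  ",
      round(ols_coefficients(admissions, [quality])[1], 2))               # clearly negative
print("adjusted slope (deprivation in model): ",
      round(ols_coefficients(admissions, [quality, deprivation])[1], 2))  # close to zero
```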

The importance of these last two points (the potential for ecological fallacy and unmeasured confounding) when interpreting correlations observed at the organisational level cannot be overstated. A simple, unadjusted correlation of two measures at the population level has the potential for eye-catching headlines, such as the study of the association between chocolate consumption and winning a Nobel prize.6 However, the potential for ecological studies to lead to suboptimal policy-making is high; confounding and ecological fallacy mean that an unthinking analysis of associations at the organisational rather than the individual level may have far-reaching consequences. Recently it has been claimed that NHS hospitals that operate in a more competitive geographical environment have a lower mortality rate for patients with myocardial infarction.7 Whether or not this association is causal has been the subject of lively debate.8

Data completeness is also important. Complete and accurate data are incentivised in the NHS, but there remains variation in quality and validity across organisations. For example, exception reporting varies across GP practices in the UK9 and there is considerable variation in data quality in hospital-acquired infection surveillance.10 Measurement bias (where errors in data measurement are associated with healthcare organisation performance) can also be a concern, even when using standardised, publicly reported data. Further, where data are sparse, confidentiality requirements in public reporting mean that information is suppressed where it might be individually identifiable; for example, data may be disproportionately more likely to be missing for single-handed GP practices.

FURTHER CONSIDERATIONS: POWER AND RELIABILITY

Other methodological questions should also be considered. The statistical reliability of the measures in question at the organisational level is important to consider.11 Additionally, if several comparisons are being made then statistical tests should be adjusted for multiple testing. The temptation to start correlating everything with everything else, just because the data are freely available and accessible, should be avoided, and analyses should be hypothesis-led wherever possible.
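
Where many organisational-level correlations are examined, a simple correction such as the Holm–Bonferroni procedure keeps the family-wise error rate under control; the sketch below uses made-up p-values purely for illustration.

```python
# Holm-Bonferroni adjustment of a set of p-values (illustrative values only).
def holm_adjust(p_values):
    """Return Holm-adjusted p-values in the original order."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])   # indices sorted by p-value
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):
        # Step-down multiplier (m - rank), enforcing monotonicity via running_max.
        running_max = max(running_max, (m - rank) * p_values[i])
        adjusted[i] = min(1.0, running_max)
    return adjusted

raw = [0.003, 0.020, 0.041, 0.30]    # e.g. four organisation-level correlations
print(holm_adjust(raw))              # approximately [0.012, 0.06, 0.082, 0.3]
```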

Analyses also need to be adequately powered. For example, given that there are only around 160 hospitals in England, a study using all of them would have 80% power to detect a correlation of 0.22. While this would not be described as a strong correlation, it is larger than values often found in ecological studies. The fact that only relatively strong associations will ever be detected by ecological studies of this sample size potentially encourages the publication of false-positive results, as any statistically significant finding will be accompanied by a large apparent effect size. Similar cautions apply to ecological studies in general practice settings when only a small geographical area is considered (for example, within a CCG). Additionally, if the measurement of organisational performance does not have high reliability then power will be further reduced.
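
The 0.22 figure can be reproduced with the standard Fisher z approximation for a correlation coefficient; the sketch below assumes a two-sided test at the 5% significance level and is illustrative rather than part of the original analysis.

```python
# Smallest correlation detectable with given power, via the Fisher z approximation.
from math import sqrt, tanh
from scipy.stats import norm

def detectable_correlation(n, power=0.80, alpha=0.05):
    """Smallest correlation detectable with the given power and sample size (two-sided test)."""
    z_alpha = norm.ppf(1 - alpha / 2)   # critical value for a two-sided test
    z_beta = norm.ppf(power)            # quantile corresponding to the target power
    # atanh(r) is approximately normal with standard error 1 / sqrt(n - 3).
    return tanh((z_alpha + z_beta) / sqrt(n - 3))

print(round(detectable_correlation(160), 2))   # ~0.22, matching the figure quoted above
```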

BEST PRACTICE AND CONCLUSIONS

The need for good practice in working with and reporting health services research carried out using routine health data is clearly wider than epidemiological concerns about the ecological study design alone. The RECORD (REporting of studies Conducted using Observational Routinely-collected Data) statement, an extension of STROBE (STrengthening the Reporting of OBservational studies in Epidemiology), is in development and will define reporting guidelines for observational studies using health data routinely collected for non-research purposes.

Ecological studies in health services research are a powerful tool, and with the wealth of organisational level data now available there are increasing numbers of research questions for which they are the study design of choice. However, the potential for over-interpretation of results and the generation of spurious findings is ever present. Good practice in the use of routine health data for research, and the use of standard epidemiological precautions, are necessary when carrying out and interpreting these studies.

Acknowledgments

We thank Dr Georgios Lyratzopoulos (Cambridge Centre for Health Services Research) for helpful comments and his critical review of the manuscript.

Provenance

Freely submitted; not externally peer reviewed.

Competing interests

The authors have declared no competing interests.

REFERENCES

1. Bottle A, Gnani S, Saxena S, et al. Association between quality of primary care and hospitalization for coronary heart disease in England: national cross-sectional study. J Gen Intern Med. 2008;23(2):135–141. doi: 10.1007/s11606-007-0390-2.
2. Levene LS, Baker R, Bankart MJG, Khunti K. Association of features of primary health care with coronary heart disease mortality. JAMA. 2010;304(18):2028–2034. doi: 10.1001/jama.2010.1636.
3. UK Government Cabinet Office. Unleashing the potential: Open Data White Paper. CM8353. 2012. http://www.data.gov.uk/sites/default/files/Open_data_White_Paper.pdf (accessed 13 Jan 2014).
4. Llanwarne NR, Abel GA, Elliott MN, et al. Relationship between clinical quality and patient experience: analysis of data from the English Quality and Outcomes Framework and the national GP Patient Survey. Ann Fam Med. 2013;11(5):467–472. doi: 10.1370/afm.1514.
5. Rowan K, Harrison D, Brady A, Black N. Hospitals’ star ratings and clinical outcomes: ecological study. BMJ. 2004;328(7445):924–925. doi: 10.1136/bmj.38007.694745.F7.
6. Messerli FH. Chocolate consumption, cognitive function, and Nobel laureates. N Engl J Med. 2012;367(16):1562–1564. doi: 10.1056/NEJMon1211064.
7. Cooper Z, Gibbons S, Jones S, McGuire A. Does hospital competition save lives? Evidence from the English NHS patient choice reforms. Econ J. 2011;121(554):F228–F260. doi: 10.1111/j.1468-0297.2011.02449.x.
8. Bloom N, Cooper Z, Gaynor M, et al. In defence of our research on competition in England’s National Health Service. Lancet. 2011;378(9809):2064–2065; author reply 2065–2066. doi: 10.1016/S0140-6736(11)61708-X.
9. Doran T, Fullwood C, Reeves D, et al. Exclusion of patients from pay-for-performance targets by English physicians. N Engl J Med. 2008;359(3):274–284. doi: 10.1056/NEJMsa0800310.
10. Tanner J, Padley W, Kiernan M, et al. A benchmark too far: findings from a national survey of surgical site infection surveillance. J Hosp Infect. 2013;83(2):87–91. doi: 10.1016/j.jhin.2012.11.010.
11. Lyratzopoulos G, Elliott MN, Barbiere JM, et al. How can health care organizations be reliably compared? Lessons from a national survey of patient experience. Med Care. 2011;49(8):724–733. doi: 10.1097/MLR.0b013e31821b3482.
