Campbell Systematic Reviews. 2022 Dec 13;18(4):e1294. doi: 10.1002/cl2.1294

Communicating and using systematic reviews—Learning from other disciplines

Jonathan Breckon
PMCID: PMC9745728  PMID: 36908845

‘The single biggest problem with communication is the illusion that it has taken place.’ (George Bernard Shaw)

Since its creation in 2000, the Campbell Collaboration has come a long way in bolstering the use of high-quality reviews amongst policymakers and practitioners. From plain language summaries to evidence and gap maps, strategic partnerships with government departments, and much more, Campbell has prioritised systematic review knowledge mobilisation (White & Welch, 2022).

However, more could be done. Disseminating evidence alone does little to improve its use, and is likely to fail to address practical, cultural or institutional barriers to engagement (Langer et al., 2016). The danger, as captured in the George Bernard Shaw quote above, is that all the energy expended on communication—websites, meetings, reports, partnerships—can lull us into a false sense that we have made a difference.

One way we can grow our communication savviness is to look to other professions and disciplines. Too often we work in silos, and there is much we can learn from other subjects (Oliver & Boaz, 2019). This means looking beyond the ‘usual suspects’ of health research, which has dominated the field of research use, to other areas of the social sciences, the creative arts, design, marketing, environmentalism and behaviour change.

To learn from these other fields, Campbell UK & Ireland organised a series of seminars and podcasts with leading thinkers from cognitive psychology, communication studies, science and technology studies, political science, design and more. The aim was to gather actionable insights and provide pause for thought on how we go about communicating our reviews. A summary of the lessons that I think are of most value is set out below.

1. REFRAMING THE MESSAGE—INSIGHTS FROM PSYCHOLOGY

First, there is clearly a lot we can learn from psychology. For example, although it is commendable that all Campbell reviews require a plain language summary, we can always reframe our core messages for impact. To use a simplistic example from marketing, a yoghurt sold as ‘10 percent fat’ is a lot less appealing than one sold as ‘90 percent fat free’. According to the US-based FrameWorks Institute, which spoke at our first event, specific research synthesis messages can land very differently depending on what you emphasise, or indeed what you leave unsaid.

Other ‘frame elements’ can be used to help your message land, such as metaphors, storytelling and narratives (Davidson, 2017). We could probably do more to get out of our comfort zone and get creative in communicating research synthesis. Joe Langley, a Design Engineer at Lab4Living, described the value of using games, prototyping and even comics to engage users in research (Langley et al., 2022). Demand for evidence and gap maps continues to grow, but are there more visually impactful products we could try? Applied cognitive psychologist Jordan Harold at the Tyndall Centre for Climate Change Research has provided advice on enhancing the accessibility of climate change data visuals for the Intergovernmental Panel on Climate Change (IPCC) (Harold et al., 2020). Harold's ‘cognitive perceptual design principles’ could inform prototyping of new Campbell apps, interactive visual tools and decision-aids.

2. GET YOUR AUDIENCE INVOLVED

Another good piece of advice is to test communication products on your audience, perhaps with focus groups or some ethnographic work. What do your target audiences make of them? Their responses might surprise you. When I worked with the design company FutureGov on a prototype Evidence Store for the UK Government-funded What Works Centre for Children's Social Care, I was surprised by how social workers implored us to keep the website as simple as possible. They wanted us to ruthlessly strip out any superfluous images, numbers or text. We did, even if the final result looks very basic. We also, at their request, gave more contextual information, helping users see how the evidence synthesis might be implemented in everyday practice.

Such a user-centric approach to evidence visualisations was also recommended by Trish Greenhalgh, a former General Practitioner and Professor of Primary Care Health Sciences at Oxford University. In one of our seminars, Greenhalgh stressed the importance of grounding visualisations in the needs of audiences, citing the UK's Zoe Covid-19 app, which had over two million users. The fact that the general public, and not just epidemiologists, were using the app helped to create visualisations that were relevant to them, according to Greenhalgh.

3. STRUCTURED ENGAGEMENT WITH THE PUBLIC

Engaging directly with the public can be hazardous. The danger is that we lose all the hard-earned exhaustiveness and diligence of systematic reviews when working with potentially small and unrepresentative groups of experts or members of the public. A degree of rigour is needed, such as the use of Delphi panels, a staged group communication process recommended by Nibedita Mukherjee, Lecturer in Global Development at Brunel University (Mukherjee et al., 2015). For Mukherjee, Delphi panels have two main benefits for systematic reviewers. First, they can bring valuable stakeholder views into areas that are highly contested and where there is no consensus within the research literature. Second, they are beneficial when the evidence is missing and we need to go to subject experts to fill the gap.
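
To make the staged structure concrete, here is a minimal sketch of one Delphi-style rating round in Python. It is illustrative only: the 1–9 rating scale, the function names and the interquartile-range stopping rule are my own assumptions for the example, not the procedure specified by Mukherjee et al. (2015).

    # Illustrative Delphi-style round: experts rate a statement anonymously,
    # receive controlled feedback (an anonymised group summary), then revise.
    from statistics import median, quantiles

    def summarise(ratings):
        # Controlled feedback: the group median and interquartile range (IQR).
        q1, _, q3 = quantiles(ratings, n=4)
        return median(ratings), q3 - q1

    def has_consensus(ratings, max_iqr=1.0):
        # Assumed stopping rule: ratings have converged once the IQR is small.
        return summarise(ratings)[1] <= max_iqr

    # Round 1: independent ratings of a statement on an assumed 1-9 scale.
    round_1 = [3, 7, 8, 6, 9, 2]
    mid, spread = summarise(round_1)
    print(f"Feedback to panel: median={mid}, IQR={spread}")

    # Round 2: experts revise in light of the feedback; repeat until consensus.
    round_2 = [6, 6, 7, 6, 7, 6]
    print("Consensus reached" if has_consensus(round_2) else "Run another round")

In practice it is the anonymity and the controlled feedback between rounds, rather than any particular summary statistic, that give the technique its rigour.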

Whatever the preferred method of involving people—and there are many (CEDIL, 2022)—some degree of structure is needed, and one that is appropriate for what you are trying to do. Campbell is exploring the development of professional evidence-based guidelines (White & Welch, 2022), and many health guidelines strive to involve patient groups and experts. Campbell should consider which deliberative techniques are best suited to guideline production.

4. UNDERSTANDING POLITICS AND GETTING THE TIMING RIGHT

Bringing in the public and other experts has its value, but real influence can lie elsewhere: in government, regulatory or other positions of power. Political scientists can help us navigate this field (Cairney, 2019). For example, researchers should look for ‘policy windows’ (Rose et al., 2020) so that reviews land on the policymaker's desk at an opportune time. Too early and a review may be ignored; too late and the policy may be a done deal.

Getting the timing right might mean moving faster than we are used to. Jennifer Stuttle, Head of Profession for Evaluation and Government Social Research at the UK Foreign, Commonwealth & Development Office, told us how useful she found rapid reviews for her policy colleagues, compared to full-blown systematic reviews. The speed of the reviews matched the speed of policymakers' needs. But even under time pressure, Stuttle recommended squeezing in as much time as possible on getting the research questions and scoping right at the start. Failure to do so risks producing reviews that are ignored because they don't answer the right policy questions.

5. EMBRACING COMPLEXITY AND A DIVERSITY OF VIEWS

We should also not fall into the trap of thinking that simplicity and brevity are always the answer. Andy Stirling, Professor of Science and Technology Policy at the University of Sussex, who has sat on nine government advisory boards, encouraged us to be open about our disagreements and disputes, and not to pretend there is always consensus. When knowledge is uncertain, we should resist pressures to simplify policy advice and instead ‘keep it complex’ (Stirling, 2010).

Embracing complexity may mean drawing on more pluralistic sources of evidence. Trish Greenhalgh gave the example of reviews of face masks: reviews that focused purely on randomised controlled trials found little impact, while more inclusive reviews, drawing on observational studies, anthropology, engineering, basic science and mathematical modelling, were more positive about face masks (Greenhalgh, 2020). There is no static ‘body of evidence’ but a moving ‘sea of evidence’, according to Greenhalgh. We may need to be open to different types of reviews that do more to account for dynamic systems and complexity (Petticrew et al., 2019). For example, Joe Langley described how valuable he found realist synthesis in his creative codesign work (Pawson et al., 2005).

6. DON'T ASSUME READY‐MADE DEMAND FOR SYSTEMATIC REVIEWS

A priority for the future of Campbell should be to continue making the case for the value of reviews. We cannot assume there is a large untapped demand. My own recent experience working inside Whitehall and Westminster is that there is a distinct lack of understanding of systematic reviews. I have had to roll back from developing rapid synthesis methods to making the case for why policymakers should bother in the first place.

Over recent years, we have been lucky to have entrepreneurial networkers and communicators promoting systematic reviews. Yet more could be done as a whole community to make the case and to build a stronger evidence base on the costs and benefits of reviews. For example, we could marshal more evidence and case studies on the dangers of bias when using groups of experts in policy, or the pitfalls of ‘cherry-picking’ in traditional policy literature reviews.

Over the last two decades, we have come a long way in the technical sophistication of our review methods and practices. We should give equal energy to looking outwards, embracing innovations and other ways of communicating and implementing our reviews.

REFERENCES

Centre of Excellence for Development Impact and Learning (CEDIL). (2022, May 16). CEDIL Methods Brief 5. https://cedilprogramme.org/publications/cedil-methods-brief-5/

Davidson, B. (2017). Storytelling and evidence-based policy: Lessons from the grey literature. Palgrave Communications, 3, 17093. 10.1057/palcomms.2017.93

Greenhalgh, T. (2020). Face coverings for the public: Laying straw men to rest. Journal of Evaluation in Clinical Practice, 26(4), 1070–1077. 10.1111/jep.13415

Harold, J., Lorenzoni, I., Shipley, T. F., & Coventry, K. R. (2020). Communication of IPCC visuals: IPCC authors' views and assessments of visual complexity. Climatic Change, 158(2), 255–270. 10.1007/s10584-019-02537-z

Langer, L., Tripney, J., & Gough, D. (2016). The science of using science: Researching the use of research evidence in decision-making. EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London.

Langley, J., Kayes, N., Gwilt, I., Snelgrove-Clarke, E., Smith, S., & Craig, C. (2022). Exploring the value and role of creative practices in research co-production. Evidence & Policy, 18(2), 193–205. 10.1332/174426421X16478821515272

Mukherjee, N., Hugé, J., Sutherland, W. J., McNeill, J., Van Opstal, M., Dahdouh-Guebas, F., & Koedam, N. (2015). The Delphi technique in ecology and biological conservation: Applications and guidelines. Methods in Ecology and Evolution, 6(9), 1097–1109.

Oliver, K., & Boaz, A. (2019). Transforming evidence for policy and practice: Creating space for new conversations. Palgrave Communications, 5(1), Article 1.

Pawson, R., Greenhalgh, T., Harvey, G., & Walshe, K. (2005). Realist review—A new method of systematic review designed for complex policy interventions. Journal of Health Services Research & Policy, 10(Suppl. 1), 21–34. 10.1258/1355819054308530

Petticrew, M., Knai, C., Thomas, J., Rehfuess, E. A., Noyes, J., Gerhardus, A., Grimshaw, J. M., Rutter, H., & McGill, E. (2019). Implications of a complexity perspective for systematic reviews and guideline development in health decision making. BMJ Global Health, 4(Suppl. 1), e000899. 10.1136/bmjgh-2018-000899

Rose, D. C., Mukherjee, N., Simmons, B. I., Tew, E. R., Robertson, R. J., Vadrot, A. B. M., Doubleday, R., & Sutherland, W. J. (2020). Policy windows for the environment: Tips for improving the uptake of scientific knowledge. Environmental Science & Policy, 113, 47–54. 10.1016/j.envsci.2017.07.013

Stirling, A. (2010). Keep it complex. Nature, 468(7327), 1029–1031. 10.1038/4681029a

White, H., & Welch, V. (2022). Research—What is it good for? Absolutely nothing unless it is used to inform policy and practice. Campbell Systematic Reviews, 18, e1276.
