BMJ Open. 2019 Dec 17;9(12):e031510. doi: 10.1136/bmjopen-2019-031510

Informed Health Choices media intervention for improving people’s ability to critically appraise the trustworthiness of claims about treatment effects: a mixed-methods process evaluation of a randomised trial in Uganda

Daniel Semakula 1,2, Allen Nsangi 1,2, Andrew Oxman 3,, Claire Glenton 3, Simon Lewin 3, Sarah Rosenbaum 3, Matt Oxman 3, Margaret Kaseje 4, Astrid Austvoll-Dahlgren 5, Christopher James Rose 3, Atle Fretheim 3, Nelson Sewankambo 1
PMCID: PMC6937069  PMID: 31852697

Abstract

We developed the Informed Health Choices podcast to improve people’s ability to assess claims about the effects of treatments. We evaluated the effects of the podcast in a randomised trial.

Objectives

We conducted this process evaluation to assess the fidelity of the intervention, identify factors that affected the implementation and impact of the intervention and could affect scaling up, and identify potential adverse and beneficial effects.

Setting

The study was conducted in central Uganda in rural, periurban and urban settings.

Participants

We collected data on parents who were in the intervention arm of the Informed Health Choices study that evaluated an intervention to improve parents’ ability to assess treatment effects.

Procedures

We conducted 84 semistructured interviews during the intervention, 19 in-depth interviews shortly after, two focus group discussions with parents, one focus group discussion with research assistants and two in-depth interviews with the principal investigators. We used framework analysis to manage qualitative data, assessed the certainty of the findings using the GRADE-CERQual (Grading of Recommendations, Assessment, Development and Evaluations-Confidence in the Evidence from Reviews of Qualitative Research) approach, and organised findings in a logic model.

Outcomes

Proportion of participants listening to all episodes; factors influencing the implementation of the podcast; ways to scale up and any adverse and beneficial effects.

Results

All participants who completed the study listened to the podcast as intended, perhaps because of the explanatory design of the trial and the recruitment of parents with a positive attitude. Listening was also likely facilitated by having research assistants deliver the podcast and by providing the participants with MP3 players. The podcast was reportedly clear, understandable, credible and entertaining, which motivated participants to listen and eased implementation. No adverse effects were reported.

Conclusions

Participants experienced the podcast positively and were motivated to engage with it. These findings help to explain the short-term effectiveness of the intervention, but not the decrease in effectiveness over the following year.

Keywords: process evaluation, fidelity, podcast, barriers, facilitators, scaling-up, adverse effects, critical appraisal, evidence-informed decision-making, edutainment, health communication, media interventions


Strengths and limitations of this study.

  • The study employed multiple methods, both quantitative and qualitative, which allowed us to understand the findings better.

  • Numerous interviews of different kinds (eg, short postepisode evaluation interviews, in-depth interviews and focus group discussions) provided rich data from which to draw conclusions.

  • We were not able to interview participants who dropped out of the main trial. There is a possibility that those who dropped out might have had different experiences.

Background

Claims about what we should do to improve or maintain our health are abundant in the mass media and elsewhere. Some are about the effects of contemporary medicines and surgical interventions, while others are about other types of interventions, such as traditional, alternative and palliative therapies. For example, there are numerous unfounded claims in the media that vaccines cause autism and a host of adverse effects, claims that herbal remedies have no adverse effects because they are ‘natural’, and claims that using antiretroviral drugs does more harm than good. Most people lack the skills necessary to critically appraise the trustworthiness of claims about the benefits and harms of treatments.1–4 For example, many people trust their own or acquaintances’ lived experiences with health and illness more than research evidence,5 and many overestimate the benefits and underestimate the harms of treatments.6 7 People who are unable to critically assess treatment claims are prone to making inappropriate health choices or to using interventions inappropriately. Indeed, many people make decisions based on untrustworthy claims every day. For example, exaggerated and unfounded fears about purported side effects have contributed to vaccine hesitancy and non-vaccination in many parts of the world.8–10 Acting on unreliable claims can result in unnecessary suffering and death,11 and in resources wasted on ineffective and sometimes harmful treatments.12 Conversely, failure to act on trustworthy information can result in underuse of effective health services.13 A recent study found that patients who declined treatments with established effectiveness and safety had lower survival than those who did not.14 Unfortunately, many programmes simply tell people what to do, without empowering them to critically appraise health-related information. People need support to develop the skills necessary to critically assess the trustworthiness of claims about treatment effects and to make informed health choices.

To respond to this need, the Informed Health Choices (IHC) project15 16 developed and evaluated materials to enable people to understand and apply Key Concepts that are necessary for critically appraising claims about treatment effects and making informed health choices.15 16 By ‘treatment’ we mean any action intended to maintain or improve the health of individuals or communities.

As part of the IHC project, we prepared a podcast (box 1) to help improve people’s ability to assess the trustworthiness of claims about treatment effects.17 It was designed for the parents of primary school children. Each episode comprises a story (radio theatre) about a treatment claim, a message about one Key Concept that is important for assessing that claim, an explanation and an example illustrating the concept. The podcast was developed iteratively, using a human-centred design approach.18 We used feedback from the target audience on early versions to ensure that they experienced the podcast positively. The development process is described elsewhere.17

Box 1. The Informed Health Choices podcast.

The Informed Health Choices (IHC) podcast was designed to teach the parents of primary school children to assess claims about treatment effects and to make informed health choices. Each episode included a short story with an example of a treatment claim, a simple explanation of a concept used to assess that claim, and another example of a claim illustrating the same concept with its corresponding explanation. In each story, there was a question about the trustworthiness of a claim, which was resolved by applying the relevant Key Concept.15 All episodes had a conclusion with a take-home message emphasising the concept. The examples used in the podcast were claims about treatments for health conditions such as malaria, diarrhoea and HIV/AIDS. We also included claims about some common practices, such as contraception, which were of interest to our audience at the time.

The topics and claims were identified by scanning recent mass media reports and interviewing parents. There are eight main episodes in the series, covering the nine Key Concepts listed below. Each episode lasted about 5 min. One of the episodes (episode one) covered two closely related Key Concepts (1 and 9 below). Two additional episodes introduced the podcast and summarised the key messages from the first eight episodes, respectively. The final structure, content and presentation of each episode were developed using a human-centred design approach.17 This involved many iterations informed by feedback from various stakeholders, including parents in our target audience, on the content to be included and how to present it in each episode. Each episode of the podcast was produced in two languages: English and Luganda. Parents could choose to listen to the podcast in either language according to their preferences.


The nine Key Concepts included in the podcast: 17 46

  • Treatments may be harmful. People often exaggerate the benefits of treatments and ignore or downplay potential harms. However, few effective treatments are 100% safe. (Included in Episode 1)

  • Personal experiences or anecdotes (stories about how a treatment helped or harmed someone) are an unreliable basis for determining the effects of most treatments. (Included in Episode 3)

  • A treatment outcome may be associated with a treatment, but not caused by the treatment. (Included in Episode 4)

  • How widely or how long a treatment is used is not a reliable indicator of how beneficial or safe it is. Treatments that have not been properly evaluated but are widely used or have been used for a long time are often assumed to work. Sometimes, however, they may be unsafe or of doubtful benefit. (Included in Episode 5)

  • Opinions of experts or authorities do not alone provide a reliable basis for deciding on the benefits and harms of treatments. Doctors, researchers, patient organisations and other authorities often disagree about the effects of treatments. This may be because their opinions are not always based on systematic reviews of fair comparisons of treatments. (Included in Episode 6)

  • Evaluating the effects of treatments depends on making appropriate comparisons. If a treatment is not compared to something else, it is not possible to know what would happen without the treatment, so it is difficult to attribute outcomes to the treatment. (Included in Episode 2)

  • Comparisons of treatments must be fair. Apart from the treatments being compared, the comparison groups need to be similar at the beginning of a comparison (ie, ‘like needs to be compared with like’). (Included in Episode 7)

  • The results of single comparisons of treatments (trials) can be misleading. A single comparison of treatments rarely provides conclusive evidence, and results are often available from other comparisons of the same treatments. These other comparisons may have different results or may help to provide more reliable and precise estimates of the effects of treatments. (Included in Episode 8)

  • Because treatments can have harmful effects as well as beneficial effects, decisions should not be based on considering only their benefits. Rather, they should be informed by the balance between the benefits and harms of treatments. Costs also need to be considered. (Included in all Episodes)

You can download the English version of the podcast via Soundcloud, or listen to it here: https://www.youtube.com/watch?v=_QVdkJIdRA8&list=PLeMvL6ApG1N0ySWBxPNEDpD4tf1ZxrBfv


Checklist

We also made a checklist summarising the key messages from the podcast.

In a randomised trial, we evaluated the effects of the IHC podcast on parents’ ability to assess claims about the benefits and harms of treatments.19 In a linked trial, we assessed the effectiveness of the IHC primary school resources in improving the ability of children in the fifth year of primary school (age 10–11) to assess treatment claims.20 Participants in the podcast trial and the process evaluation were parents of primary school children in the central region of Uganda whose schools participated in the IHC primary school resources trial. Results from both trials initially showed a large improvement in participants’ ability to assess the trustworthiness of treatment claims. However, follow-up assessments (described elsewhere) revealed that the parents’ critical appraisal skills decayed substantially over the following year,21 whereas the children’s and their teachers’ abilities did not.22 In that study, skills retention (or decay) was assessed by comparing scores in the intervention group immediately after the intervention with scores in the same group a year later. These results are reported in greater detail elsewhere.21 The overall goal of the process evaluation was to provide information that could help explain the results observed in the trials (impact) and to identify other effects not reported in the trial. Whereas randomised trials are useful for answering questions about the effect of an intervention, they may not provide sufficient evidence about how an intervention works in a specific setting, why it does or does not cause the observed effects, and why interventions might work differently in different contexts. This is even more relevant for complex interventions like the IHC media resources, which have multiple interacting components. A process evaluation done alongside a randomised trial can provide useful evidence about the implementation process and other factors that contribute to explaining the effects of an intervention.23 24 Some of the text in the background and methods sections of this manuscript reproduces information we have reported in the study protocol.18 We reuse it here to provide context for readers who may not have ready access to the protocol.

The specific objectives of this process evaluation were to:

  1. Assess the fidelity of the intervention (whether it was delivered and used as intended).

  2. Identify factors affecting the implementation, impact and potential scaling up of the intervention.

  3. Identify other potential adverse and beneficial effects of the intervention.

The second objective above combines the second and third objectives in the study protocol.18

Methods

As described in detail in the study protocol, this was a multimethod study using qualitative and quantitative data.18 Our approach is summarised in figure 1. The podcast trial employed 29 research assistants who visited the participants and played the podcast episodes at the participants’ preferred listening venue and time. Participants in the trial could choose whether to listen to the podcast in English or Luganda. At each visit, the research assistants played one or two episodes of the podcast. In addition, all participants were given the complete podcast on MP3 players to play at their convenience. In the podcast group, 288 of 334 (86%) participants completed the trial. In the control group, which listened to a series of public service announcements about health issues delivered in the same way, 273 of 341 (80%) completed the trial. Data for the process evaluation were collected from participants in the intervention group who completed the trial. The research assistants recorded when each participant completed listening to each episode, and the number of times each participant reported independently listening to each episode.

Figure 1. Schematic overview of the process evaluation.

Frameworks underlying this process evaluation

We used three frameworks to guide the collection and analysis of the data. We adapted Carroll and colleagues’ framework for implementation fidelity25 to explore factors related to fidelity (table 1). We developed a framework for factors that could affect the implementation, impact or scaling up of the intervention (table 2) by reviewing relevant frameworks for health promotion activities, mass media campaigns, health innovations, health education and guideline implementation,26–31 and the framework that we used in the process evaluation of the IHC primary school resources.32

Table 1.

Considerations for assessing fidelity of the podcast

Domain | Factors | Explanation
Adherence | Delivery of the podcast, MP3 player and checklist | The extent to which we delivered the podcast to the parents as planned. Research assistants were to visit participants six times and to play all of the episodes and recaps of previous episodes for the participants. In addition, we gave the participants MP3 players with the podcast, which they could listen to at their convenience. We also gave the participants a checklist summarising the key messages from the podcast.
 | Listening to the podcast | The number of podcast episodes that parents listened to; the extent to which participants completed listening to each episode.
 | Repetition | The number of recaps that participants listened to; the number of times participants listened to each episode; whether and how participants used the checklist.

Table 2.

Factors that could affect the impact of the podcast

Domain | Factors | Explanation
Intervention | Amount of podcast that was heard (fidelity) | The extent to which the listener listened to all of the podcast.
 | Value of the content | The extent to which the podcast is valued by the listeners.
 | Quality of the podcast: |
 |  Clarity of the podcast | The extent to which the language and key messages are clear and understandable.
 |  Length | The extent to which the length of each episode and the number of episodes is adequate or too long.
 |  Organisation of the podcast | The extent to which the podcast is well organised, including the structure of each episode and the organisation of episodes.
 | Listening pattern | Suitability of the frequency and spacing of the episodes.
 | Delivery of the podcast | The extent to which the type of media used (podcasts delivered by a research assistant) facilitated or hindered listening to the podcasts and reflecting on them.
 | Appropriateness of the podcast | The extent to which the podcast is appropriate for the target audience (parents), relevant to them and engages them (including the examples and stories that are used).
 | Credibility of the podcast | The extent to which the listeners perceive the podcast as credible.
 | Effort | The amount of effort required to listen and learn the key messages.
 | Entertainment | The extent to which the podcast is interesting (does not bore the listeners), is well produced with good sound and presents content in a way that appeals to the listeners.
Target audience | Education | The extent to which the listener has sufficient background knowledge to understand the key messages.
 | Attitudes | Listener’s attitudes towards learning, towards authorities, towards science or towards critical thinking.
 | Listeners’ expectations | The extent to which what listeners are expecting (eg, expecting to be told what to do) affects their ability to understand the key messages.
 | Beliefs | Listener’s beliefs about the content (eg, what treatments work or the concepts) or beliefs that are in conflict with the content.
 | Motivation to listen and learn | Listener’s motivation to listen and learn.
 | Preferences or experiences* | Listener’s preferences for or experiences with healthcare generally or specific types of healthcare and information about treatments that influence the listener’s interest, attitudes or beliefs.
 | Self-efficacy* | The extent to which the listener feels competent and confident about being able to learn and use the messages.
 | Access to healthcare and information about treatments* | Availability or unavailability of healthcare generally or specific types of healthcare and information about treatments that influences the listener’s interest, attitudes or beliefs.
Environment | Child’s school environment | The extent to which their children’s school influenced their attitudes towards the podcast.
 | Listening environment and technology | The extent to which there were distractions, good acoustics or other listeners that helped or hindered listening, and whether the technology used to play the podcasts functioned appropriately.
 | Competing messages | The extent to which other messages in the media are in conflict with or reinforce the messages and examples used to illustrate the messages.
 | Time constraints | The extent to which there is sufficient time to listen to the podcast.
 | Access to the podcast | The extent to which the research assistants delivering the podcasts in the trial facilitated or hindered listening to the podcasts and reflecting on them.
 | Listening pattern | The extent to which the frequency of visits and the number of episodes listened to at each visit facilitated or hindered listening to the podcasts and reflecting on them.
 | Competing priorities | The extent to which other priorities limit listening to the podcast and reflecting on the key messages.
 | Attitudes and beliefs of others | Attitudes or beliefs of family, friends, neighbours, colleagues, authorities or others that influence the listener’s interest in the key messages.
 | Political environment | Elements of the political environment that affect listening to the podcast and learning the key messages; for example, the extent to which the political environment discourages or encourages questioning of information and ideas.

We developed a list of potential adverse and beneficial effects for the third framework (table 3). That list was based on pilot and user testing of the podcast and the IHC primary school resources, discussions with other researchers about potential benefits and harms, and wider discussions about the benefits and harms of interventions to promote evidence-informed decision-making.

Table 3.

Potential adverse and beneficial effects of the podcast

Potential adverse effects | Corresponding beneficial effects
Distrust of health professionals or conflict between participants and health professionals | Appropriate questioning of health professionals, better understanding and better healthcare.
Conflict between religious beliefs and scientific principles | Engagement of participants and others in discussion about religious beliefs and science.
More difficult decision-making about healthcare | More thoughtful and informed decisions about healthcare.
Nihilism or cynicism | Healthy scepticism and appreciation of science.
Anxiety or discomfort with uncertainty | Understanding and acceptance of uncertainty.

Other potential beneficial effects
Impacts on children or others | The podcast might indirectly improve children’s understanding and ability to apply the concepts being learnt by the parents, or the podcast might be shared with others in the household or other contacts of the study participants.
Awareness of the basis for claims about treatment effects | Participants becoming more aware and thinking critically about the basis for claims about treatment effects.
Attitudes and behaviours towards evidence of treatment effects | Participants desiring and asking for evidence supporting claims about treatment effects.
Awareness, attitudes and behaviours in relation to other types of causal claims | Participants becoming more aware and thinking critically about the basis for causal claims not related to treatments, and desiring and asking for evidence supporting those claims.
Questioning more | Participants asking more questions and not taking things for granted.
Engagement in informed discussions about policies | Participants becoming more engaged in discussions about health policies, and desiring and asking for evidence supporting claims about health policies.
Impacts on other types of decisions | Participants making more thoughtful and informed decisions about interventions or activities that are not related to health.

Qualitative data collection

We included participants who chose to listen to the podcast in either English or Luganda. To capture the opinions, views and experiences of a wide range of participants, we purposively sampled parents according to education level (primary, secondary and tertiary), and whether their children were in a school that was in the intervention or control arm of the IHC primary school trial.33

We used a variety of methods to collect data, including brief semistructured interviews during the intervention, in-depth post-intervention interviews, observations and focus group discussions. We pretested all data collection tools and research assistants received training on methods for qualitative data collection. We conducted mock interviews among investigators and research assistants to familiarise ourselves with the interview questions and to ensure consistency among interviewers and across questions.

Post-episode and post-intervention interviews with parents

At the end of each visit, the research assistants conducted brief semistructured interviews with parents. Using an episode evaluation form,33 they asked them for their immediate perceptions about the episode. After participants had listened to all of the episodes, we conducted in-depth interviews with some of them. These in-depth interviews were recorded and transcribed.

Observations

The research assistants delivering the podcast recorded observations made at each visit in a study log, which were discussed at weekly meetings. The principal investigators also kept a notebook where they recorded observations from field visits, informal consultations, weekly meetings and other contacts with participants and research assistants during and after the trial.

Focus group discussions with parents and research assistants

We conducted a series of focus group discussions, with four to six participants in each group. Each group was moderated by a facilitator using a guide33 and assisted by an observer who took notes. These were also recorded and transcribed. We conducted one focus group discussion with the research assistants to explore their experiences delivering the podcast and their interactions with parents.

Interviews with the lead investigators

DS and AN were responsible for implementing the intervention. Given the importance of their role in the trial and the process evaluation, two of the other investigators (CG and SL) interviewed them to explore their thoughts and experiences and how these may have influenced decisions they made in the process evaluation.

In total, we conducted 84 brief semistructured interviews at the end of visits during the intervention; 20 in-depth postintervention interviews; two focus group discussions with parents; one focus group discussion with research assistants and two in-depth interviews with the principal investigators. The number of interviews was determined largely pragmatically. We made a judgement, based on the emerging data, about whether more interviews or focus groups were needed. In making this judgement, we considered the variation in issues emerging from the interviews and focus groups, and the extent to which we were able to explain these variations. We planned not to conduct more than 30 in-depth interviews and six focus group discussions, mainly because of time and resource constraints.34 35

Data analysis

To assess fidelity, we computed the proportion of participants who listened to each episode among those who completed the evaluation tool used in the IHC podcast trial. We used logistic regression to explore the relationship between listening frequency and participants’ scores on the test used as the primary outcome measure in the trial.
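To illustrate the kind of model this describes, the sketch below fits a binomial logistic regression of test performance on self-reported listening frequency. This is a minimal sketch and not the authors’ analysis code: the data file and the column names (times_per_day, days_listened, n_correct, n_questions) are hypothetical.

```python
# Minimal sketch of a binomial logistic regression of test performance on
# self-reported listening frequency (hypothetical data file and column names).
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("podcast_listening.csv")  # one row per intervention-arm participant

# Treat each participant's test score as a binomial outcome:
# n_correct successes out of n_questions test items.
endog = pd.DataFrame({
    "n_correct": df["n_correct"],
    "n_wrong": df["n_questions"] - df["n_correct"],
})
exog = sm.add_constant(df[["times_per_day", "days_listened"]])

# A binomial GLM with the default logit link is logistic regression on the items.
result = sm.GLM(endog, exog, family=sm.families.Binomial()).fit()

# Exponentiate coefficients to obtain odds ratios with 95% CIs per additional
# listening occasion per day and per additional listening day.
summary = pd.concat([np.exp(result.params), np.exp(result.conf_int())], axis=1)
summary.columns = ["OR", "2.5%", "97.5%"]
print(summary)
```

Exponentiated coefficients give odds ratios of the form reported in the fidelity results (eg, an OR per additional time per day that a participant listened on their own).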

To analyse the qualitative data, we used a framework thematic analysis approach, guided by the three frameworks described above.36 This approach includes four stages: familiarisation, coding, charting and interpretation of the data. We applied all three frameworks to the data described above. Two of the investigators (DS and AN) independently read and reread the transcripts from the interviews, focus groups and observations. They then coded the data until all the transcripts had been reviewed. For each framework, the definitions and boundaries of each factor were discussed among the investigators, and the frameworks were revised in line with categories that emerged from the data. We then charted the data by writing a summary that distilled the findings for each framework factor. Finally, using the summarised data, we explored the range and nature of the findings, grouping them into broader themes and looking for possible explanations.

We summarised the key findings and assessed our confidence in each important finding using the GRADE-CERQual approach, a transparent method for assessing the confidence in evidence from reviews of qualitative research.37 The full form of GRADE is: Grading of Recommendations, Assessment, Development and Evaluations, while that of CERQual is: Confidence in the Evidence from Reviews of Qualitative Research. When applying the GRADE-CERQual approach, we assess four components: methodological limitations, data adequacy, coherence and relevance as explained below.

  • Methodological limitations: ‘The extent to which there are concerns about the design or conduct of the primary studies that contributed evidence to an individual review finding’.

  • Data adequacy: ‘An overall determination of the degree of richness and quantity of data supporting a review finding’.

  • Coherence: ‘An assessment of how clear and cogent the fit is between the data from the primary studies and a review finding that synthesises that data. By “cogent”, we mean well supported or compelling’.

  • Relevance: ‘The extent to which the body of evidence from the primary studies supporting a review finding is applicable to the context (perspective or population, phenomenon of interest, setting) specified in the review question’.

Although CERQual has been designed for findings emerging from qualitative evidence syntheses, several components of the approach are suitable for assessing findings from a single study with multiple sources of qualitative data.

We used a logic model to organise the findings of the process evaluation with the findings of the trial. Firstly, DS and AN organised the findings into chains of events that might have led to the outcomes of the trial and additional outcomes that were explored (table 3). Findings and outcome measures were categorised as attributes of the intervention, effect modifiers, intermediate outcomes, and observed and potential effects. We organised these elements into chains of events, discussed them and revised them iteratively until there was agreement on a final model.

Patient and public involvement

We had an advisory panel made up of members of the public who deliberated and advised on different aspects of the study implementation. During the design of the intervention (the IHC podcast), members of the public provided feedback which we used to improve the podcast. Some participants helped with recruitment by inviting their colleagues to recruitment meetings. The results of these studies will be disseminated to each group of parents at the schools from which they were recruited.

Results

The main findings, including our confidence in each finding, are summarised in table 4, and organised into a logic model in table 5.

Table 4.

Summary of the main qualitative findings

Summary of the main findings | Methods and/or data sources contributing to the study finding | CERQual assessment of confidence in the finding | Explanation of CERQual assessment
Value of the content
All those interviewed found the IHC podcast to be valuable. They felt that it provided relevant information and new knowledge and skills for assessing health information.
Two focus group discussions, 15 individual interviews and responses to the test completed immediately after listening to the podcast | Moderate | Minor concerns regarding methodology, relevance, coherence or adequacy of the data.
Quality of the podcast
Clarity of the podcast
 The podcast was clear and understandable to people in the target audience for which it was prepared.
  •  Offering the parents the podcast in their first language (Luganda) made it clearer to them.

Two focus group discussions and 16 individual interviews | Moderate | Minor concerns regarding methodology limitations, relevance, coherence or adequacy of the data.
  • Listeners felt that the explanations that the IHC podcast provided were clear and sufficient and that any questions they had were answered by the end of each episode.

All three focus group discussions and 18 individual interviews | High | Very minor concerns regarding methodology limitations, relevance, coherence or adequacy of the data.
Length of the episodes and podcast
  •  For the most part, participants felt that the length of the podcast episodes and the number of episodes was appropriate.

One focus group discussion and 14 individual interviews | Moderate | Minor concerns regarding methodology limitations, no concerns regarding relevance. No concerns regarding coherence, moderate concerns regarding adequacy of data.
  • When participants complained about length, it was mostly because they perceived the episodes to be long. Long episodes could have influenced how some participants understood the messages of the podcast.

One focus group discussion and seven individual interviews | Low | Moderate methodology limitations (data are from individual interviews only); no concerns regarding relevance. No concerns regarding coherence, moderate concerns regarding adequacy of data.
Organisation of the podcast
  •  Participants felt that the podcast was well organised, although the reasons that they gave for this varied.

Two focus group discussions and six individual interviews | Low | Minor concerns regarding methodology. No concerns regarding relevance. Moderate concerns regarding coherence, serious concerns regarding adequacy of data.
Listening pattern
  •  The majority of participants found it suitable to listen to two episodes per week for about 7 weeks when visited by the research assistants, and to be able to listen to the podcast at their convenience after that.

Nine individual interviews | Low | Moderate concerns regarding methodology (data from only individual interviews). No concerns regarding relevance or coherence but there are moderate concerns regarding the adequacy of data. Data were from less than half of the interviewees.
  • Episodes were well spaced. Listening to the podcast once a week was sufficient.

11 individual interviews | Low | Moderate concerns regarding methodology limitations (data from individual interviews only). No concerns regarding relevance or coherence but there are moderate concerns about the adequacy of data. Data were from slightly more than half of the interviewees.
Delivery of the podcast
  •  A podcast delivered by research assistants facilitated listening to the entire podcast and reflecting on it by making it convenient to listen and providing personal support. It also made it possible for others (family and neighbours) to listen to the podcast together with the participants.

All three focus group discussions and almost all (17) individual interviews | High | Very minor concerns regarding methodology limitations, relevance, coherence or adequacy of data.
Child’s school environment
  •  Some parents, whose children were in intervention schools, were motivated to participate by their children and wanting to learn what their children were learning.

Two focus group discussions and 11 individual interviews | Moderate | Minor concerns regarding methodology limitations, no concerns regarding relevance. No concerns regarding coherence, moderate concerns regarding adequacy of data.
  • Parents were motivated to participate by headteachers and teachers, whom they trusted.

Two focus group discussions and 11 individual interviews | Moderate | Minor concerns regarding methodology limitations, no concerns regarding relevance. No concerns regarding coherence, moderate concerns regarding adequacy of data.
  • Few or no parents attended meetings or were recruited to participate at some schools.

Observations from investigators’ notes | Low | Serious concerns regarding methodology limitations, moderate concerns regarding relevance. No concerns regarding coherence, moderate concerns regarding adequacy of data.
Education of the target audience
  •  In general, parents’ level of formal education did not appear to influence how they listened to the podcast or their overall understanding of the podcast.

One focus group discussion, 10 individual interviews and parents’ scores on a test completed immediately after listening to the podcast | Moderate | Minor concerns regarding methodology (data from one focus group discussion, 10 interviews and quantitative results from the test completed immediately after listening to the podcast). No concerns regarding relevance or coherence but there are minor concerns regarding the adequacy of data as most of it came from 10 interviews.
  • Participants’ level of formal education and comfort with numbers may have had an impact on their understanding of Key Concepts that small studies and single studies can be misleading.

One focus group discussion and three individual interviews | Low | Serious concerns regarding methodology (data are from three individual interviews and one FGD); no concerns regarding relevance. No concerns regarding coherence but there are serious concerns regarding adequacy of data.
Participants’ attitudes
  • Participants had positive attitudes towards learning new information, science and critical thinking.

Two focus group discussions and 19 individual interviews | Moderate | Minor concerns regarding methodology, relevance, coherence or adequacy of data.
Listening environment and technology
  •  Most participants did not encounter difficulties while listening to the podcast. A quiet listening environment and making sure that the batteries in the portable media player are charged could help prevent interruptions and facilitate listening.

Two focus group discussions and almost all (17) individual interviews | Moderate | Minor concerns regarding methodology, relevance, coherence or adequacy of data.
  • Having a mechanism (an MP3 player) that allowed participants to store and replay all the episodes enabled the parents to listen more frequently and at their own convenience.

Two focus group discussions, 17 individual interviews and quantitative results | Moderate | Minor concerns regarding methodology, relevance, coherence or adequacy of data.
  • Participants who were in a busy and noisy place found difficulties listening, which might have affected how they listened and understood the content of the podcast.

Observations from two focus group discussions, 11 individual interviews and investigators’ notes | High | Very minor concerns regarding methodological limitations, relevance, coherence or adequacy.
Participants’ expectations
  •  Some participants expected to hear messages about how to manage common health conditions rather than messages about how to assess the trustworthiness of treatment claims. Nonetheless, most participants understood the purpose of the podcast after listening to it and most listened to the entire podcast.

18 individual interviews, one focus group discussion and investigators’ observation notes | High | Very minor concerns regarding methodological limitations, relevance, coherence or adequacy.
Participants’ beliefs
  •  Many participants had prior beliefs about treatments that were in conflict with messages in the IHC podcast. This did not appear to interfere with their listening to the podcast but might have affected their understanding of the podcast.

Two focus group discussions and almost all 20 individual interviews | High | Very minor concerns regarding methodology, relevance, coherence or adequacy of the data.
  • Some of the participants’ beliefs persisted after listening to the podcast.

Two focus group discussions and four individual interviews | Low | Moderate concerns regarding methodology (data are from individual interviews and FGDs), no concerns regarding relevance or coherence, but there are serious concerns regarding adequacy of the data.
Appropriateness of the podcast
  •  Parents found the podcast to be relevant and engaging.

Two focus group discussions and 16 individual interviews | Moderate | Minor concerns regarding methodology, relevance, coherence and adequacy of the data.
Credibility
  •  Participants found the podcast to be credible. They attributed this mainly to the high quality of production, the believable messages and the fact that it was produced by a reputable organisation.

Two focus group discussions and 14 individual interviews | Moderate | Minor concerns regarding methodology, relevance, coherence or adequacy of data.
Effort
  •  Participants felt that the podcast required very little effort to listen to.

One focus group discussion and eight individual interviews | Moderate | Minor concerns regarding methodology, relevance, coherence or adequacy.
Entertainment
  •  All those interviewed found the IHC podcast and song to be entertaining and engaging. The skits made the explanations non-threatening, facilitated understanding and made the messages memorable.

Two focus group discussions and almost all individual interviews | High | Very minor concerns regarding methodology, relevance, coherence or adequacy.
Motivation to listen and learn
  •  Key factors that motivated participants to listen to the podcast included the perceived value of what they were learning, its practical application to daily life, and that the podcast was entertaining and enjoyable.

Two focus group discussions and almost all individual interviews | High | Very minor concerns regarding methodology, relevance, coherence and adequacy of data.
Competing messages
  •  Participants listened to competing messages, but those messages did not appear to have influenced how they listened to the podcast.

13 individual interviews | Low | Moderate concerns regarding methodology (data from individual interviews only). No concerns regarding relevance or coherence but there are moderate concerns about the adequacy of data.
  • Listening to the IHC podcast led participants to be more critical and aware of health advice that was given in other messages without providing a basis for the advice.

Two focus group discussions and seven individual interviews | Moderate | Minor concerns regarding methodology (data from individual interviews only). No concerns regarding relevance or coherence but there are moderate concerns about the adequacy of data, having come from only seven interviews.
Adverse and beneficial effects
  •  Listening to the IHC podcast led some participants to question more and be more critical of claims unrelated to health and treatments.

Two focus group discussions and eight individual interviews | Moderate | Minor concerns regarding methodology. No concerns regarding relevance or coherence. Moderate concerns regarding adequacy of the data.

FGDs, focus group discussions; IHC, Informed Health Choices.

Table 5.

Logic model for the factors influencing implementation and effect of the intervention

The IHC podcast intervention | Effect modifiers | Intermediate effects | Desirable effects
Facilitators
Factors that facilitated implementation and potential desirable effects
Value of the podcast
  • All those interviewed found the IHC podcast to be valuable. They felt that it provided relevant information and new knowledge and skills for assessing health information.


Quality of the podcast
  • The podcast was clear and understandable to people in the target audience for which it was prepared.

  • Listeners felt that the explanations that the IHC podcast provided were clear and sufficient and that any questions they had were answered by the end of each episode.

  • For the most part, participants felt that the length of the podcast episodes and the number of episodes was appropriate.

  • Participants felt that the podcast was well organised, although the reasons that they gave for this varied.


Delivery of the podcast
  • A podcast delivered by research assistants facilitated listening to the entire podcast and reflecting on it by making it convenient to listen and providing personal support. It also made it possible for others (family and neighbours) to listen to the podcast together with the participants.

  • The majority of participants found it suitable to listen to two episodes per week for about 7 weeks when visited by the research assistants, and to be able to listen to the podcast at their convenience after that.

Child’s school environment
  • Some parents, whose children were in intervention schools, were motivated to participate by their children and wanting to learn what their children were learning.

  • Parents also were motivated to participate by headteachers and teachers, whom they trusted.


Education of the target audience
  • Participants’ level of formal education did not appear to influence how they listened to the podcast or their overall understanding of the podcast but may have affected the extent to which they retained what they learnt.


Participants’ attitudes
  • Participants had positive attitudes towards learning new information, science and critical thinking.


Listening environment and technology
  • Most participants did not have a problem listening to the podcast. A quiet listening environment and making sure that the batteries in the portable media player are charged could help prevent interruptions and facilitate listening.

Appropriateness of the podcast
  • Participants found the podcast to be relevant and engaging.


Credibility of the podcast
  • Participants found the podcast to be credible.


Effort
  • It required very little effort to listen to the podcast.


Entertainment
  • All those interviewed found the IHC podcast and song to be entertaining and engaging. The skits made the explanations non-threatening, facilitated understanding and made the messages memorable.


Motivation to listen and learn
  • Key factors that motivated participants to listen to the podcast included the perceived value of what they were learning, its practical application to daily life, and that the podcast was entertaining and enjoyable.


Competing messages
  • Listening to the IHC podcast led some participants to be more critical and aware of health advice that was given in other messages without providing a basis for the advice.

Observed effects
  • Parents who listened to the IHC podcast in the trial were better able to assess the trustworthiness of treatment claims, compared with parents in the control group.

  • After 1 year, there was a large relative reduction in the ability to assess the trustworthiness of treatment claims among participants who listened to the IHC podcast compared with those who did not.


Potential effects
  • Listening to the IHC podcast led some participants to be more critical and aware of health advice that was given in other messages without providing a basis for the advice.

  • Listening to the IHC podcast led some participants to become more thoughtful about claims not related to health.

Factors that could facilitate scaling up

  • A well-designed podcast may appeal to many people in the target audience and be convenient.

  • Introducing the IHC podcast through primary schools that are using the IHC primary school resources may be an effective strategy for disseminating the podcast to many parents and others in the community.

Barriers
Factors that impeded implementation and potential desirable effects
Length of the episodes
  • Some episodes were reportedly long, which may have made them confusing.

Child’s school environment
  • Few or no parents attended meetings or were recruited to participate at some schools. The reasons for this are uncertain.

Observed effects
  • No adverse effects were reported by participants or observers in the trial.

Participants’ expectations
  • Some participants expected to hear messages about how to manage common health conditions rather than messages about how to assess the trustworthiness of treatment claims. Nonetheless, most participants understood the purpose of the podcast after listening to it and most listened to the entire podcast.


Participants’ beliefs
  • Many participants had prior beliefs about treatments that were in conflict with messages in the IHC podcast. This did not appear to interfere with their listening to the podcast but might have affected their understanding of the podcast.


Listening environment and technology
  • Participants who were in a busy and noisy place found difficulties listening, which might have affected how they listened and understood the content of the podcast.

Potential effects
  • Some participants mentioned that there might be a potential for scientific information to conflict with traditional and religious beliefs. However, we did not observe any conflicts, and no participant reported having experienced these as a result of listening to the IHC podcast.

Factors that could impede scaling up

  • Delivery of the podcast by research assistants is not feasible on a large scale.
  • The ability to reach parents through schools may depend on how much interest and enthusiasm is shown by head teachers and teachers. This, in turn, may depend on effective outreach to introduce the IHC podcast together with the IHC primary school resources into schools.

  • Many people in the target audience (parents of primary school children) may not initially be interested in learning new information, science and critical thinking.

  • Limited availability of portable listening devices may limit dissemination of the podcast.

IHC, Informed Health Choices.

Fidelity

Almost all participants (99.7%) who completed the trial listened to all the episodes as intended (online supplementary additional file 1). They listened to the podcast on their own an average of 2.2 times per day (SD 1.1) for an average of 4.6 days (SD 2.1) (figures 2 and 3). Participants’ scores on the test used to measure their ability to assess the trustworthiness of treatment claims were associated with the number of times per day (OR 1.3; 95% CI 1.2 to 1.4; p<0.001) and the number of days (OR 1.2; 95% CI 1.2 to 1.3; p<0.001) that they listened to the podcast on their own (figure 4).

Figure 2. Times per day that participants listened to the podcast.

Figure 3. Number of days that participants listened to the podcast.

Figure 4. Explanatory factors: test score by listening frequency.

Supplementary data

bmjopen-2019-031510supp001.pdf (54.7KB, pdf)

Factors affecting the implementation, impact and scaling up of the intervention

Findings related to the intervention

All those interviewed described the podcast as valuable. They reported that it was informative and improved their knowledge about assessing health information, and their confidence in challenging wrong beliefs and claims about treatments. Some noted it gave them confidence to discuss health issues with health workers, while others described how it taught them to be more careful in making choices about treatments.

Almost all those interviewed described the podcast as clear. They attributed this to the language—including the dialect, the vocabulary, the presentation style, the familiar setting of the scenarios and illustrations used and the organisation of the content within each episode. Some participants noted that being able to listen to the podcast in Luganda was helpful because it was the language they understood best. They also noted that technical jargon was introduced and discussed in a manner that made it accessible to people with limited or no formal education, and to people without prior experience with the conditions being discussed. They mentioned that within each episode, the organisation of the content made it easier to follow the explanations and key messages, while the friendly demeanour of the characters in the stories made the podcast more understandable and enjoyable. The detailed explanations, the reiteration of the key messages at the end of each episode, and the length of the episodes all reportedly contributed to the podcast’s clarity.

All participants reported that the number of episodes was appropriate. While the episodes varied in length, most participants described them as being of appropriate length, although some expressed discontent with episodes they perceived to be short. Research assistants, on the other hand, observed that participants sometimes became tired and seemed bored when they listened to long episodes.

Almost all participants reported that they were able to listen to all the episodes because the research assistants were diligent in visiting and playing the episodes to them. Some noted that the research assistants played back the episodes whenever the participants needed to listen to them again, while others reported asking the research assistants questions about the project, to which they got timely responses.

Participants reported that the organisation of the episodes made them easy to follow. This included how each episode could stand alone with a complete message, how the series started with easier concepts and how it included recaps of previous episodes. These were described as attributes that made it easier for them to learn.

The majority of participants reported listening to at least two episodes per week for about 7 weeks using the portable media players. Most said this listening pattern was appropriate, even for those who had busy schedules. Some participants reported continuing to listen to the podcast on their own until they completed the test.

Potential effect modifiers

The participants’ education level ranged from no formal education to tertiary education, with the majority having completed no more than primary school. Most of those who had tertiary education were teachers.

Some participants reported that there were some messages that they would have understood better if they had had more knowledge of mathematics. Specifically, they reported difficulty understanding the Key Concept that small studies can be misleading. However, many of the parents whom we interviewed reported that their level of education did not have a big influence on how they understood the general message of the podcast. This is consistent with there not being a clear association between level of education and the size of effect immediately after the parents listened to the podcast.19

The schools facilitated meetings between parents and the research team. They provided meeting venues and set aside time in their school programmes for the teachers and parents to meet the research team. This collaboration likely gave credibility to the project. It also facilitated engagement and recruitment of parents for the podcast trial. Some schools encouraged children to share with their parents what they were studying, which reportedly raised the parents’ curiosity and interest in the project. A good number of parents who attended meetings with the research team mentioned that they were eager to learn more about what their children were learning and how they too could learn. However, many parents of children in the primary school trial, especially fathers, did not attend any meetings, and many who did attend declined to participate. The reasons for this are uncertain.

Almost all participants showed positive attitudes towards learning, science and critical thinking. A few participants, however, expressed discomfort with having to think a lot about treatment options and making choices.

Most participants reported that the portable media (MP3) players facilitated listening at their convenience, and they did not have major problems using them. Some participants found it helpful for the research assistant to operate the players for them.

Participants were informed that they would be listening to health messages. Some reported that they expected to listen to messages about common health conditions and how to manage them, rather than to messages about how to assess the trustworthiness of treatment claims. Nonetheless, we observed that most participants seemed to understand the purpose of the podcast after listening to it, and most listened to the entire podcast.

Some participants voiced strong beliefs about which treatments are effective, mostly based on their own personal experiences, and remained steadfast in those beliefs after listening to the IHC podcast, even when the beliefs conflicted with a podcast message. Despite having such conflicting beliefs, participants continued to listen to the episodes until the end of the trial.

“Some claims are trustworthy. For example, if a child gets a burn, you simply have to get cooking oil, apply it to the burn wounds, then apply sugar. If you instead apply cold water the child’s burns wounds will develop blisters. If the blisters rupture, we usually apply ash from burnt sisal. That’s all you need to do, and the child will get better.”

Intermediary effects

Almost all participants described the podcast as appropriate for them. They described the stories, examples (conditions, treatments and claims) and the explanations as appropriate and relevant. A few participants—mostly those who had strong beliefs about the examples of treatments—reported that some of the content was inappropriate. For example, some participants argued that using cold water as a first aid treatment of burns was not right and that this example was not helpful.

Participants described the podcast as credible. They attributed this to the quality of the content, the research team and the podcast’s source (Makerere University, the largest and oldest medical school and health research institution in the country).

Most participants reported that they found it easy to listen to the podcast. They reported that it did not take a lot of time and they could listen to it at their own convenience, even while doing other daily activities.

Participants said the podcast was entertaining, informative and engaging. They described how the use of stories made the podcast attractive and non-threatening, and made the explanations easier to understand. Some noted that after listening to one message, the content of that message enticed them to listen to the next one. Participants said the quality of production was good, the content of the episodes was engaging and the song and stories were memorable.

Participants described several motivations for listening: that the podcast was valuable, entertaining and enjoyable and that the information in each episode was relevant and applicable to their lives. Some participants referred to their love for science and health information, or to their personal position and responsibilities in society. Some said they were learning new information and gaining new skills to enable them to understand health claims.

“What motivated me personally was that I was getting exposed to what I had not known before. Also, there was some information we were relying on which now I know was hearsay, but the episodes gave us new knowledge to reflect on what we were hearing.”

During the intervention, participants heard other messages on the radio or television, or by word of mouth. Their engagement with the podcast does not seem to have been affected by other competing messages. On the other hand, some participants reported listening more critically to other health messages.

“The other messages did not interfere with my listening. I had already learned through these messages how someone should arrive at a decision of what treatments to use, so every time we heard a message over the radio we compared what they were saying to what the IHC message said. We started asking if the messages on the radio were trustworthy or whether they were just interested in selling their medicines. So, we compared using the knowledge and skills we learned from the IHC messages.”

Most participants mentioned that the messages in the IHC podcast were not necessarily in conflict with other messages. However, they said the IHC podcast was different because it included sufficient background information that enabled one to learn how to make choices. In contrast, some other health messages were viewed as aiming to convince people to use their intervention:

“The difference was that in these (IHC) messages they would give us the good side and the bad side of using certain treatments and encourage us to decide on our own. Other health messages give you only the good side, that if you use this (treatment) you will get cured. These messages taught me that everything can have a good side and bad side. This led me to start thinking more deeply about certain information that we are always being given by others. Why do they only talk about the good side?”

Beneficial and adverse effects

As noted above, one potential effect of listening to the IHC podcast was that some parents became more critical and aware of unreliable health advice. Additionally, some participants reported having learnt to question more, and to think more critically about claims unrelated to health. Some participants mentioned that scientific information could potentially be in conflict with cultural or religious beliefs. However, no participant reported experiencing such conflicts as a result of listening to the IHC podcast. We elicited other potential effects using the probes in table 3. However, no other beneficial or adverse effects were reported by participants or observers in the trial.

Discussion

Factors that facilitated implementation and effectiveness

The podcast intervention had a large effect initially, with almost twice as many parents in the intervention group having a passing score on the test used to measure their ability to assess treatment claims, compared with the parents in the control group.19 After 1 year, the proportion of parents with a passing score on the same test decreased by one third.21 We found a number of factors that help to explain the initial effectiveness of the intervention. However, because we collected data for the process evaluation during and shortly after the trial, our findings do not help to explain the subsequent decrease.

Almost all participants who completed the study listened to all the episodes. This was, at least in part, because research assistants delivered the podcast to the parents on portable media players and listened to the podcast with them. In addition, providing the participants with MP3 players enabled them to listen to each episode more than once, and most did so. This almost certainly contributed to the initial effectiveness of the intervention. Moreover, we found associations between participants’ initial test scores and both the number of times per day and the number of days that they listened to the podcast, suggesting a dose–response relationship.

More passive dissemination of the podcast likely would be less effective. On the other hand, the cost of passive dissemination would be substantially less, and the effectiveness would likely be the same for those who choose to listen to the entire podcast.

Participants valued the podcast because it provided them with new knowledge and skills for assessing health information. They also felt that it was clear, understandable and well organised. Although some participants found some of the episodes too long and confusing, most found the length of the episodes appropriate. They also found the duration of the intervention (about 7 weeks) and the intensity (about two episodes per week) suitable. All these attributes of the intervention are likely to have contributed to its initial effectiveness.

Parents were motivated to participate by headteachers and teachers, whom they trusted. Some parents, whose children were in intervention schools in the IHC primary school trial,20 were motivated by wanting to learn what their children were learning.

For the most part, participants’ education level did not appear to affect their motivation, how they experienced the podcast, or the initial effectiveness of the intervention.19 However, it may have affected retention of what was learnt. Participants with tertiary education retained more of what they learnt than those with primary or no formal education. Many of the participants with tertiary education were teachers, and this might partially explain that finding. A large proportion of teachers in both intervention and control schools had passing scores on the test initially and after 1 year,22 compared with the parents overall.21

Participants had positive attitudes towards learning new information, science and critical thinking. Their positive attitudes likely contributed both to their participating in the trial and to the effectiveness of the intervention. Parents without similar attitudes would be less likely to listen to the podcast and less likely to benefit from listening.

Intermediary effects of the intervention, which contributed to its effectiveness, were largely related to the participants’ experience of the podcast. They found the podcast to be relevant, engaging, credible, easy to listen to and entertaining. These factors motivated them to listen to the podcast and to learn. Moreover, at least some of the participants were motivated to think more critically about treatment claims that they encountered.

Factors that impeded implementation and effectiveness

We identified three factors that may have impeded implementation of the intervention and its effectiveness. First, at some schools few or no parents attended meetings or were recruited to participate. Although this did not affect the effectiveness of the intervention among participants, it is a major impediment to scaling up the intervention.

Second, many participants had prior beliefs about treatments that were in conflict with the key messages of the IHC podcast. Some of those beliefs persisted after listening to the episodes. Frequently, these conflicting beliefs were based on personal experiences using a treatment (anecdotal evidence). This finding is similar to what was found in the process evaluation of the IHC primary school intervention.38 In that evaluation, conflicting beliefs of the children were often based on personal experiences, whereas conflicting beliefs of teachers were more often based on tradition (treatments that had been widely used for a long time). It is uncertain whether those with conflicting beliefs were less likely to answer questions related to those Key Concepts (Box 1) correctly than those who did not have conflicting beliefs. However, strongly held beliefs may be resistant to change and this could make it difficult to learn new concepts that are in conflict with those beliefs.39–41

Third, some participants expected to hear messages about the causes and management of common health conditions, rather than messages about how to critically assess the trustworthiness of treatment claims. This could have influenced how they perceived and understood the IHC messages. While this was a problem during the development and early phase of the trial, most participants understood the purpose of the podcast after listening to it, and most listened to the entire podcast.

Factors that might influence scaling up

We identified the following factors that could facilitate scaling up the use of an educational podcast to enable parents to assess the trustworthiness of treatment claims:

  • A well-designed podcast may appeal to many people in the target audience and be convenient.

  • Introducing the IHC podcast through primary schools that are using the IHC primary school resources may be an effective strategy for disseminating the podcast to many parents and others in the community.

  • Ensuring that the podcast is relevant (by using claims that are relevant to the target audience to illustrate the Key Concepts) and that it is entertaining and easy to listen to (by pilot and user testing it) can help to motivate people in the target audience to listen to it.

We identified the following factors that could impede scaling up use of the podcast:

  • Delivery of the podcast by research assistants, which likely contributed to the effectiveness of the intervention, is not feasible on a large scale.

  • Providing parents with portable media players and MP3 players also likely contributed to the effectiveness of the intervention. Access to these devices may limit dissemination of the podcast.

  • The ability to reach parents through schools may depend on how much interest and enthusiasm is shown by head teachers and teachers. This, in turn, may depend on effective outreach to introduce the IHC podcast together with the IHC primary school resources into schools.

  • Many people in the target audience (parents of primary school children) did not attend recruitment meetings and many of those who did attend chose not to participate in the trial. This might be due to many parents not being interested initially in learning about health, science and critical thinking; busy work schedules; or problematic relationships between parents and school authorities.

Potential beneficial and adverse effects of the podcast

Some participants reported that listening to the IHC podcast led them to become more critical and aware of health advice that was given without a basis. This is consistent with the finding in the 1-year follow-up study that parents in the podcast group were more likely to have been sceptical of the last treatment claim that they had heard.21 However, the proportion of participants who reported that they had thought about the basis for that claim was lower in the podcast group than in the control group. The reasons for this are unclear, and it is uncertain how many participants became more critical of treatment claims initially. Nonetheless, whatever effect the intervention had on participants’ disposition to think critically about treatment claims initially, it appears unlikely to have had a long-term beneficial effect on the disposition of most participants.

Some participants mentioned that there might be a potential for scientific information to conflict with traditional cultural and religious beliefs. However, we did not observe any conflicts, and no participant reported having experienced any as a result of listening to the podcast.

Results in relation to findings from other studies

We found only one systematic review that explored factors influencing the impact of interventions for improving critical thinking. Abrami and colleagues found that instructional interventions for critical thinking can have a positive effect and that the content, the style of teaching (pedagogy) and collaboration among learners can influence the impact.42 Our findings are consistent with theirs. Although our study did not include collaboration among learners as part of the intervention, we found that the nature of the content and the way the intervention was delivered likely influenced its impact.

Strengths and limitations

Strengths of this study include the use of multiple methods, including a survey of all of the participants in the intervention arm who completed the trial, observations, brief interviews, in-depth interviews and focus group discussions. We used the CERQual approach to make explicit judgements about our confidence in each finding (table 4), and our confidence in most of the findings was moderate or high. We organised those findings in a logic model (table 5), which helps to explain the initial findings of the trial, as well as potential facilitators and impediments to scaling up use of the podcast.

The largely positive findings reflect the value of the iterative, human-centred design approach that we used to develop the podcast.17 43 44 The design process identified and addressed problems with how people in our target audience experienced earlier versions of the podcast, resulting in a podcast that participants in the trial experienced positively.

An important limitation of this study is that all of the data were collected before the results of the 1-year follow-up study were available. Consequently, we did not ask questions specific to why the ability of participants to think critically about treatment claims decreased substantially after 1 year. Instead, we used other available data on the nature of the intervention and how it was implemented to try to explain this observation. A better approach would have been to interview participants about skills decay after the data from the follow-up evaluation had been analysed.

Another important limitation of this study is that the investigators were responsible for both developing and evaluating the intervention. This could have led us to emphasise participants’ positive experiences of the intervention when collecting and analysing the data. In addition, the participants were aware that the lead investigators (DS and AN) were responsible for the intervention itself. There may therefore have been a social desirability bias, with participants providing responses that they thought would please the investigators.45 We tried to address these biases by publishing the protocol for the process evaluation in advance,18 critically reviewing our interpretation of the data, facilitating reflection by interviewing the lead investigators and making it clear to the participants that we were evaluating the podcast and not them. Nonetheless, we cannot rule out that our interests as developers of the intervention influenced the findings of this process evaluation.

Our use of the GRADE-CERQual approach to assess our confidence in findings from an individual study, rather than from a systematic review, was novel but worked reasonably well. However, the fact that we applied the approach to our own data was a limitation. In future assessments, we recommend that external assessors be involved.

Conclusions

The findings of this process evaluation support the value of the human-centred design approach used to develop the podcast, which contributed to the initial effectiveness of the podcast. However, they do not help to explain the decrease in the effectiveness of the intervention after 1 year. Future research should explore factors that may lead to the decay in the effectiveness of similar interventions over time and strategies to improve retention.

Acknowledgments

We are grateful to all of the parents who participated in the trial and particularly those who participated in the design and user testing of the intervention, the interviews and focus group discussions. We are also grateful to the research assistants for their observations and participation. We would like to thank Linda Biesty, Patricia Healy and Vanesa Ringle for feedback on a draft of this report. We are grateful for support for this research from the Global Health and Vaccination Research (GLOBVAC) programme of the Research Council of Norway, and to the English National Institute for Health Research for supporting Iain Chalmers and the James Lind Initiative. This work was also partially supported by a Career Development Award from the DELTAS Africa Initiative grant # DEL-15-011 to THRiVE-2. The DELTAS Africa Initiative is an independent funding scheme of the African Academy of Sciences (AAS)’s Alliance for Accelerating Excellence in Science in Africa (AESA) and supported by the New Partnership for Africa’s Development Planning and Coordinating Agency (NEPAD Agency) with funding from the Wellcome Trust grant # 107742/Z/15/Z and the UK government. The views expressed in this publication are those of the author(s) and not necessarily those of AAS, NEPAD Agency, Wellcome Trust or the UK government. We are also grateful to Martin Mutyaba, Esther Nakyejwe, Margaret Nabatanzi, Hilda Mwebaza, Peter Lukwata, Rita Tukahirwa, David Simbwa, Adonia Lwanga, Enock Steven Ddamulira and Solomon Segawa for their help with data management; and all the research assistants who helped with data collection and entry. We would also like to thank the Informed Health Choices advisory groups for their support and advice in implementing this project.

Footnotes

Twitter: @Dansemakula, @AllenNsangi

Contributors: DS is the principal investigator. He drafted the protocol with input from all the other coauthors and was responsible for planning and data collection. DS, AN, AO, CG, SL, MK, MO, SR, AAD, AF and NS participated in the planning of the study. DS and AN collected the data and led the data analysis; CJR conducted the quantitative data analysis; AO, CG, SL, MK, MO, SR, AAD, AF and NS participated in the analyses, interpretation and organisation of findings. DS wrote the first draft of the manuscript. All authors reviewed and commented on earlier drafts, contributed to the final manuscript and approved the final version.

Funding: This trial was funded by the Research Council of Norway, Project number 220603/H10.

Competing interests: DS is a medical doctor, epidemiologist and health services researcher. AN is a social scientist. Because we both developed the intervention and interviewed participants about it, our involvement in both processes might have influenced how we asked questions or how we interpreted the responses. It is not known whether this occurred or what effect it might have had on the participants’ responses.

Patient and public involvement statement: Included within the text of the manuscript

Patient consent for publication: Consent for publication of findings in reports and/or presentations was sought as part of the informed consent process for study participation. Additional consent for publication of participant-identifiable material was not required.

Ethics approval: Participants who were invited to participate in the process evaluation were informed of the purpose of their participation before written permission was obtained. Participants in the trial consented for both the initial assessment and the 1-year follow-up at the beginning of the study. Only consenting participants were included. The study was approved by Makerere University Institutional Review Board and the Uganda National Council of Science and Technology as part of the Supporting Informed Healthcare Choices in Low-income Countries Project (Grant no. ES498037).

Provenance and peer review: Not commissioned; externally peer reviewed.

Data availability statement: Data are available upon reasonable request. All data relevant to the study are included in the article or uploaded as supplementary information.

References

  • 1. Lokker N, Sanders L, Perrin EM, et al. Parental misinterpretations of over-the-counter pediatric cough and cold medication labels. Pediatrics 2009;123:1464–71. doi:10.1542/peds.2008-0854
  • 2. Eysenbach G, Köhler C. How do consumers search for and appraise health information on the world wide web? Qualitative study using focus groups, usability tests, and in-depth interviews. BMJ 2002;324.
  • 3. Glenton C, Nilsen ES, Carlsen B. Lay perceptions of evidence-based information – a qualitative evaluation of a website for back pain sufferers. BMC Health Serv Res 2006;6:34. doi:10.1186/1472-6963-6-34
  • 4. Sillence E, Briggs P, Harris PR, et al. How do patients evaluate and make use of online health information? Soc Sci Med 2007;64:1853–62. doi:10.1016/j.socscimed.2007.01.012
  • 5. Research Councils UK. Public insight research, 2017.
  • 6. Hoffmann TC, Del Mar C. Patients' expectations of the benefits and harms of treatments, screening, and tests: a systematic review. JAMA Intern Med 2015;175:274–86. doi:10.1001/jamainternmed.2014.6016
  • 7. Hoffmann TC, Del Mar C. Clinicians' expectations of the benefits and harms of treatments, screening, and tests. JAMA Intern Med 2017;177:407–19. doi:10.1001/jamainternmed.2016.8254
  • 8. Casiday R, Cresswell T, Wilson D, et al. A survey of UK parental attitudes to the MMR vaccine and trust in medical authority. Vaccine 2006;24:177–84. doi:10.1016/j.vaccine.2005.07.063
  • 9. Hadjikoumi I, Niekerk K, Scott C. MMR catch up campaign: reasons for refusal to consent. Arch Dis Child 2006;91:621–2. doi:10.1136/adc.2005.088898
  • 10. Mills E, Jadad AR, Ross C, et al. Systematic review of qualitative studies exploring parental beliefs and attitudes toward childhood vaccination identifies common barriers to vaccination. J Clin Epidemiol 2005;58:1081–8. doi:10.1016/j.jclinepi.2005.09.002
  • 11. Berkman ND, Sheridan SL, Donahue KE, et al. Low health literacy and health outcomes: an updated systematic review. Ann Intern Med 2011;155:97. doi:10.7326/0003-4819-155-2-201107190-00005
  • 12. Eichler K, Wieser S, Brügger U. The costs of limited health literacy: a systematic review. Int J Public Health 2009;54:313–24. doi:10.1007/s00038-009-0058-2
  • 13. Ngoh LN. Health literacy: a barrier to pharmacist–patient communication and medication adherence. J Am Pharm Assoc 2009;49:e132–49. doi:10.1331/JAPhA.2009.07075
  • 14. Johnson SB, Park HS, Gross CP, et al. Complementary medicine, refusal of conventional cancer therapy, and survival among patients with curable cancers. JAMA Oncol 2018;4:1375–81. doi:10.1001/jamaoncol.2018.2487
  • 15. Austvoll-Dahlgren A, Oxman AD, Chalmers I, et al. Key concepts that people need to understand to assess claims about treatment effects. J Evid Based Med 2015;8:112–25. doi:10.1111/jebm.12160
  • 16. Chalmers I, Oxman AD, Austvoll-Dahlgren A, et al. Key concepts for informed health choices: a framework for helping people learn how to assess treatment claims and make informed choices. BMJ Evid Based Med 2018;23:29–33. doi:10.1136/ebmed-2017-110829
  • 17. Semakula D, Nsangi A, Oxman M, et al. Development of mass media resources to improve the ability of parents of primary school children in Uganda to assess the trustworthiness of claims about the benefits and harms of treatments. IHC Working Paper, 2018. ISBN 978-82-8082-903-0.
  • 18. Semakula D, Nsangi A, Glenton C, et al. An educational podcast to improve the ability of parents of primary school children in Uganda to assess claims about treatment effects: process evaluation protocol. Informed Health Choices Working Paper, 2017. ISBN 978-82-8082-804-0.
  • 19. Semakula D, Nsangi A, Oxman AD, et al. Effects of the Informed Health Choices podcast on the ability of parents of primary school children in Uganda to assess claims about treatment effects: a randomised controlled trial. Lancet 2017;390:389–98. doi:10.1016/S0140-6736(17)31225-4
  • 20. Nsangi A, Semakula D, Oxman AD, et al. Effects of the Informed Health Choices primary school intervention on the ability of children in Uganda to assess the reliability of claims about treatment effects: a cluster-randomised controlled trial. Lancet 2017;390:374–88. doi:10.1016/S0140-6736(17)31226-6
  • 21. Semakula D, Nsangi A, Oxman AD, et al. Effects of the Informed Health Choices podcast on the ability of parents of primary school children in Uganda to assess the trustworthiness of claims about treatment effects, one-year follow-up: a randomised trial.
  • 22. Nsangi A, Semakula D, Oxman AD, et al. Effects of the Informed Health Choices primary school intervention on the ability of children in Uganda to assess the reliability of claims about treatment effects, a one-year follow-up: a cluster-randomised controlled trial.
  • 23. Oakley A, Strange V, Bonell C, et al. Process evaluation in randomised controlled trials of complex interventions. BMJ 2006;332:413–6. doi:10.1136/bmj.332.7538.413
  • 24. Moore GF, Audrey S, Barker M, et al. Process evaluation of complex interventions: Medical Research Council guidance. BMJ 2015;350. doi:10.1136/bmj.h1258
  • 25. Carroll C, Patterson M, Wood S, et al. A conceptual framework for implementation fidelity. Implement Sci 2007;2. doi:10.1186/1748-5908-2-40
  • 26. Rubio-Valera M, Pons-Vigués M, Martínez-Andrés M, et al. Barriers and facilitators for the implementation of primary prevention and health promotion activities in primary care: a synthesis through meta-ethnography. PLoS One 2014;9:e89554. doi:10.1371/journal.pone.0089554
  • 27. Cavill N, Bauman A. Changing the way people think about health-enhancing physical activity: do mass media campaigns have a role? J Sports Sci 2004;22:771–90. doi:10.1080/02640410410001712467
  • 28. Chaudoir SR, Dugan AG, Barr CHI. Measuring factors affecting implementation of health innovations: a systematic review of structural, organizational, provider, patient, and innovation level measures. Implement Sci 2013;8. doi:10.1186/1748-5908-8-22
  • 29. Hubley JH. Barriers to health education in developing countries. Health Educ Res 1986;1:233–45. doi:10.1093/her/1.4.233
  • 30. Ryan P. Integrated theory of health behavior change: background and intervention development. Clin Nurse Spec 2009;23:161–72.
  • 31. Flottorp SA, Oxman AD, Krause J, et al. A checklist for identifying determinants of practice: a systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice. Implement Sci 2013;8. doi:10.1186/1748-5908-8-35
  • 32. Nsangi A, Semakula D, Oxman M, et al. Resources to teach children in low-income countries to assess claims about treatment effects: process evaluation protocol. IHC Working Paper. Oslo, 2016.
  • 33. Semakula D, Nsangi A, Oxman AD, et al. Process evaluation for a randomised trial of a podcast for improving the ability of parents in Uganda to assess the reliability of claims about treatment effects, 2017.
  • 34. Guest G, Bunce A, Johnson L. How many interviews are enough? An experiment with data saturation and variability. Field Methods 2006;18:59–82.
  • 35. Carlsen B, Glenton C. What about N? A methodological study of sample-size reporting in focus group studies. BMC Med Res Methodol 2011;11. doi:10.1186/1471-2288-11-26
  • 36. Ward DJ, Furber C, Tierney S, et al. Using framework analysis in nursing research: a worked example. J Adv Nurs 2013;69:2423–31. doi:10.1111/jan.12127
  • 37. Lewin S, Glenton C, Munthe-Kaas H, et al. Using qualitative evidence in decision making for health and social interventions: an approach to assess confidence in findings from qualitative evidence syntheses (GRADE-CERQual). PLoS Med 2015;12:e1001895. doi:10.1371/journal.pmed.1001895
  • 38. Nsangi A, Semakula D, Oxman AD, et al. The Informed Health Choices intervention to teach primary school children in low-income countries to assess claims about treatment effects: process evaluation, 2017.
  • 39. Duschl RA, Schweingruber HA, Shouse AW, Committee on Science Learning, Kindergarten Through Eighth Grade. How children learn science. In: Taking science to school: learning and teaching science in grades K-8. Washington, DC: The National Academies Press, 2007.
  • 40. Vosniadou S. International handbook of research on conceptual change. 2nd edn. Oxford: Routledge, 2013.
  • 41. Cromley J. Learning to think, learning to learn: what the science of thinking and learning has to offer adult education. Washington, DC: National Institute for Literacy, 2000.
  • 42. Abrami PC, Bernard RM, Borokhovski E, et al. Instructional interventions affecting critical thinking skills and dispositions: a stage 1 meta-analysis. Rev Educ Res 2008;78:1102–34. doi:10.3102/0034654308326084
  • 43. IDEO.org. The field guide to human-centered design. 1st edn, 2015.
  • 44. Rosenbaum SE. Improving the user experience of evidence: a design approach to evidence-informed health care. Oslo: Arkitektur- og designhøgskolen i Oslo, 2011.
  • 45. Knowles ES, Nathan KT. Acquiescent responding in self-reports: cognitive style or social concern? J Res Pers 1997;31:293–301. doi:10.1006/jrpe.1997.2180
  • 46. Semakula D, Nsangi A, Oxman AD, et al. Priority setting for resources to improve the understanding of information about claims of treatment effects in the mass media. J Evid Based Med 2015;8:84–90. doi:10.1111/jebm.12153

