Headache. 2022 Sep 3;62(9):1222–1226. doi: 10.1111/head.14368

Cluster headache – The worst possible pain on YouTube

Basit Ali Chaudhry 1, Thien Phu Do 1,2, Håkan Ashina 1,3,4, Messoud Ashina 1,2, Faisal Mohammad Amin 1,3
PMCID: PMC9826041  PMID: 36056715

Abstract

In clinical practice, patients with cluster headache often ask questions or mention information that they have seen or heard on the Internet. Because YouTube (www.youtube.com) is the second most visited Web site worldwide and offers a plethora of video content, we found it timely to ascertain the quality of information on cluster headache that is freely available on YouTube. We conducted a YouTube search on January 24, 2022, with the search term “cluster headache.” Eligible YouTube videos included those with ≥10,000 views and content related to cluster headache. We assessed the quality and reliability of the videos with the Global Quality Scale and DISCERN, respectively. The search strategy identified 644 videos, of which 134 were eligible for inclusion. The sources of the included videos were categorized as “healthcare professional/institution” (n = 45), “personal experience” (n = 52), and “other” (n = 37). According to the Global Quality Scale, 70 videos (52%) were of low quality, 34 (25%) were of moderate quality, and 30 (22%) were of high quality. According to DISCERN, 104 videos (78%) were of low reliability, 28 (21%) were of moderate reliability, and 2 (1%) were of high reliability. The quality and reliability of cluster headache‐related information on YouTube have room for improvement, even for content provided by healthcare professionals. These findings should incentivize stakeholders (e.g., government services, professional societies, and healthcare providers) to provide more accessible and better information on cluster headache.

Keywords: consumer health information, digital, education, online, patient perspective, social media


Abbreviations

GQS, Global Quality Scale

IQR, interquartile range

SD, standard deviation

INTRODUCTION

Cluster headache is a rare and disabling neurologic disease that afflicts ~0.1% of people in the general population. Broad clinical features include recurrent attacks of excruciating unilateral headache, with a duration of 15 to 180 min (when untreated). 1 These attacks are often accompanied by ipsilateral cranial autonomic symptoms and a sense of restlessness or agitation. 2

The disease burden attributed to cluster headache is high, and US‐based survey data have found that more than half of those affected experience suicidal ideation and 2% have attempted suicide. 3 In this context, it merits emphasis that cluster headache is often misdiagnosed and that a diagnostic delay of ~10 years after disease onset is common. 4 Increased efforts are therefore warranted to address these unmet needs, including the provision of reliable information through online resources.

In clinical practice, patients with cluster headache often ask questions or mention information that they have seen or heard on the Internet. This can affect patient care both positively and negatively: some patients will be better informed about their disease, while others might have obtained information that is inaccurate or, even worse, contradicts best practices for effective disease management. Because YouTube (www.youtube.com) is the second most visited Web site worldwide and offers a plethora of video content, we found it timely to examine the quality of information on cluster headache that is freely available on YouTube.

METHODS

Search strategy and analysis

The Web browser Mozilla Firefox (version 92.0) was used to conduct a YouTube search on January 24, 2022, using the Private Browsing mode. The search term was “cluster headache” and all tracking cookies had been deleted prior to the search. Eligible YouTube videos included those with ≥10,000 views and content related to cluster headache. No language restrictions were applied. Two independent investigators (B.C. and T.P.D.) extracted information on the (1) source of the video (source was categorized into the following: [a] healthcare professional/institution, [b] personal experience, [c] other), (2) number of days available on YouTube, (3) duration of the video, (4) number of views, (5) number of comments, and (6) number of likes.
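As an illustration of the metadata fields that were extracted (upload date, duration, views, comments, and likes), the sketch below shows how such data could, in principle, be collected programmatically through the official YouTube Data API v3 using the google-api-python-client library. This is a hypothetical sketch only: the study itself relied on a manual search in a private Firefox session with manual extraction by two investigators, the API key is a placeholder, and categorization of the video source would still require manual review.

```python
# Hypothetical sketch: collecting comparable video metadata via the
# official YouTube Data API v3 (google-api-python-client). Not the
# method used in this study, which relied on manual browsing.
from googleapiclient.discovery import build

API_KEY = "YOUR_API_KEY"  # placeholder
youtube = build("youtube", "v3", developerKey=API_KEY)

# Search for videos matching the study's search term.
search = youtube.search().list(
    q="cluster headache", part="id", type="video", maxResults=50
).execute()
video_ids = [item["id"]["videoId"] for item in search["items"]]

# Fetch the fields extracted in the study: upload date (to derive days
# available on YouTube), duration, views, comments, and likes.
details = youtube.videos().list(
    part="snippet,contentDetails,statistics", id=",".join(video_ids)
).execute()

for video in details["items"]:
    stats = video["statistics"]
    record = {
        "title": video["snippet"]["title"],
        "published": video["snippet"]["publishedAt"],
        "duration": video["contentDetails"]["duration"],  # ISO 8601, e.g., PT5M12S
        "views": int(stats.get("viewCount", 0)),
        "comments": int(stats.get("commentCount", 0)),
        "likes": int(stats.get("likeCount", 0)),
    }
    if record["views"] >= 10_000:  # the study's inclusion threshold
        print(record)
```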

Quality assessment

Quality assessment was performed using the Global Quality Scale (GQS), a 5‐point Likert scale evaluating the quality, flow, and ease of use of each video. 5 A score of 1 represents a video of poor quality and poor flow, with missing information and no usefulness for the patient. A score of 5 represents a video of excellent quality and flow, containing all relevant information and being highly useful for the patient. The GQS has been widely used to evaluate the quality of health information in YouTube videos. 6 Higher GQS scores indicate better quality and flow as well as greater usefulness for the patient. GQS scores of <3 points were defined as low quality, a score of 3 points as moderate quality, and scores of >3 points as high quality.
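As a concrete illustration of the categorization rule described above, the mapping of GQS scores to quality categories can be written in a few lines. This is a minimal sketch of the stated thresholds; the function name and structure are our own and are not part of the original analysis.

```python
def classify_gqs(score: int) -> str:
    """Map a 1-5 GQS rating to the quality category used in this study."""
    if score < 3:
        return "low"       # GQS 1-2
    if score == 3:
        return "moderate"  # GQS 3
    return "high"          # GQS 4-5
```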

Reliability assessment

Reliability assessment was performed using a modified DISCERN tool, which includes two sections with 15 questions in total (section 1: is the publication reliable; section 2: how good is the quality of information on treatment choices), each of which is rated on a scale from 1 to 5. 7 Based on these, raters give a total score from 1 to 5 in a third section (overall rating). The DISCERN tool was developed to ascertain the reliability of written information but has also been used to evaluate health information from YouTube videos. 8 Higher mean total scores indicate higher reliability. A DISCERN total score of <3 points was defined as low reliability, a score of 3 points as moderate reliability, and a score of >3 points as high reliability.
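The same threshold logic applies to the DISCERN overall rating (section 3). A minimal sketch is given below, assuming that the overall ratings of the two raters are averaged before categorization; the manuscript does not state how rater scores were combined, so that step is an assumption.

```python
from statistics import mean

def classify_discern(overall_ratings: list[int]) -> str:
    """Categorize a video's DISCERN overall rating (section 3, 1-5 scale).

    `overall_ratings` holds one rating per rater; averaging them is an
    assumption and not a step described in the manuscript.
    """
    total = mean(overall_ratings)
    if total < 3:
        return "low reliability"
    if total == 3:
        return "moderate reliability"
    return "high reliability"

# Example: two raters give overall ratings of 2 and 3 -> mean 2.5 -> low reliability
print(classify_discern([2, 3]))
```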

Statistical analysis

Descriptive statistics were used to characterize the included videos. Normally distributed data were presented as means with standard deviations (SD). Skewed data were presented as medians with interquartile ranges (IQR). All analyses were performed using Microsoft Excel (version 2109).
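The descriptive summaries in Table 1 (mean ± SD for approximately normal data, otherwise median and IQR) could be reproduced along the lines sketched below. This sketch uses pandas and SciPy rather than the Excel workflow actually used, and the Shapiro-Wilk normality check is our assumption, as the manuscript does not state how distributions were assessed.

```python
import pandas as pd
from scipy import stats

def summarize(values: pd.Series, alpha: float = 0.05) -> str:
    """Return mean (±SD) for roughly normal data, otherwise median (IQR)."""
    # Shapiro-Wilk normality test; the choice of test is an assumption,
    # as the manuscript only states that Microsoft Excel was used.
    _, p = stats.shapiro(values)
    if p >= alpha:
        return f"{values.mean():.2f} (±{values.std():.2f})"
    q1, q3 = values.quantile([0.25, 0.75])
    return f"{values.median():.2f} ({q1:.2f}-{q3:.2f})"

# Example with hypothetical view counts (right-skewed -> median and IQR)
views = pd.Series([12_000, 15_000, 29_000, 33_000, 90_000, 450_000])
print(summarize(views))
```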

RESULTS

The YouTube search identified 644 videos of which 134 were eligible for inclusion (Figure 1). The sources of the included videos were categorized as “healthcare professional/institution” (n = 45), “personal experience” (n = 52), and “other” (n = 37). Characteristics of the included YouTube videos are presented in Table 1.

FIGURE 1. Flow diagram of screening and study inclusion.

TABLE 1. Characteristics of popular YouTube videos on cluster headache

| Characteristic | Total | Healthcare institution/professional | Personal experience | Other |
| No. of videos | 134 | 45 | 52 | 37 |
| Quality, Global Quality Scale, mean (SD) | 2.44 (±1.18) | 3.29 (±0.86) | 1.69 (±0.91) | 2.46 (±1.15) |
| Reliability, DISCERN, mean (SD) | | | | |
| Section 1 | 2.33 (±1.62) | 2.79 (±1.85) | 1.99 (±1.28) | 2.25 (±1.62) |
| Section 2 | 1.66 (±1.25) | 1.77 (±1.40) | 1.68 (±1.03) | 1.48 (±1.01) |
| Section 3 | 1.59 (±0.86) | 2.13 (±0.88) | 1.19 (±0.56) | 1.51 (±0.86) |
| No. of months available on YouTube, median (IQR) | 76.96 (48.5–105.7) | 71.76 (41.72–102.35) | 92.07 (67.04–129.78) | 71.28 (43.04–90.38) |
| Duration, min, median (IQR) | 5.2 (3.1–10.1) | 6.38 (3.52–12.68) | 5.56 (2.53–9.79) | 4.48 (3.23–7.70) |
| No. of views, median (IQR) | 29,708.5 (17,756–29,708.5) | 33,162 (20,299–76,887) | 27,668 (14,323.50–56,045.25) | 28,532 (16,329–90,053) |
| Popularity, median (IQR) | 15.67 (7.52–38.93) | 17.42 (8.89–47.79) | 9.43 (6.41–24.55) | 19.44 (12.18–42.52) |
| No. of comments, median (IQR) | 87.5 (37–211.25) | 102 (38.25–219.75) | 82.50 (42.25–169.75) | 85 (23.75–241.50) |
| No. of likes, median (IQR) | 216 (88.25–622) | 278.50 (159.50–650) | 91 (50–346) | 307 (181–925) |

Note: Descriptive data of the videos in the categories “healthcare institution/professional,” “personal experience,” and “other”.

Abbreviations: IQR, interquartile range; SD, standard deviation.

Quality

Of the 45 videos deriving from “healthcare professional/institution,” 9 (20%) were of low quality, 17 (38%) were of moderate quality, and 19 (42%) were of high quality. Of the 52 videos deriving from “personal experience,” 40 (77%) were of low quality, 10 (19%) were of moderate quality, and 2 (4%) were of high quality. Of the 37 videos in the category “other,” 21 (57%) were of low quality, 7 (19%) were of moderate quality, and 9 (24%) were of high quality. Figure 2 shows the quality assessment according to source.

FIGURE 2. Quality assessment stratified according to video category. The distribution of Global Quality Scale scores for the categories “healthcare institution/professional,” “personal experience,” and “other.” Color represents the quality category of the videos: dark, high (4, 5); medium, moderate (3); light, low (1, 2). Numbers in the figure represent the number of videos of high, moderate, or low quality in each category: healthcare institution/professional (high: n = 19, moderate: n = 17, low: n = 9), personal experience (high: n = 2, moderate: n = 10, low: n = 40), and other (high: n = 9, moderate: n = 7, low: n = 21).

Reliability

Of the 45 videos deriving from “healthcare professional/institution,” 26 (58%) were of low reliability, 18 (40%) were of moderate reliability, and 1 (2%) was of high reliability. Of the 52 videos deriving from “personal experience,” 46 (88%) were of low reliability and 4 (8%) were of moderate reliability; no video from this source was of high reliability. Of the 37 videos in the category “other,” 30 (81%) were of low reliability, 6 (16%) were of moderate reliability, and 1 (3%) was of high reliability. Figure 3 shows the reliability assessment according to source.

FIGURE 3. Reliability assessment stratified according to video category. The distribution of DISCERN scores (section 3: overall rating) for the categories “healthcare institution/professional,” “personal experience,” and “other.” Color represents the overall reliability of the videos: dark, high (4, 5); medium, moderate (3); light, low (1, 2). Numbers in the figure represent the number of videos of high, moderate, or low reliability in each category: healthcare institution/professional (high: n = 1, moderate: n = 18, low: n = 26), personal experience (high: n = 0, moderate: n = 4, low: n = 46), and other (high: n = 1, moderate: n = 6, low: n = 30).

DISCUSSION

To our knowledge, this is the first study to evaluate the quality and reliability of information on cluster headache on YouTube. The 134 included videos had accumulated more than 1 million views in total, which suggests substantial interest in cluster headache content on YouTube. The sources of the videos were categorized as “healthcare professional/institution” (n = 45), “personal experience” (n = 52), and “other” (n = 37).

Most videos were produced by sources other than healthcare professionals, which may increase the risk of encountering misinformation about cluster headache on YouTube. “Personal experience” was the most frequent source and also had the highest number of views, which is in line with reports on other diseases, for example, stroke and cancer. 9 , 10 Anecdotes are considered a compelling and efficient tool for conveying medical information, 11 which may explain why content from peers achieves greater reach. 12 Of note, receptivity to a message depends on the recipient's prior knowledge of the topic. 12 It is tempting to assume that information on pathophysiology and similarly technical topics will, for the same reason, have a narrower audience, but the baseline knowledge of patients with cluster headache remains relatively unexplored.

The GQS is a five‐point Likert scale rating the quality, flow, and ease of use of online information. According to the GQS, the overall quality of the included videos was low. In line with previous reports on other diseases, 13 videos provided by healthcare professionals were more often of high quality. 14 However, our observations show only a minor overall difference between the quality of content in the categories “personal experience” and “other” compared with “healthcare professional/institution.” These findings may be surprising, as one would expect the quality of content provided by healthcare professionals to exceed that of non‐professionals, but the GQS primarily rates the technical quality and delivery of a video rather than the accuracy of its informational content. 15

The DISCERN instrument rates reliability and favors content with a clear purpose, a clear framework, and multiple solutions. The overall reliability of the included videos was low. DISCERN consists of three sections. Section 1 evaluates whether the source and references of the content/information are presented; most videos did not provide this and consequently scored low on this section. Section 2 evaluates the presentation of treatment options, and most videos also rated low here. One may speculate whether this reflects the limited number of treatment options available for cluster headache in general, but this is unlikely to be the only driver of our findings, as similar trends have been observed in other medical disorders, for example, cancer and migraine. 9 , 16 Another shortcoming is that the sources used to produce the content of the videos are not specified. These challenges are also seen in video content on other diseases and healthcare topics. 10 , 17 As previous reports suggest that these factors play an important role in patient education in headache disorders, further efforts should be made to improve this area. 16 , 18

Strengths and limitations

As online content is constantly updated in real time, the present cross‐sectional design is a limitation; however, this is an inherent limitation of all investigations of online content. The investigators were not blinded to the source of the video content, which may have introduced bias, but extracting data in a blinded manner (e.g., by web scraping) is considered a violation of YouTube's terms of use.

CONCLUSIONS

The quality and reliability of cluster headache‐related information on YouTube have room for improvement, even for content provided by healthcare professionals and institutions. These findings should incentivize stakeholders (e.g., government services, professional societies, and healthcare providers) to provide more accessible and better information on cluster headache.

AUTHOR CONTRIBUTIONS

Study concept and design: Thien Phu Do, Håkan Ashina, Messoud Ashina, Faisal Mohammad Amin. Acquisition of data: Basit Ali Chaudhry, Thien Phu Do. Analysis and interpretation of data: Basit Ali Chaudhry, Thien Phu Do, Håkan Ashina, Messoud Ashina, Faisal Mohammad Amin. Drafting of the manuscript: Basit Ali Chaudhry. Revising it for intellectual content: Thien Phu Do, Håkan Ashina, Messoud Ashina, Faisal Mohammad Amin. Final approval of the completed manuscript: Basit Ali Chaudhry, Thien Phu Do, Håkan Ashina, Messoud Ashina, Faisal Mohammad Amin.

CONFLICT OF INTEREST

B.A.C. reports no conflicts of interest. T.P.D. reports no conflicts of interest. H.A. reports personal fees from Teva. M.A. is a consultant, speaker, or scientific advisor for AbbVie, Allergan, Amgen, Eli Lilly, Lundbeck, Novartis, and Teva, and a primary investigator for ongoing AbbVie/Allergan, Amgen, Eli Lilly, Lundbeck, Novartis, and Teva trials. M.A. has no ownership interest and does not own stocks of any pharmaceutical company. M.A. serves as associate editor of Cephalalgia, associate editor of the Journal of Headache and Pain, and associate editor of Brain. F.M.A. has received honoraria and personal fees from Teva, Lundbeck, Novartis, Eli Lilly, and Pfizer for lecturing or participating in advisory boards.

ACKNOWLEDGMENT

Messoud Ashina was supported by the Lundbeck Foundation Professor Grant (R310‐2018‐3711).

Chaudhry BA, Do TP, Ashina H, Ashina M, Amin FM. Cluster headache – The worst possible pain on YouTube. Headache. 2022;62:1222‐1226. doi: 10.1111/head.14368

REFERENCES

1. May A, Schwedt TJ, Magis D, Pozo‐Rosich P, Evers S, Wang SJ. Cluster headache. Nat Rev Dis Primers. 2018;4:1‐17.
2. Olesen J; Headache Classification Committee of the International Headache Society (IHS). The International Classification of Headache Disorders, 3rd edition. Cephalalgia. 2018;38:1‐211.
3. Rozen TD, Fishman RS. Cluster headache in the United States of America: demographics, clinical characteristics, triggers, suicidality, and personal burden. Headache. 2012;52:99‐113.
4. Frederiksen HH, Lund NLT, Barloese MCJ, Petersen AS, Jensen RH. Diagnostic delay of cluster headache: a cohort study from the Danish Cluster Headache Survey. Cephalalgia. 2020;40:49‐56.
5. Bernard A, Langille M, Hughes S, Rose C, Leddin D, Veldhuyzen van Zanten S. A systematic review of patient inflammatory bowel disease information resources on the world wide web. Am J Gastroenterol. 2007;102:2070‐2077.
6. Çintesun FNİ, Çintesun E, Seçilmiş Ö. YouTube as a source of information on gonadotropin self‐injections. Eur J Obstet Gynecol Reprod Biol. 2021;264:135‐140.
7. Charnock D, Shepperd S, Needham G, Gann R. DISCERN: an instrument for judging the quality of written consumer health information on treatment choices. J Epidemiol Community Health. 1999;53:105‐111.
8. Goobie GC, Guler SA, Johannson KA, Fisher JH, Ryerson CJ. YouTube videos as a source of misinformation on idiopathic pulmonary fibrosis. Ann Am Thorac Soc. 2019;16:572‐579.
9. Bahar‐Ozdemir Y, Ozsoy‐Unubol T, Akyuz G. Is YouTube a high‐quality source of information on cancer rehabilitation? J Cancer Surviv. 2021. Online ahead of print. doi: 10.1007/s11764-021-01093-9
10. Chen Y, Abel KT, Cramer SC, Zheng K, Chen Y. Recovery in my lens: a study on stroke vlogs. AMIA Annu Symp Proc. 2018;2018:1300‐1309.
11. Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science. 1974;185:1124‐1131.
12. Fagerlin A, Wang C, Ubel PA. Reducing the influence of anecdotal reasoning on people's health care decisions: is a picture worth a thousand statistics? Med Decis Making. 2005;25:398‐405.
13. Ng MK, Emara AK, Molloy RM, Krebs VE, Mont M, Piuzzi NS. YouTube as a source of patient information for total knee/hip arthroplasty: quantitative analysis of video reliability, quality, and content. J Am Acad Orthop Surg. 2021;29:e1034‐e1044.
14. Onder ME, Zengin O. YouTube as a source of information on gout: a quality analysis. Rheumatol Int. 2021;41:1321‐1328.
15. Ranjan P, Kumari A, Chakrawarty A. How can doctors improve their communication skills? J Clin Diagn Res. 2015;9:JE01‐JE04.
16. Saffi H, Do TP, Hansen JM, Dodick DW, Ashina M. The migraine landscape on YouTube: a review of YouTube as a source of information on migraine. Cephalalgia. 2020;40:1363‐1369.
17. Haslam K, Doucette H, Hachey S, et al. YouTube videos as health decision aids for the public: an integrative review. Can J Dent Hyg. 2019;53:53‐66.
18. Cady R, Farmer K, Beach ME, Tarrasch J. Nurse‐based education: an office‐based comparative model for education of migraine patients. Headache. 2008;48:564‐569.
