Journal of Hand Surgery Global Online
. 2021 Dec 1;4(1):3–7. doi: 10.1016/j.jhsg.2021.10.009

Distal Biceps Tendon Rupture Videos on YouTube: An Analysis of Video Content and Quality

Brian K Foster, William Mack Malarkey, Timothy C Maurer, Daniela F Barreto Rocha, Idorenyin F Udoeyo, Louis C Grandizio
PMCID: PMC8991868  PMID: 35415601

Abstract

Purpose

Our purpose was to analyze the content and quality of YouTube videos related to distal biceps tendon (DBT) ruptures and repair. We aimed to compare differences between academic and nonacademic video sources.

Methods

The most popular YouTube videos related to DBT injuries were compiled and analyzed according to source. Viewing characteristics were determined for each video. Video content and quality were assessed by 2 reviewers and analyzed according to the Journal of the American Medical Association benchmark criteria, DISCERN criteria, and a Distal Biceps Content Score. Cohen’s kappa was used to measure interrater reliability.

Results

A total of 59 DBT YouTube videos were included. The intraclass correlation coefficients ranged from moderate to excellent for the content scores. The mean DISCERN score was 29, and no videos were rated as either “good” or “excellent” for content quality. With the exception of the mean Journal of the American Medical Association criteria score (1.5 vs 0.5), videos from academic sources did not demonstrate significantly higher levels of content quality. Only 4/59 videos (7%) discussed the natural history of nonsurgically treated DBT ruptures. Of the 32 videos that discussed surgical techniques, only 3/32 (9%) had a preference for 2-incision techniques. No videos discussed the association between spontaneous DBT ruptures and cardiac amyloidosis.

Conclusions

The overall content, quality, and reliability of DBT videos on YouTube are poor. Videos from academic sources do not provide higher-quality information than videos from nonacademic sources. Videos related to operative treatment of DBT ruptures more frequently discuss single-incision techniques.

Clinical relevance

Social media videos can function as direct-to-consumer marketing materials, and surgeons should be prepared to address misconceptions regarding the management of DBT tears. Patients are increasingly seeking health information online, and surgeons should direct patients toward more reliable and vetted sources of information.

Key words: Distal biceps repair, Distal biceps rupture, Patient education, Social media, YouTube


Distal biceps tendon (DBT) ruptures are a common upper-extremity injury, particularly among middle-aged men.1 The optimal management of this injury remains controversial. Nonsurgical treatment results in predictable decreases in supination and flexion strength; however, many patients are able to return to work and demonstrate near-normal functional outcome scores without surgery.2 Operative treatment is typically indicated for patients who would have functional limitations associated with decreased supination and flexion strength. Both single- and 2-incision surgical approaches have been described. Although the 2-incision approach can more reliably place the repaired tendon in the anatomic footprint on the biceps tuberosity, both techniques have resulted in similar functional outcomes.3, 4, 5, 6

In many instances, operative treatment of DBT ruptures represents an elective and discretionary procedure. In this context, management decisions often involve shared decision making between the patient and surgeon. Hand dominance, age, occupation, and the patient’s values and risk tolerance are a few of the factors that can drive decision making. Patients may seek information from a variety of sources both before and after their upper-extremity evaluation.7 Although not specific to DBT ruptures, prior studies analyzing online patient information for upper-extremity conditions have indicated that information from general websites can be of variable quality and that information from more vetted medical sources often exceeds recommended readability levels.8, 9, 10

Social media is increasingly used by upper-extremity patients to research their health conditions.11 YouTube remains the dominant online video platform and has over 2 billion users.12 Although YouTube hosts a variety of videos aimed at patient education and information, there is no peer-review process. Previous investigations analyzing YouTube videos related to rotator cuff tears, anterior cruciate ligament injuries, and total joint arthroplasty have indicated that these videos are often a poor source of patient information.13, 14, 15 Anecdotally, we commonly encounter patients with DBT ruptures who state, either at the time of their initial orthopedic visit or in the preoperative area, that they have already watched a number of YouTube videos. The quality, reliability, and overall content of these YouTube DBT videos remain uncertain.

The purpose of this investigation was to analyze the content and quality of YouTube videos related to DBT ruptures and repair. In addition, we aimed to compare differences between videos uploaded by academic versus nonacademic sources. We hypothesized that the overall video quality and content with respect to patient education would be poor.

Materials and Methods

Institutional review board exemption was obtained from Geisinger Health System for this investigation. We used a methodology similar to that of Ng et al15 for this study. To obtain videos related to DBT ruptures, we queried the YouTube platform on November 28, 2020. YouTube Health was not specifically searched. Four separate searches were performed using the terms “Distal Biceps Repair,” “Distal Biceps Surgery,” “Distal Biceps Rupture,” and “Distal Biceps Tear.” The top 40 videos from each search result (160 total videos) were compiled on a spreadsheet. We then excluded videos that were recorded in a language other than English, lacked audio, or were completely unrelated to distal biceps surgery. After these exclusions and the elimination of duplicates, 59 unique videos remained.

On the day the search was performed, we recorded the video source and the viewing characteristics. Similar to the approach used by Ng et al,15 video sources were categorized into 1 of 5 groups: academic, physician, nonphysician/trainer, patient, or commercial. For comparisons based on video source, we dichotomized the sources into academic and nonacademic videos, with the latter including all videos not within the academic designation. Additionally, video viewing characteristics were recorded, including the numbers of views, likes, dislikes, comments, and days since upload, as well as the video duration. The view ratio was defined as views/day. We also recorded the like ratio ([number of likes × 100] / [number of likes + number of dislikes]) and the video power (like ratio × view ratio / 100) for videos that had enabled comments and likes.15,16
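These viewing metrics reduce to simple arithmetic. As an illustrative sketch (the numeric values below are hypothetical examples, not data from this study):

```python
# Hypothetical example values for a single video (not study data).
views, likes, dislikes, days_since_upload = 9249, 40, 2, 1095

# View ratio: average views accrued per day since upload.
view_ratio = views / days_since_upload

# Like ratio: likes as a percentage of all like/dislike reactions.
like_ratio = (likes * 100) / (likes + dislikes)

# Video power: like ratio scaled by the view ratio.
video_power = like_ratio * view_ratio / 100
```

Note that the like ratio is undefined when a video has received no likes or dislikes, which is one reason these metrics were computed only for videos with likes and comments enabled.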

Two authors (B.K.F. and W.M.M.) independently viewed each of the included videos. The Journal of the American Medical Association (JAMA) Benchmark Criteria were applied to each video.17 The JAMA criteria rely on a binary scoring system and assess 4 areas of video reliability: Authorship, Attribution, Currency, and Disclosure (Appendix 1, available on the Journal’s website at www.jhsgo.org). These criteria allow both patients and medical professionals to judge the quality, reliability, and usefulness of the information being viewed. Ratings from 0 to 4 were recorded for each video, with higher scores indicating more reliable videos.

The quality of the video content was assessed using the DISCERN criteria.18 Reviewers accessed the DISCERN handbook while evaluating each video (http://www.discern.org.uk/discern_instrument.php). The DISCERN instrument is a widely used quality assessment tool and comprises 16 questions divided into 3 sections related to publication reliability, quality, and overall rating. Each question is scored on a scale from 1 to 5, resulting in a maximum score of 80. Scores are categorized as “very poor” (16–28), “poor” (29–41), “fair” (42–54), “good” (55–67), and “excellent” (68–80).18

Specific to this study, the senior author (L.C.G.), a fellowship-trained hand and upper-extremity surgeon, developed a Distal Biceps Content Score (DBCS), which was adapted from lower-extremity content scores used in prior investigations of YouTube video quality (Appendix 2, available on the Journal’s website at www.jhsgo.org).15,19 The DBCS analyzed 10 content points: patient presentation, patient populations, diagnosis, nonsurgical treatment, surgical options, surgical candidates, surgical approaches (single- vs 2-incision), rehab/recovery, postoperative restrictions, and complications related to surgery. Videos were scored from 0 to 10, with 10 indicating that all of the content points were discussed. Although discussion of the association between cardiac amyloidosis and spontaneous distal biceps ruptures was not part of the DBCS, the reviewers noted whether this association was discussed during the video.20 The reviewers also noted whether the video contained any references to brand-specific implants.

Statistics

Descriptive statistics were generated for video characteristics, such as the number of views, video duration, likes, and comments. Frequencies and percentages were reported for categorical variables. Means and standard deviations, as well as medians and interquartile ranges, where appropriate, were reported for continuous variables.

JAMA, DISCERN, and DBCS data were generated by averaging the 2 reviewers’ scores for each individual video. For example, if reviewer 1 determined that a video’s DISCERN score was 30 and reviewer 2 determined it was 32, the DISCERN score for that video was reported as 31. Cohen’s kappa was used to measure interrater reliability for the JAMA Benchmark Criteria and DBCS, whereas a weighted Cohen’s kappa was used to measure interrater reliability for the DISCERN score to account for the ordinal nature of the scoring criteria.21
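As a minimal sketch of this scoring approach (the helper functions below are illustrative, not the authors' code, and the kappa shown is the unweighted form):

```python
def average_scores(reviewer1, reviewer2):
    """Per-video score = mean of the two reviewers' ratings."""
    return [(a + b) / 2 for a, b in zip(reviewer1, reviewer2)]

def cohens_kappa(reviewer1, reviewer2):
    """Unweighted Cohen's kappa for two raters scoring the same items."""
    n = len(reviewer1)
    categories = set(reviewer1) | set(reviewer2)
    # Observed agreement: proportion of items with identical ratings.
    p_observed = sum(a == b for a, b in zip(reviewer1, reviewer2)) / n
    # Expected chance agreement from each rater's marginal frequencies.
    p_expected = sum(
        (reviewer1.count(c) / n) * (reviewer2.count(c) / n) for c in categories
    )
    return (p_observed - p_expected) / (1 - p_expected)

# The worked example from the text: DISCERN ratings of 30 and 32 average to 31.
assert average_scores([30], [32]) == [31.0]
```

A weighted kappa additionally gives partial credit for near-misses between ordinal ratings, which is why it suits the 1-to-5 DISCERN items; libraries such as scikit-learn expose this through a weights option on their kappa function.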

Intraclass correlation coefficients were used to measure interrater reliability in total scores between raters for each criterion. A Wilcoxon 2-sample (Mann-Whitney U) test was used to compare video characteristics with nonnormal distributions, and the independent 2-sample t test was used to compare characteristics with normal distributions, between the 2 types of video sources (ie, academic vs nonacademic). A P value of <.05 was considered statistically significant. The intraclass correlation coefficients indicated excellent agreement between the 2 raters for total JAMA scores (0.951), moderate agreement for total DISCERN scores (0.727), and good agreement for DBCS scores (0.783).

Prior to the initiation of the study, we performed an a priori sample size calculation. In the study by Ng et al15 of total knee arthroplasty videos, the mean DISCERN score was 51 overall, with a standard deviation of 10. We wanted to power our investigation to detect a difference of 10 points in the DISCERN score. Ten points was selected because this number approximates the gradations between DISCERN categorizations (very poor, poor, fair, good, and excellent). Using an alpha of 0.05 and power of 80% and assuming a 2:1 ratio of nonacademic to academic videos, we determined that a total of 36 videos would be required for analysis.
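This calculation can be reproduced with the standard normal-approximation formula for comparing two means under unequal allocation. The sketch below uses that textbook formula, not the authors' statistical software; the function and parameter names are our own:

```python
import math
from statistics import NormalDist

def two_sample_total_n(delta, sd, alpha=0.05, power=0.80, ratio=2.0):
    """Total sample size for a two-sample comparison of means.

    delta: detectable difference; sd: common standard deviation;
    ratio: allocation of larger to smaller group
    (here, nonacademic : academic = 2 : 1).
    """
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_beta = z.inv_cdf(power)           # quantile for the desired power
    # Size of the smaller (academic) group, rounded up.
    n_small = math.ceil((1 + 1 / ratio) * (sd * (z_alpha + z_beta) / delta) ** 2)
    n_large = math.ceil(ratio * n_small)
    return n_small + n_large
```

With delta = 10, sd = 10, and a 2:1 allocation, this yields 12 academic and 24 nonacademic videos, consistent with the 36-video requirement reported above.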

Results

A total of 59 unique YouTube videos were included in our analysis. Table 1 provides video characteristics for each of the included videos. Videos uploaded by nonacademic physicians were the most common video source (31%), followed by videos uploaded by academic centers (25%). Table 2 presents the mean JAMA, DISCERN, and DBCS scores. For the DISCERN score, only 5% of videos were rated as “fair,” with no videos rated as either “good” or “excellent.” Table 3 compares videos from academic and nonacademic sources. Only the mean JAMA criteria score was significantly higher for academic videos than for nonacademic videos (1.5 vs 0.5; P < .05).

Table 1.

Video Characteristics for All Included Distal Biceps YouTube Videos

Video Characteristic, All Videos (N = 59)
Video source, n (%)
 Academic 15 (25%)
 Physician 18 (31%)
 Nonphysician/trainer 10 (17%)
 Patient 10 (17%)
 Commercial 6 (10%)
Video duration in min
 Mean (SD) 7.3 (6)
 Median (IQR) 6.0 (3.0–9.0)
Views
 Mean (SD) 45,265 (107,483)
 Median (IQR) 9,249 (2,140–39,036)
Days since upload
 Mean (SD) 1,309 (966)
 Median (IQR) 1,095 (606–1,682)
View ratio
 Mean (SD) 43 (99)
 Median (IQR) 11 (2.2–26)
Likes
 Mean (SD) 220 (433)
 Median (IQR) 40 (7–234)
Dislikes
 Mean (SD) 13 (30)
 Median (IQR) 2.0 (0–9.0)
Like ratio
 Mean (SD) 95 (6)
 Median (IQR) 96 (93–100)
Video power
 Mean (SD) 42 (97)
 Median (IQR) 9 (2.2–21)
Comments
 Mean (SD) 40 (70)
 Median (IQR) 11 (0–59)

IQR, interquartile range.

Table 2.

Journal of the American Medical Association, DISCERN, and DBCS Scores for All 59 Included Distal Biceps YouTube Videos

Content, Quality, and Reliability Value
JAMA criteria score, mean (SD) 0.7 (0.8)
DBCS, mean (SD) 2.4 (1.9)
DISCERN score, mean (SD) 28.6 (7.4)
DISCERN score category, n (%)
 Very poor (16–28) 35 (59%)
 Poor (29–41) 21 (36%)
 Fair (42–54) 3 (5%)
 Good (55–67) 0 (0%)
 Excellent (68–80) 0 (0%)

Table 3.

Comparison of Characteristics Between Academic and Nonacademic Videos

Video Characteristic, Academic Videos (N = 15 [25%]), Nonacademic Videos (N = 44 [75%]), P Value
Video duration in min
 Mean (SD) 8.2 (7.3) 7.0 (6.2) .23
 Median (IQR) 7.0 (5.0–9.0) 5.0 (3.0–8.5)
Views
 Mean (SD) 111,900 (190,142) 22,548 (41,647) .08
 Median (IQR) 22,759 (3,040–148,916) 8,868 (1,823–19,977)
Days since upload
 Mean (SD) 1,925 (1,026) 1,099 (860) <.05
 Median (IQR) 1,798 (1,452–2,886) 1,009 (484–1,409)
View ratio
 Mean (SD) 55 (83) 39 (106) .28
 Median (IQR) 15.7 (2.3–91) 9.3 (2.2–19)
Likes
 Mean (SD) 396 (693) 160 (286) .94
 Median (IQR) 26 (7–639) 45 (7–233)
Dislikes
 Mean (SD) 26 (51) 9 (16) .89
 Median (IQR) 2.0 (0–33.0) 2.0 (0–8.5)
Like ratio
 Mean (SD) 94 (8) 95 (6) .46
Video power
 Mean (SD) 56 (80) 38 (102) .27
 Median (IQR) 15 (3.0–88) 8 (1.9–18)
Comments
 Mean (SD) 54 (118) 35 (44) .62
 Median (IQR) 2.0 (0–64.0) 17.5 (0–54.5)
JAMA criteria score, mean (SD) 1.5 (0.8) 0.5 (0.7) <.05
DISCERN score, mean (SD) 30.0 (8.5) 28.1 (7.0) .38
DBCS, mean (SD) 2.7 (1.8) 2.2 (1.9) .39

IQR, interquartile range.

Only 4/59 videos (7%) discussed the natural history of nonsurgically treated DBT ruptures. All 6 (100%) of the videos uploaded by a commercial source contained references to brand-specific products, compared to 3/53 (6%) videos uploaded by noncommercial sources. Of the 32 videos that discussed or demonstrated surgical techniques, only 3/32 (9%) had a preference for the 2-incision technique. Overall, videos covered a mean of 2.4 content areas on the DBCS. Only 18% of videos discussed surgical complications. No videos discussed the association between spontaneous DBT ruptures and cardiac amyloidosis.

Discussion

Overall, the content and quality of YouTube videos pertaining to DBT ruptures are poor. Social media is increasingly used by upper-extremity patients to research their health conditions, and YouTube contains a large volume of medically oriented videos.11,12 Unfortunately, despite their popularity, these videos appear to contain poor-quality patient information. Our results are similar to those of previous investigations analyzing YouTube videos related to rotator cuff tears, lower-extremity injuries, and arthroplasty, which have indicated that these videos can be a poor source of patient information.13, 14, 15,22 In a recent analysis of total hip and knee arthroplasty videos on YouTube, Ng et al15 found that videos from academic sources demonstrated higher quality than those from nonacademic sources. Our findings stand in contrast to those reported by Ng et al,15 as DBT videos uploaded by academic sources did not have higher content or quality scores. Additionally, the overall quality and content of DBT videos are lower than those for total knee arthroplasty videos (mean DISCERN score of 29 vs 51, respectively).15 For DBT videos, the top 3 videos with respect to views were all from academic sources and had DISCERN scores of 30 or less (poor).

YouTube videos related to DBT tears and operative treatment demonstrate low content quality. Discussions of the natural history of nonsurgical treatment of DBT ruptures were infrequent, and no videos discussed the association between DBT ruptures and cardiac amyloidosis, which can be found in 33% of patients with spontaneous DBT ruptures.20 Less than 10% of videos that discussed surgical treatment had a preference for the 2-incision technique. We believe this focus on operative management (particularly the single-incision approach) may be related to the influence of orthopedic industry partners, who more frequently develop implants and instrumentation for a single-incision anterior approach.

A central issue related to poor video content and quality is the impact this information can have on shared decision making. It can be easy to dismiss findings of poor or inaccurate online health information and simply suggest that patients seek other sources; however, surgeons may underestimate the impact of these internet sources on treatment decisions. Patients often seek information prior to their orthopedic visit.7 Upper-extremity patients are increasingly seeking information from social media and online sources, and this trend is likely to increase.11 In some cases, the YouTube videos serve as marketing material, which is not dissimilar to direct-to-consumer advertising employed by pharmaceutical makers. With respect to pharmaceutical prescriptions, direct-to-consumer advertising leads to more requests (and more prescriptions) for the advertised medications.23 This relationship is less clear with respect to direct-to-consumer marketing of orthopedic implants; however, we have noted multiple patient conversations related to YouTube video information.24 In this context, cognitive bias (in particular the anchoring effect) can impact decision making for patients.25 Similar to what is observed in patients with rotator cuff tears (the idea that the patient has a “tear” and that “it needs to be fixed”), these misconceptions can be reinforced by online information sources.26 Ultimately, the surgeon’s role is to aid in reorienting or redirecting patients with misconceptions regarding their DBT injury or treatment options; however, this can be more difficult in the face of poor online information. In addition to directing patients toward reliable online resources, professional organizations should aim to produce high-quality educational videos and distribute them on social media, where patients frequently seek information.

This investigation has a number of limitations that should be considered. First, our findings are specific to the content quality of YouTube videos, and it is uncertain whether they are generalizable to other online sources of patient information for DBT ruptures. A Google search or a query of patient-related American Academy of Orthopaedic Surgeons or American Society for Surgery of the Hand videos may have revealed additional videos. Second, limiting the search to a single date may have impacted our results, given the rapidly changing nature of the internet. Third, although the DISCERN instrument has been used in a number of prior investigations related to patient-education videos, it was initially designed for written content rather than video.15,16,19 Prior reviews have identified high variability in assessment methodology across studies of medically related YouTube videos, and it is uncertain how our results would have differed had we used alternative video scoring systems.27 Fourth, although similar content scores have been used in prior investigations of YouTube videos, the DBCS is not a validated assessment and was not created from structured patient interviews or surveys. Rather, it contained content points that were deemed to be important by the investigators. Fifth, our investigation did not include an assessment of “readability” scores for these videos. Additionally, we did not assess each video’s intended audience. It is possible that surgical technique videos or videos uploaded for surgeon training may have lower quality and reliability scores with respect to patients, since patients were not the intended audience. Future prospective investigations should address the relationship between preconsultation online information, cognitive bias, and decision making, as well as assess in a systematic fashion where patients find educational materials at home.

In conclusion, the overall content, quality, and reliability of DBT videos on YouTube are poor. Videos from academic sources do not appear to provide higher-quality information than videos from nonacademic sources. Videos related to operative treatment of DBT ruptures favor single-incision techniques. Social media videos can function as direct-to-consumer marketing materials, and surgeons should be prepared to address potential misconceptions regarding the management of DBT tears. Patients are increasingly seeking health information online, and surgeons should direct patients toward more reliable and vetted sources of information.

Footnotes

Declaration of interests: No benefits in any form have been received or will be received related directly or indirectly to the subject of this article.

Supplementary Data

Appendix 1
mmc1.docx (13KB, docx)
Appendix 2
mmc2.docx (14.2KB, docx)

References

1. Srinivasan R.C., Pederson W.C., Morrey B.F. Distal biceps tendon repair and reconstruction. J Hand Surg Am. 2020;45(1):48–56. doi: 10.1016/j.jhsa.2019.09.014.
2. Freeman C.R., McCormick K.R., Mahoney D., Baratz M., Lubahn J.D. Nonoperative treatment of distal biceps tendon ruptures compared with a historical control group. J Bone Joint Surg Am. 2009;91(10):2329–2334. doi: 10.2106/JBJS.H.01150.
3. Ford S.E., Andersen J.S., Macknet D.M., Connor P.M., Loeffler B.J., Gaston R.G. Major complications after distal biceps tendon repairs: retrospective cohort analysis of 970 cases. J Shoulder Elbow Surg. 2018;27(10):1898–1906. doi: 10.1016/j.jse.2018.06.028.
4. Grewal R., Athwal G.S., Macdermid J.C., et al. Single versus double-incision technique for the repair of acute distal biceps tendon ruptures: a randomized clinical trial. J Bone Joint Surg Am. 2012;94(13):1166–1174. doi: 10.2106/JBJS.K.00436.
5. Hasan S.A., Cordell C.L., Rauls R.B., Bailey M.S., Sahu D., Suva L.J. Two-incision versus one-incision repair for distal biceps tendon rupture: a cadaveric study. J Shoulder Elbow Surg. 2012;21(7):935–941. doi: 10.1016/j.jse.2011.04.027.
6. Hansen G., Smith A., Pollock J.W., et al. Anatomic repair of the distal biceps tendon cannot be consistently performed through a classic single-incision suture anchor technique. J Shoulder Elbow Surg. 2014;23(12):1898–1904. doi: 10.1016/j.jse.2014.06.051.
7. Rao A.J., Dy C.J., Goldfarb C.A., Cohen M.S., Wysocki R.W. Patient preferences and utilization of online resources for patients treated in hand surgery practices. Hand (N Y). 2019;14(2):277–283. doi: 10.1177/1558944717744340.
8. Dy C.J., Taylor S.A., Patel R.M., Kitay A., Roberts T.R., Daluiski A. The effect of search term on the quality and accuracy of online information regarding distal radius fractures. J Hand Surg Am. 2012;37(9):1881–1887. doi: 10.1016/j.jhsa.2012.05.021.
9. Hadden K., Prince L.Y., Schnaekel A., Couch C.G., Stephenson J.M., Wyrick T.O. Readability of patient education materials in hand surgery and health literacy best practices for improvement. J Hand Surg Am. 2016;41(8):825–832. doi: 10.1016/j.jhsa.2016.05.006.
10. Wang S.W., Capo J.T., Orillaza N. Readability and comprehensibility of patient education material in hand-related web sites. J Hand Surg Am. 2009;34(7):1308–1315. doi: 10.1016/j.jhsa.2009.04.008.
11. Grandizio L.C., Pavis E.J., Caselli M.E., et al. Technology, social media, and telemedicine utilization for rural hand and upper-extremity patients. J Hand Surg Am. 2021;46(4):301–308.e1. doi: 10.1016/j.jhsa.2020.11.019.
12. YouTube. YouTube for press. http://www.youtube.com/yt/press/statistics.html
13. Cassidy J.T., Fitzgerald E., Cassidy E.S., et al. YouTube provides poor information regarding anterior cruciate ligament injury and reconstruction. Knee Surg Sports Traumatol Arthrosc. 2018;26(3):840–845. doi: 10.1007/s00167-017-4514-x.
14. Celik H., Polat O., Ozcan C., Camur S., Kilinc B.E., Uzun M. Assessment of the quality and reliability of the information on rotator cuff repair on YouTube. Orthop Traumatol Surg Res. 2020;106(1):31–34. doi: 10.1016/j.otsr.2019.10.004.
15. Ng M.K., Emara A.K., Molloy R.M., Krebs V.E., Mont M., Piuzzi N.S. YouTube as a source of patient information for total knee/hip arthroplasty: quantitative analysis of video reliability, quality, and content. J Am Acad Orthop Surg. 2021;29(20):e1034–e1044. doi: 10.5435/JAAOS-D-20-00910.
16. Erdem M.N., Karaca S. Evaluating the accuracy and quality of the information in kyphosis videos shared on YouTube. Spine. 2018;43(22):E1334–E1339. doi: 10.1097/BRS.0000000000002691.
17. Silberg W.M., Lundberg G.D., Musacchio R.A. Assessing, controlling, and assuring the quality of medical information on the internet: caveant lector et viewor—let the reader and viewer beware. JAMA. 1997;277(15):1244–1245.
18. Charnock D., Shepperd S., Needham G., Gann R. DISCERN: an instrument for judging the quality of written consumer health information on treatment choices. J Epidemiol Community Health. 1999;53(2):105–111. doi: 10.1136/jech.53.2.105.
19. Kunze K.N., Cohn M.R., Wakefield C., et al. YouTube as a source of information about the posterior cruciate ligament: a content-quality and reliability analysis. Arthrosc Sports Med Rehabil. 2019;1(2):109–114. doi: 10.1016/j.asmr.2019.09.003.
20. Geller H.I., Singh A., Alexander K.M., Mirto T.M., Falk R.H. Association between ruptured distal biceps tendon and wild-type transthyretin cardiac amyloidosis. JAMA. 2017;318(10):962–963. doi: 10.1001/jama.2017.9236.
21. Ademiluyi G., Rees C.E., Sheard C.E. Evaluating the reliability and validity of three tools to assess the quality of health information on the Internet. Patient Educ Couns. 2003;50(2):151–155. doi: 10.1016/s0738-3991(02)00124-6.
22. Kunze K.N., Krivicich L.M., Verma N.N., Chahla J. Quality of online video resources concerning patient education for the meniscus: a YouTube-based quality-control study. Arthroscopy. 2020;36(1):233–238. doi: 10.1016/j.arthro.2019.07.033.
23. Mintzes B., Barer M.L., Kravitz R.L., et al. How does direct-to-consumer advertising (DTCA) affect prescribing? A survey in primary care environments with and without legal DTCA. CMAJ. 2003;169(5):405–412.
24. Schaffer J.L., Bozic K.J., Dorr L.D., Miller D.A., Nepola J.V. AOA symposium: direct-to-consumer marketing in orthopaedic surgery: boon or boondoggle? J Bone Joint Surg. 2008;90(11):2534–2543. doi: 10.2106/JBJS.G.00309.
25. Janssen S.J., Teunis T., Ring D., Parisien R.C. Cognitive biases in orthopaedic surgery. J Am Acad Orthop Surg. 2021;29(14):624–633. doi: 10.5435/JAAOS-D-20-00620.
26. Malliaras P., Rathi S., Burstein F., et al. ‘Physio’s not going to repair a torn tendon’: patient decision-making related to surgery for rotator cuff related shoulder pain. Disabil Rehabil. 2021:1–8. doi: 10.1080/09638288.2021.1879945.
27. Drozd B., Couvillon E., Suarez A. Medical YouTube videos and methods of evaluation: literature review. JMIR Med Educ. 2018;4(1):e3. doi: 10.2196/mededu.8527.
