
Assessment of reliability and information quality of YouTube videos about root canal treatment after 2016

Abstract

Background

This study aimed to assess and update the reliability and information quality of content related to root canal treatment (RCTx) on YouTube and to evaluate the correlations among the evaluation indices.

Methods

YouTube was searched using two terms related to RCTx (“root canal treatment” and “endodontic treatment”). A total of 240 videos (120 for each search term) were screened. Exclusion criteria were as follows: no sound or visuals, non-English, irrelevant to the search term, longer than 15 min, duplicate, or old (uploaded before 2016). After exclusion, 50 “root canal treatment” videos and 45 “endodontic treatment” videos were analyzed. Video length, total number of views, likes, dislikes, comments, and days since upload were recorded as descriptive video data. Viewer interaction, the reliability and information quality of the videos, and the quality of the video content were assessed as nondescriptive video data. The interaction index and video power index were used to measure viewer interaction, and the modified DISCERN index, JAMA criteria, and Global Quality Score were used to assess the reliability and information quality of the videos. The quality of the video content was measured using a completeness score.

Results

Videos in the “root canal treatment” group had significantly higher completeness scores for etiology and symptoms (p < 0.05), whereas videos in the “endodontic treatment” group showed a higher interaction index and a higher completeness score for the procedure (p < 0.05). Videos for dentists had significantly higher completeness scores for the procedure, while videos for laypersons had higher completeness scores for etiology, anatomy, symptoms, and prognosis (p < 0.05). Furthermore, the total completeness score and the interaction index of the videos for laypersons were significantly higher (p < 0.05). Videos uploaded by universities had a significantly higher modified DISCERN index (p = 0.044), and the JAMA score was significantly higher in the commercial group (p = 0.001).

Conclusions

Although the accuracy of videos related to RCTx was higher in videos by universities and professionals, the total completeness of YouTube videos was low regardless of the video source. Therefore, professionals should be responsible for providing more accurate and reliable videos.


Background

Conventionally, health-related information is provided through direct communication with health care providers. However, with the growth of information technology, access to the Internet has become easier than in the past, and as a result, 80% of Internet users obtain medical or dental information through online searches [1].

YouTube is the most widely used content hosting site, where users can freely upload videos, and it is the second most visited website after Google [2]. YouTube videos are played more than 5 billion times daily, with an average viewing time of at least 15 minutes a day, and more than 500 hours of new content are uploaded every minute [3]. Importantly, YouTube videos can be uploaded without any verification of their accuracy and can be accessed by anyone with an account. Such videos therefore need to be evaluated, because they may contain inaccurate or misleading information. Several video evaluation tools exist to assess reliability and educational quality, such as the modified DISCERN, the Journal of American Medical Association (JAMA) score, and the Global Quality Score (GQS).

Root canal treatment (RCTx), a very common dental procedure, preserves natural teeth by removing bacteria and cleaning the infected root canal to prevent reinfection. It is difficult to know exactly how many root canal treatments are performed in everyday practice, but data from the National Health Insurance Company indicate that RCTx is performed on hundreds of thousands of teeth every year in the USA [4]. Given how commonly the procedure is performed, there is no doubt that many people search YouTube for information about it. Several studies have evaluated YouTube videos in the field of endodontics, covering topics such as instrument separation, pulpotomy, and pulp capping [5, 6]. One study evaluated the completeness of RCTx videos uploaded to YouTube for patients [7], but no studies have assessed the popularity, reliability, and quality of RCTx videos aimed at both dentists and laypersons. Additionally, ongoing video evaluation is essential, as many YouTube videos are uploaded daily.

Therefore, the purpose of this study was to assess and update the reliability and information quality of content related to RCTx on YouTube and to evaluate the correlations among the scoring tools.

Methods

Video selection

On November 13, 2021, we searched YouTube for relevant videos using two related search terms: root canal treatment and endodontic treatment. The search was carried out in Google Chrome after deleting cookies and the cache, using the default settings without filters. Previous studies have shown that 90% of viewers watch videos within the top 60 search results [2, 8]; therefore, the top 120 videos for each search term were selected and screened (a total of 240 videos).

Since the number of YouTube users has increased significantly since 2016, when a previous study assessing YouTube content on RCTx was reported [7], only videos uploaded after 2016 were chosen. The exclusion criteria were as follows: no sound or visuals, non-English, irrelevant to the search term, longer than 15 min, duplicate, or old (uploaded before 2016). Previous studies have shown that videos longer than 15 min are less likely to attract YouTube users [5, 6, 9], so only videos shorter than 15 min were selected. Ethical committee approval was waived, as this study was conducted using published online videos.

Video assessment

A second-year resident in the endodontic department evaluated the videos and their characteristics. Thirty randomly selected videos for each search term were reanalyzed by the same observer after a 2-month interval to evaluate intraobserver agreement. The following features of each video were recorded: length of the video, total number of views, likes, dislikes, comments, and days since upload.

The video upload sources were divided into five categories: 1) dentists, 2) specialists, 3) commercials, 4) universities, and 5) others. The videos were divided into two groups depending on the subject: 1) information for dentists and 2) information for laypersons. The video forms were categorized into three groups: 1) real procedure, 2) clinician explanation, and 3) animation. If the video forms overlapped, the main video form was selected.

After categorization of the videos, viewers’ interest was calculated using the interaction index ((number of likes − number of dislikes) / total number of views × 100%) [5] and the video power index (VPI) (view ratio × like ratio / 100) [10], where view ratio = total number of views / days since upload and like ratio = (likes × 100) / (likes + dislikes).
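As a worked illustration, the following minimal Python sketch computes the two popularity indices exactly as defined above; the function names, variable names, and example numbers are ours for illustration and do not come from the study.

def interaction_index(likes: int, dislikes: int, views: int) -> float:
    # Interaction index = (likes − dislikes) / total views × 100%
    return (likes - dislikes) / views * 100

def video_power_index(likes: int, dislikes: int, views: int, days_since_upload: int) -> float:
    # VPI = view ratio × like ratio / 100, where
    # view ratio = views / days since upload and like ratio = likes × 100 / (likes + dislikes)
    view_ratio = views / days_since_upload
    like_ratio = likes * 100 / (likes + dislikes)
    return view_ratio * like_ratio / 100

# Hypothetical video: 1,000 likes, 50 dislikes, 200,000 views, online for 400 days
print(interaction_index(1000, 50, 200000))       # 0.475
print(video_power_index(1000, 50, 200000, 400))  # about 476.2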

To assess the content of the videos, the investigator rated the completeness of each video numerically (on a scale of 0–2: 0, not mentioned; 1, briefly introduced; 2, introduced in detail) for each of six content domains: “etiology,” “anatomy,” “symptoms,” “procedure,” “postoperative course,” and “prognosis,” giving a maximum score of 12 [7].
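The completeness score is simply the sum over the six domains; the sketch below (with hypothetical ratings and names we chose ourselves) only makes that arithmetic explicit.

DOMAINS = ("etiology", "anatomy", "symptoms", "procedure", "postoperative course", "prognosis")

def total_completeness(ratings: dict) -> int:
    # Each domain is rated 0 (not mentioned), 1 (briefly introduced), or 2 (in detail);
    # the total completeness score therefore ranges from 0 to 12.
    return sum(ratings.get(domain, 0) for domain in DOMAINS)

example = {"etiology": 1, "symptoms": 2, "procedure": 2, "postoperative course": 1}
print(total_completeness(example))  # 6 out of a maximum of 12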

The three evaluation tools used in this study are illustrated in Fig. 1. The modified DISCERN index [11] was selected as a scoring system to evaluate reliability and accuracy on a five-point scale. One point each was allocated for concision, reliability, balance, references, and uncertainty, with higher scores indicating greater reliability. Similarly, video reliability was evaluated using the JAMA criteria [12]. The JAMA criteria collectively evaluate authorship, attribution, disclosure, and currency. Each criterion was scored on a scale of 0–1, and the JAMA score was calculated as the total, ranging from 0 to 4, with a maximum score of 4 indicating the highest video reliability.
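Because the JAMA benchmark is an additive checklist, it can be written compactly; the hedged sketch below follows the description above, with criterion names taken from the text and a hypothetical example video.

JAMA_CRITERIA = ("authorship", "attribution", "disclosure", "currency")

def jama_score(criteria_met: dict) -> int:
    # One point per criterion that the video satisfies; the total ranges from 0 to 4.
    return sum(1 for criterion in JAMA_CRITERIA if criteria_met.get(criterion, False))

print(jama_score({"authorship": True, "currency": True}))  # 2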

Fig. 1 Modified DISCERN, JAMA and GQS. JAMA, Journal of American Medical Association; GQS, Global Quality Score

The quality of the educational information was assessed using the GQS [13] on a scale of 1–5, where 5 indicates that the quality of the information is excellent for the viewer.

Statistical analysis

SPSS Statistics software (version 16.0; SPSS, Inc., Chicago, IL, USA) was used for statistical analysis. The Shapiro-Wilk and Kolmogorov-Smirnov tests were used to assess the normality of the quantitative data (length of the video, number of total views, likes, dislikes, comments, days since upload, interaction index, VPI, modified DISCERN index, JAMA score, and GQS), and none of the parameters were normally distributed (p < 0.05). Comparisons between two groups for continuous variables with non-normal distributions were therefore made with the Mann-Whitney U test, and the Kruskal-Wallis test was used to compare three or more groups. Pearson’s correlation coefficient was used to examine possible correlations among the completeness score, modified DISCERN index, JAMA score, GQS, interaction index, and VPI.
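For readers who prefer open-source tools, the same pipeline can be reproduced with SciPy in place of SPSS; the sketch below uses randomly generated placeholder data rather than the study's measurements, so the printed statistics are purely illustrative.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.exponential(scale=300, size=50)  # e.g., video lengths for one search term
group_b = rng.exponential(scale=350, size=45)  # e.g., video lengths for the other term

# Normality checks (the study found non-normal distributions, p < 0.05)
print(stats.shapiro(group_a))
print(stats.kstest(group_a, "norm", args=(group_a.mean(), group_a.std())))

# Two-group comparison for non-normally distributed data
print(stats.mannwhitneyu(group_a, group_b))

# Comparison of three or more groups (e.g., by video source)
group_c = rng.exponential(scale=320, size=30)
print(stats.kruskal(group_a, group_b, group_c))

# Correlation between two rating scores (e.g., total completeness vs. GQS)
score_x = rng.integers(0, 13, size=95)
score_y = rng.integers(1, 6, size=95)
print(stats.pearsonr(score_x, score_y))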

The intraobserver agreement of the rating scores was assessed using the intraclass correlation coefficient; values of > 0.8, 0.6–0.8, 0.4–0.6, 0.2–0.4, and < 0.2 represent “excellent,” “very good,” “good,” “questionable,” and “unacceptable” agreement, respectively. Statistical significance was set at p < 0.05.
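A small helper makes the interpretation of these thresholds explicit; the function name is ours, and the thresholds simply restate the categories listed above.

def icc_agreement(icc: float) -> str:
    # Map an intraclass correlation value to the agreement category used in this study.
    if icc > 0.8:
        return "excellent"
    if icc >= 0.6:
        return "very good"
    if icc >= 0.4:
        return "good"
    if icc >= 0.2:
        return "questionable"
    return "unacceptable"

print(icc_agreement(0.856))  # "excellent" (the value reported for the "root canal treatment" group)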

Results

In this study, a total of 240 videos (120 for each search term) were screened; 70 “root canal treatment” videos and 75 “endodontic treatment” videos were excluded. The reasons for exclusion are listed in Table 1. After screening, 95 videos were analyzed. According to the intraclass correlation, the intraobserver agreement of the rating scores between the two evaluation times was 0.856 for “root canal treatment” and 0.810 for “endodontic treatment.” Descriptive data by search term, including length of the video, number of total views, likes, dislikes, comments, days since upload, interaction index, and VPI, are shown in Table 2. The average duration of all videos was 320 s. The average numbers of total views, likes, dislikes, and comments across all videos were 211,905, 1065, 83, and 160, respectively. The mean time since upload was 895 days. “Root canal treatment” videos had a significantly higher number of comments (p < 0.05), and “endodontic treatment” videos were significantly longer (p < 0.05). However, the numbers of total views, likes, and dislikes did not differ significantly between the search terms (p > 0.05).

Table 1 Video exclusion reasons for each search term
Table 2 Descriptive data by search term

The characteristics of the videos, including the source, subject, and form, are summarized in Table 3. Most of the videos were uploaded by a dentist or specialist (> 70%). The main subject of the videos found with “root canal treatment” was information for laypersons (60%), whereas that of the “endodontic treatment” videos was information for dentists (82%). Finally, the most common video form for “root canal treatment” was a clinician’s explanation (38%), followed by a real procedure (34%) and animation (28%), whereas the most common form for “endodontic treatment” was a real procedure (62%).

Table 3 Video characteristics

The viewers’ interaction, completeness, and reliability scores by search terms are presented in Table 4. Evaluations based on video search terms showed that “root canal treatment” had significantly higher etiology and symptom scores (p < 0.05), and “endodontic treatment” had a higher interaction index and procedure score (p < 0.05). However, there were no significant differences in VPI, completeness score (anatomy, postoperative course, and prognosis), modified DISCERN index, JAMA score, and GQS (p > 0.05).

Table 4 Interaction index, VPI, completeness score, and reliability scores by search term

The completeness, viewers’ interaction, and reliability scores by video source, video subject, and video form are reported in Tables 5, 6, and 7. According to the video form, the procedure score was significantly higher for real procedures (p < 0.05), the etiology, symptoms, and prognosis scores were higher for clinician explanations (p < 0.05), and the anatomy score was higher for animations (p = 0.024). Videos for dentists had significantly higher procedure scores, while videos for laypersons had higher etiology, anatomy, symptoms, and prognosis scores (p < 0.05). Furthermore, the total completeness score and interaction index of the videos for laypersons were significantly higher (p < 0.05). Videos uploaded by universities had a significantly higher modified DISCERN index (p < 0.05), and the JAMA score was significantly higher for commercial sources (p < 0.05). The correlations among all scores, comprising the total completeness score, VPI, interaction index, modified DISCERN index, JAMA score, and GQS, are shown in Table 8. The total completeness score showed a high correlation with the GQS (r = 0.654) and the modified DISCERN index (r = 0.676), and the GQS showed a high correlation with the modified DISCERN index (r = 0.728). The JAMA score showed a low correlation with the modified DISCERN index (r = 0.264).

Table 5 Completeness score by video source, video subject, and video form
Table 6 Total completeness score, interaction index, and VPI by video source, video subject, and video form
Table 7 Reliability scores by video source, video subject, and video form
Table 8 Correlation of the scores (r)

Discussion

Although dentists and laypersons search for information with different purposes, both are interested in RCTx, which is commonly performed in dental practice. Several sources, such as papers, conferences, Internet searches, and social media, provide information on RCTx. YouTube, the second most visited website, is easily accessible, often provides information to people who want to learn about health-related topics, and also lets people upload their own opinions. As YouTube content is not evaluated by experts in the relevant field, the reliability and educational quality of health content has become an important issue [14]. The availability and precision of the information on YouTube have been questioned, as the accuracy of uploaded videos cannot be confirmed [15]. Because of this easy accessibility, users may encounter both useful and misleading information. One study reported that 33% of Internet users believed that online medical information is accurate [16]. Therefore, it is necessary to evaluate the videos uploaded to YouTube. Accordingly, several YouTube studies in dentistry have been conducted on topics such as bleaching, dental trauma, early childhood caries, clear orthodontic aligners, and cleft lip [17,18,19,20,21].

In this study, broad terms (“root canal treatment” and “endodontic treatment”) were selected as search terms, because more detailed search terms would narrow the results to content on specific procedures.

In the analysis of the descriptive data and characteristics of the videos in this study, the most viewed video had over 5 million views, but there were also videos with fewer than 100 views. Although older videos would generally be expected to have more views, there was no correlation between total views and days since upload in this study (r = −0.07). The number of likes and dislikes was positively correlated with the total number of views (r > 0.09). There were no differences in the video sources according to the search terms. When YouTube was searched using the term “endodontic treatment,” the main subjects of the videos were dentists (82%), and the majority of the video forms were actual clinical procedures (62%). The average length of the “endodontic treatment” videos was also significantly longer than that of the “root canal treatment” videos. This suggests that videos found with the search term “endodontic treatment” contain more detail on the RCTx procedure than those found with “root canal treatment.”

Several studies have evaluated the popularity of videos using the viewing rate, interaction index, and VPI [22, 23]. In this study, the interaction index and VPI were used to assess the popularity of the videos; the VPI was used instead of the viewing rate because it reflects a similar ratio [24]. The interaction index was significantly higher, almost twice as high, for “endodontic treatment” (1.45 ± 1.52) than for “root canal treatment” (0.75 ± 0.58). This suggests that viewers engage more with videos found by searching “endodontic treatment” than with those found by searching “root canal treatment.”

A previous study of RCTx content on YouTube [7] reported that the completeness score in the “root canal treatment” group was highest for the procedure (1.15 ± 0.67), followed by etiology (0.80 ± 0.62), symptoms (0.65 ± 0.81), postoperative course (0.65 ± 0.75), anatomy (0.50 ± 0.69), and prognosis (0.25 ± 0.44). Meanwhile, the score in the “endodontics” group was highest for the procedure (0.75 ± 0.55), followed by anatomy (0.50 ± 0.61), postoperative course (0.2 ± 0.52), etiology (0.15 ± 0.36), symptoms (0.15 ± 0.49), and prognosis (0.1 ± 0.31). This is consistent with the results of the present study, except that the completeness of the anatomy content has improved. The significant differences between the two search terms in the completeness of etiology, symptoms, and procedure in the current study indicate that “root canal treatment” videos were made mainly for the general public, who wonder why they need treatment and what the symptoms are like, whereas “endodontic treatment” videos were mainly uploaded for dentists who want to know the specific treatment procedure.

The average overall completeness score in this study was 4 ± 2.33, a low level of approximately one-third of the maximum. This means that the video information related to RCTx is insufficient and may give YouTube viewers an incomplete or incorrect picture. Furthermore, the completeness scores of the video content did not differ significantly by upload source (p > 0.05). This is inconsistent with the results of previous studies, in which videos by professionals were more beneficial [25, 26]. Since videos by professionals focus on the procedure itself rather than basic knowledge, their completeness is not necessarily greater. Therefore, other evaluation tools are required to analyze professionalism.

Three video evaluation scoring tools were used for objective analysis. The JAMA score focuses on evaluating the reliability of a video, while the modified DISCERN index evaluates both accuracy and reliability. Additionally, the GQS grades the educational value of a video. In this study, the mean modified DISCERN index by video source was 2.57 ± 1.13, 2.41 ± 0.98, and 1.93 ± 1.27 for universities, specialists, and dentists, respectively, which were significantly higher than those of non-professional sources (p < 0.05). Videos by professionals were considered to contain clearer references and therefore to have higher accuracy. Furthermore, the JAMA score of the commercial videos (3.67 ± 0.52) was the highest, which suggests that more reliable information was included in these videos to enhance their advertising effect.

In the current study, the correlations between the evaluation tools were analyzed. The total completeness score, modified DISCERN index, and GQS showed high correlations with one another (0.6 ≤ r < 0.8). The interaction index and VPI, which indicate the popularity of a video, were not correlated with the other rating tools (−0.2 < r < 0.2). This means that even if a video is popular and has many viewers, its quality may not be good. However, more studies are needed, because few studies have investigated the correlations between video evaluation indices, and no standardized tool for comprehensive video evaluation has been established. A limitation of this study is that only the top 120 videos per search term were analyzed, so not all YouTube videos regarding root canal treatment could be evaluated. In addition, because new videos are continuously uploaded to YouTube, only analysis at a specific point in time is possible; therefore, ongoing evaluation of YouTube videos is required. Finally, in addition to YouTube, information on other widely used social media platforms, such as Twitter, Facebook, and Instagram, should also be evaluated, and the evaluation tools used here can be applied to such analyses.

Conclusion

Although the accuracy of videos related to RCTx was higher for videos by universities and professionals, the total completeness of YouTube videos was low regardless of the video source. Therefore, it is important for professionals to provide more accurate and reliable videos to reduce the risk of viewers being misinformed. In addition, as there is no standardized index for evaluating YouTube videos, future studies should be directed toward the development of new evaluation tools for more objective and comprehensive video evaluation.

Availability of data and materials

The datasets used and analyzed during the study are available from the corresponding author upon reasonable request.

Abbreviations

RCTx:

Root canal treatment

JAMA:

Journal of American Medical Association

GQS:

Global Quality Score

VPI:

Video power index

References

  1. Hesse BW, Moser RP, Rutten LJ, Kreps GL. The health information national trends survey: research from the baseline. J Health Commun. 2006;11(Suppl 1):vii–xvi.


  2. Desai T, Shariff A, Dhingra V, Minhas D, Eure M, Kats M. Is content really king? An objective analysis of the public's response to medical videos on YouTube. PLoS One. 2013;8:e82469.


  3. GMI. YouTube user statistics 2022. https://www.globalmediainsight.com/blog/youtube-users-statistics/ Accessed on 10 Jan 2022.

  4. Salehrabi R, Rotstein I. Epidemiologic evaluation of the outcomes of orthograde endodontic retreatment. J Endod. 2010;36:790–2.


  5. Ozbay Y, Cirakoglu NY. YouTube as an information source for instrument separation in root canal treatment. Restor Dent Endod. 2021;46:e8.


  6. Kodonas K, Fardi A. YouTube as a source of information about pulpotomy and pulp capping: a cross sectional reliability analysis. Restor Dent Endod. 2021;46:e40.


  7. Nason K, Donnelly A, Duncan HF. YouTube as a patient-information source for root canal treatment. Int Endod J. 2016;49:1194–200.


  8. Lo AS, Esser MJ, Gordon KE. YouTube: a gauge of public perception and awareness surrounding epilepsy. Epilepsy Behav. 2010;17:541–5.


  9. Hassona Y, Taimeh D, Marahleh A, Scully C. YouTube as a source of information on mouth (oral) cancer. Oral Dis. 2016;22:202–8.


  10. Radonjic A, Fat Hing NN, Harlock J, Naji F. YouTube as a source of patient information for abdominal aortic aneurysms. J Vasc Surg. 2020;71:637–44.


  11. Kilinc DD, Sayar G. Assessment of reliability of YouTube videos on orthodontics. Turk J Orthod. 2019;32:145–50.


  12. Silberg WM, Lundberg GD, Musacchio RA. Assessing, controlling, and assuring the quality of medical information on the Internet: Caveant lector et viewor--Let the reader and viewer beware. JAMA. 1997;277:1244–5.


  13. Bernard A, Langille M, Hughes S, Rose C, Leddin D, Veldhuyzen Van Zanten S. A systematic review of patient inflammatory bowel disease information resources on the world wide web. Am J Gastroenterol. 2007;102:2070–7.


  14. Duman C. YouTube quality as a source for parent education about the oral hygiene of children. Int J Dent Hyg. 2020;18:261–7.


  15. Madathil KC, Rivera-Rodriguez AJ, Greenstein JS, Gramopadhye AK. Healthcare information on YouTube: a systematic review. Health Inform J. 2015;21:173–94.


  16. Nason GJ, Tareen F, Quinn F. Hydrocele on the web: an evaluation of internet-based information. Scand J Urol. 2013;47:152–7.


  17. Simsek H, Buyuk SK, Cetinkaya E, Tural M, Koseoglu MS. "how I whiten my teeth": YouTube as a patient information resource for teeth whitening. BMC Oral Health. 2020;20:183.


  18. Tozar KN, Yapici YG. Reliability of information on YouTube regarding pediatric dental trauma. Dent Traumatol. 2021;37:772–8.


  19. Elkarmi R, Hassona Y, Taimeh D, Scully C. YouTube as a source for parents' education on early childhood caries. Int J Paediatr Dent. 2021;27:437–43.


  20. Ustdal G, Guney AU. YouTube as a source of information about orthodontic clear aligners. Angle Orthod. 2020;90:419–24.


  21. Korkmaz YN, Buyuk SK. YouTube as a patient-information source for cleft lip and palate. Cleft Palate Craniofac J. 2020;57:327–32.


  22. Gas S, Zincir OO, Bozkurt AP. Are YouTube videos useful for patients interested in botulinum toxin for bruxism? J Oral Maxillofac Surg. 2019;77:1776–83.


  23. Erdem MN, Karace S. Evaluating the accuracy and quality of the information in kyphosis videos shared on YouTube. Spine (Phila Pa 1976). 2018;43:E1334–9.


  24. Ferhatoglu MF, Kartal A, Ekici U, Gurkan A. Evaluation of the reliability, utility, and quality of the information in sleeve gastrectomy videos shared on open access video sharing platform YouTube. Obes Surg. 2019;29:1477–84.


  25. Ng CH, Lim GRS, Fong W. Quality of English-language videos on YouTube as a source of information on systemic lupus erythematosus. Int J Rheum Dis. 2020;23:1636–44.


  26. Yagci F. Evaluation of YouTube as an information source for denture care. J Prosthet Dent. 2021;S0022-3913(21)00364–4. https://doi.org/10.1016/j.prosdent.2021.06.045. Epub ahead of print.


Acknowledgements

Not applicable.

Funding

This study was supported by Wonkwang University in 2022.

Author information

Authors and Affiliations

Authors

Contributions

JM and SM designed the study. JM gathered the information, performed statistical analysis, and evaluated the results. JM and SM wrote, reviewed, and proofread the manuscript. Both authors read and approved the final manuscript.

Corresponding author

Correspondence to Min-Seock Seo.

Ethics declarations

Ethics approval and consent to participate

This article does not contain any studies with human participants or animals performed by any of the authors. No ethical committee approval was required, since this study was performed on publicly available Internet data. For this type of study, formal consent is not required.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Jung, Mj., Seo, MS. Assessment of reliability and information quality of YouTube videos about root canal treatment after 2016. BMC Oral Health 22, 494 (2022). https://doi.org/10.1186/s12903-022-02540-4
