Abstract
Objective
Oral and pharyngeal cancers are responsible for over 7,600 deaths each year in the United States. Given the significance of the disease and the fact that many individuals increasingly rely on health information on the Internet, it is important that patients and others can access clear and accurate oral cancer information on the Web. The objective of this study was threefold: a) develop an initial method to evaluate surface and content quality of selected English- and Spanish-language oral cancer Web sites; b) conduct a pilot evaluation; and c) discuss implications of our findings for dental public health.
Methods
We developed a search strategy to find oral cancer sites frequented by the public using Medline Plus, Google, and Yahoo in English and Spanish. We adapted the Information Quality Tool (IQT) to perform a surface evaluation and developed a novel tool to evaluate site content for 24 sites each in English and Spanish.
Results
English-language sites had an average IQT score of 76.6 (out of 100) and an average content score of 52.1 (out of 100). Spanish-language sites had an average IQT score of 50.3 and an average content score of 25.6.
Conclusions
The study produced a quality assessment of oral cancer Web sites useful for clinicians and patients. Sites provided more information on clinical presentation and on etiology and risk factors than on other aspects of oral cancer. The surface and content quality of Spanish-language sites was low, possibly putting Hispanic populations at a disadvantage regarding oral cancer information on the Web.
Keywords: dental public health, information storage and retrieval, Internet, dental informatics, mouth neoplasms, humans
Introduction
The Internet is an increasingly important medium for the delivery of public health interventions (1). The high potential reach of the World Wide Web specifically offers the possibility of affecting large populations. As of 2006, 80 percent of American adult Internet users searched the Web for general health information, and 15 percent of those specifically looked for dental health information (2). While health information on the Internet appears to be largely trusted (1), the variable quality of Internet-based health information is a significant concern (3).
Health-related information in peer-reviewed journals is scrutinized by a thorough process before information is disseminated. However, most health information on the Web available to consumers and patients does not undergo such a process (4). While one can assume that content providers on the Web are trying to present high-quality information, healthcare consumers and providers must be aware of potential variations in the quality of the information offered (3,5).
Floridi (6) suggests three basic requirements for quality information: a) the information is presented in a manner free from propaganda or disinformation (Objectivity); b) the information is a complete, not a partial picture of the subject (Completeness); and c) all aspects of the information are given and are not restricted to present a particular viewpoint (Pluralism). Policy makers and advocacy groups have tried to provide universally applicable rating schemes and coding systems to allow providers and consumers to assess the quality of the information they are finding on the Web (7–9). The Health on the Net Foundation, for instance, provides mechanisms to certify and locate high-quality health information Web sites (7). The foundation provides a “Code of Conduct” for health Web sites as well as two medical search tools, MedHunt and HONselect, to help users find trustworthy information (7). However, in evaluating 47 rating instruments, Jadad and Gagliardi concluded that “[i]t is unclear … whether they should exist in the first place, whether they measure what they claim to measure, or whether they lead to more good than harm” because none of the Web site rating tools provided measurements of reliability and/or validity (10). In addition, consumers in general are unlikely to use evaluation instruments (10) or any of their criteria (11).
Quality information is exceedingly important for those searching for information about life-threatening diseases. In the United States alone, roughly 7,600 people die each year from oral and pharyngeal cancer, and more than 35,000 new cases are diagnosed annually (12). Furthermore, oral cancer is one of the most common forms of cancer in Hispanic American males (13). Given current and expected demographics of the US population, as well as the relatively static nature of oral cancer morbidity and mortality, the availability of high-quality information in both English and Spanish via the Web is important from a public health perspective. Previous studies indicate that online information pertaining to cancer can be erroneous, outdated, and complex (4,14,15).
Since many patients are unlikely to be very discriminating about the quality of health information on the Web, a logical focus for improving Web-based health information from a public health perspective is the providers of these sites. We need to develop processes for assessing site quality and, where appropriate, provide information to help providers improve their Web sites.
Therefore, the objective of this study was threefold. We first developed an initial approach to evaluate surface quality and content of selected English- and Spanish-language oral cancer Web sites most likely frequented by consumers. Second, we conducted a pilot evaluation using our method. Last, we identified potential dental public health implications of our findings.
Methods
Identification of Web sites
Search strategy
We designed our search strategy to model how consumers search for oral cancer information on the Web. Consumers, in general, limit their searches to general-purpose search engines instead of specific health-related ones (2,11). Our search strategy was designed to find oral cancer Web sites for English- and Spanish-speaking consumers, employing search engines typically used by these groups. It is important to note that we did not search for identical Web sites in both English and Spanish but completed two different searches that yielded sites most likely to be used by the respective language group.
Search in MedlinePlus
We began our search for oral cancer Web sites with the MedlinePlus portal. MedlinePlus is a Web site produced by the National Library of Medicine, designed to direct consumers to valid and high-quality health information. It provides links to government and nonprofit organizations' Web sites which contain information on hundreds of health topics in both English and Spanish (16). In 2008, MedlinePlus reported having over 130 million unique visitors (16), with 26–45 percent of those being consumers, 11–17 percent providers, and the rest researchers (17). Based on these data, we considered MedlinePlus an appropriate starting point for identifying oral cancer Web sites. We reviewed the English “oral cancer” page on MedlinePlus and the “cáncer oral” page on MedlinePlus en español. We explored every link found on each page and added any sites that provided information on oral cancer to our dataset.
Search on Google and Yahoo
Next, we supplemented our dataset with oral cancer sites found using Google and Yahoo, search engines that consumers often use first to find health information on the Web (2,11). Attempting to emulate a consumer search, we chose the keywords “oral cancer,” “mouth cancer,” and “tongue cancer” to obtain additional English-language sites. For locating Spanish-language sites, a native Spanish-speaking dentist translated our search terms into “cancer oral,” “cancer de la boca,” and “cancer de la lengua,” and we searched Google español and Yahoo Telemundo. We reviewed the resulting Web sites from the first and second result pages (using default Google settings for displaying 10 hits per page) because most Web searchers typically only review the top-ranked results (11). Using the inclusion/exclusion criteria described in the next section, we added any Web site providing information on oral cancer that did not duplicate a site already in our dataset.
Inclusion/exclusion criteria
Our goal was to evaluate Web sites intended to provide general information about oral cancer to the public. We defined those sites as Web pages that stated this goal explicitly (for instance, in sections such as “About this site”), presented information intended to educate individuals about oral cancer (such as “Causes of oral cancer” and “Treatment”), or implied a focus on consumers/patients (e.g., “I have oral cancer. What now?”). We excluded sites that were targeted at professionals, as evident from the language (such as complex medical terminology) and/or content (such as lists or discussions of the research literature on oral cancer); sites with a purely commercial focus, such as those selling products as their primary function; and sites with very limited information about oral cancer (e.g., treatment only). The authors, supported by a Spanish-speaking dentist (for Spanish-language sites) and two health science librarians (native speakers of English and Spanish, respectively), performed an initial review of the sites with regard to the inclusion/exclusion criteria and also included any major consumer oral cancer sites that they considered missing.
The search strategy for English-language Web sites produced 28 unique sites by combining the results from the MedlinePlus, Google, and Yahoo searches. The reviewers removed five sites; the librarian added one site that she felt should be reviewed. The remaining 24 English-language oral cancer sites constituted the final English dataset for the evaluation (see Table 1).
Table 1.
Average Surface Evaluation (IQT) Scores for English-Language Web sites. Scale is 0-100, with 100 being the Highest Possible Score. Sites are Sorted by Highest IQT Score. Apparent Calculation Errors are due to Rounding
* Weighted.
IQT, Information Quality Tool.
The search for Spanish-language sites produced 30 unique Web sites. The review resulted in the removal of five sites. During the site surface evaluation, one of the Spanish-language sites was available only intermittently which made it difficult to evaluate. Therefore, it was removed from the dataset. The remaining 24 Spanish-language oral cancer sites constituted the final Spanish dataset for the evaluation (see Table 2).
Table 2.
Average Surface Evaluation (IQT) Scores for Spanish-Language Web sites. Scale is 0-100, with 100 being the Highest Possible Score. Sites are Sorted by Highest IQT Score. Apparent Calculation Errors are due to Rounding
* Weighted.
IQT, Information Quality Tool.
Surface evaluation of sites
Assessment metric
In order to evaluate the surface features, including design and usability of each Web site, we used a modified version of the Information Quality Tool (IQT) developed by the Health Summit Working Group (18,19). The IQT is designed to assess sites according to the following four criteria: a) Disclosure: What was the purpose and intent of the producers of the sites?; b) Links: Were the links provided on the sites current and working?; c) Design: Was the site navigable and organized?; d) Perceived agenda: Did the site market services and products which can influence its agenda in providing health information? (18,19).
The original IQT contained 21 questions. We modified it for this study to establish an objective surface evaluation as follows: Four of the questions that dealt with currency and accuracy of information were removed because we considered these criteria part of the content evaluation and more appropriate for the experts to assess (see section on content evaluation of sites). Next, rather than asking raters to check every link on each site, we processed each site with the W3C Link Checker (20). The W3C Link Checker is a tool designed by the World Wide Web Consortium which processes sites to check for issues with links, anchors, and referenced objects. Based on the number of errors found by the link checker, raters answered the question yes or no. Instead of expecting the raters to make subjective decisions about navigation and content organization, we asked them to process the site with the Watchfire WebXACT Bobby tool (21). At the time of the study, the Watchfire WebXACT Bobby tool, provided by the Center for Applied Special Technology, could be used to validate Web sites based on Web Accessibility Initiative guidelines (21). Based on the Watchfire score, the rater answered the question yes or no (the tool has since been acquired by IBM and has been significantly changed). Last, the question “Does the search engine assist you in using the site?” was reframed to “Does the site have a site map?” This change was made to match the objectivity of the other IQT questions, which ask the rater to simply determine if items were present or absent. The final tool contained 17 yes/no questions. A copy of the modified IQT tool can be found at http://www.dentalinformatics.org/tools/oralcancer/.
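The link and accessibility checks themselves relied on the W3C Link Checker and Watchfire WebXACT Bobby, which are not reproduced here. Purely as an illustration of the underlying idea, the sketch below (assuming Python with the requests and beautifulsoup4 packages, and an arbitrary, hypothetical error threshold) shows how a broken-link count could be converted into the yes/no answer used in the modified IQT.

```python
# Illustrative sketch only (not the W3C Link Checker): count broken links on a page
# and turn the count into a yes/no rating. Packages and threshold are assumptions.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin


def broken_link_count(page_url: str, timeout: float = 10.0) -> int:
    """Fetch a page, resolve its anchors, and count links that fail or return >= 400."""
    html = requests.get(page_url, timeout=timeout).text
    links = {urljoin(page_url, a["href"])
             for a in BeautifulSoup(html, "html.parser").find_all("a", href=True)}
    broken = 0
    for link in links:
        if not link.startswith(("http://", "https://")):
            continue  # skip mailto:, javascript:, fragment-only links, etc.
        try:
            status = requests.head(link, allow_redirects=True, timeout=timeout).status_code
        except requests.RequestException:
            status = None
        if status is None or status >= 400:
            broken += 1
    return broken


def links_question_answer(page_url: str, max_errors: int = 0) -> str:
    """Answer an IQT-style 'are the links working?' question (threshold is hypothetical)."""
    return "yes" if broken_link_count(page_url) <= max_errors else "no"
```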
Raters
Because the IQT is a general Web site assessment tool that does not require dental knowledge, we asked two English-speaking undergraduate students to rate the English-language sites, and two Spanish-speaking dental students to rate the Spanish-language sites. The raters were native speakers and received reimbursement for their efforts. None of the raters had any prior experience using the IQT tool or any specialized knowledge to assist them with the task. We trained each rater on how to answer the modified IQT questions.
Evaluation procedure
The raters reviewed each site individually and the IQT score was automatically generated from their answers. Cronbach's alpha and the intraclass correlation coefficient [at a 95 percent confidence interval (CI)] were calculated to measure inter-rater reliability.
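For readers who want to reproduce this kind of reliability analysis, the following sketch shows one way to compute Cronbach's alpha and the intraclass correlation coefficient; it assumes hypothetical per-site scores from two raters in a pandas DataFrame and uses the pingouin statistics package, which was not part of the original study.

```python
# Minimal sketch of the inter-rater reliability statistics; data are hypothetical.
import pandas as pd
import pingouin as pg

# One row per site, one column per rater (0-100 scores).
wide = pd.DataFrame({
    "rater_1": [76, 82, 55, 90, 64],
    "rater_2": [71, 85, 60, 88, 70],
})

# Cronbach's alpha with its 95% confidence interval.
alpha, alpha_ci = pg.cronbach_alpha(data=wide)

# The ICC routine expects long format: one row per (site, rater) pair.
long = (wide.rename_axis("site").reset_index()
            .melt(id_vars="site", var_name="rater", value_name="score"))
icc = pg.intraclass_corr(data=long, targets="site", raters="rater", ratings="score")

print(f"Cronbach's alpha = {alpha:.2f}, 95% CI = {alpha_ci}")
print(icc[["Type", "ICC", "CI95%"]])
```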
Content evaluation of sites
Assessment metric
To assess content, we created a survey tool for oral cancer content de novo since no appropriate tool existed. We used an approach similar to that taken by other studies that have assessed cancer Web site content (4,15,22), and reviewed cancer-specific content rating tools to the degree they were available to us, such as reference (22). After reviewing two textbooks on oral cancer (23,24), we drafted a general outline of information categories which we subsequently developed into a pilot assessment tool. Three oral cancer experts reviewed, critiqued, and suggested improvements for the instrument. None of these experts were involved in the subsequent evaluation of the sites. Once the questionnaire was finalized, a native Spanish-speaking oral cancer expert translated it for use by Spanish-speaking experts. The final questionnaire contained 14 information categories: Epidemiology, Etiology and risk factors, Clinical pathology, Clinical presentation, Images, Diagnosis, Treatment, Rehabilitation, Prevention, Nutrition during treatment, Follow-up care after treatment, Clinical trial information, Support groups, and References to the literature.
For each category in the survey, we assessed a) presence (yes/no); b) coverage; c) correctness (both 4-point Likert scale: good, fair, poor, N/A); and d) currency (yes/no). Each answer choice carried a number of points (yes = 1, no = 0, good = 3, fair = 2, poor = 1, N/A = 0) which were totaled for each category and site, resulting in a score range of 0–293. We subsequently normalized scores to be between zero and 100. An online version of the tool can be found at http://www.dentalinformatics.org/tools/oralcancer/.
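As a worked illustration of this scoring scheme, the short sketch below totals the category points and rescales them to 0-100. The data structure and the two-category example are hypothetical; the point mapping follows the text, and the 293-point maximum from the text is used only for normalization.

```python
# Illustrative scoring sketch; input format is an assumption, not the study's actual form.
POINTS = {"yes": 1, "no": 0, "good": 3, "fair": 2, "poor": 1, "n/a": 0}
MAX_RAW = 293  # maximum attainable raw score across all categories, per the text


def category_points(rating: dict) -> int:
    """Sum the points for presence, coverage, correctness, and currency in one category."""
    return sum(POINTS[rating[item].lower()]
               for item in ("presence", "coverage", "correctness", "currency"))


def normalized_content_score(site_ratings: list[dict]) -> float:
    """Total the category points for a site and rescale to 0-100."""
    raw = sum(category_points(r) for r in site_ratings)
    return 100 * raw / MAX_RAW


# Hypothetical usage for a site rated on two of the 14 categories.
example = [
    {"presence": "yes", "coverage": "good", "correctness": "fair", "currency": "yes"},
    {"presence": "yes", "coverage": "poor", "correctness": "poor", "currency": "no"},
]
print(round(normalized_content_score(example), 1))
```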
Raters
Since the goal of our study was to conduct an initial pilot test, not a full-scale measurement study, of the content evaluation tool, we used two raters, as similar studies have done (4,22). Based on our results, we plan to conduct an expanded study with a larger number of raters that includes a formal assessment of reliability and validity of the tool (25). Two English-speaking and two native Spanish-speaking oral cancer experts performed the content evaluation. Both Spanish-speaking experts are dentists who currently hold faculty positions specializing in oral cancer research at US dental schools. One English expert is a dentist and physician and is a professor of oral and maxillofacial surgery, while the other is a dentist directing a division of oral medicine. None of the expert raters had participated in any previous aspect of the study. All experts were reimbursed for their efforts.
Evaluation procedure
Each rater received the content evaluation instrument via email with written instructions and was asked to assess each site in their set, either English or Spanish, individually. Cronbach's alpha and the intraclass correlation coefficient (at a 95 percent CI) were calculated to measure inter-rater reliability. Lastly, we performed a Pearson correlation to compare the English surface evaluation scores with the English content scores. We performed the same analysis with the Spanish scores. While we present the results of evaluating sites in both languages side by side, our intent was not to directly compare the characteristics of English- and Spanish-language sites to each other. Rather, readers should consider language-specific results separately.
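The surface-versus-content comparison is a standard Pearson correlation; a minimal sketch, assuming two aligned lists of hypothetical per-site scores and using scipy.stats.pearsonr, is shown below.

```python
# Minimal sketch of the Pearson correlation between surface (IQT) and content scores.
from scipy.stats import pearsonr

# Hypothetical scores for the same sites, in the same order.
surface_scores = [76, 82, 55, 90, 64]
content_scores = [52, 40, 61, 48, 70]

r, p = pearsonr(surface_scores, content_scores)
print(f"r = {r:.2f}, P = {p:.3f}")
```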
Results
Surface evaluation
The English-language Web sites averaged 588 individual Web pages per site, containing an average of 1,437 links per site; Spanish-language sites had an average of 273 individual Web pages per site with an average of 169 links per site. The English-language sites had an average IQT score of 76.6 (of a maximum of 100) with inter-rater reliabilities of 0.87 (Cronbach's alpha) and 0.77 (intraclass coefficient, 95 percent CI, n = 24). The English-language site scores are listed in Table 1. The Spanish-language sites had an average IQT score of 50.3 with a Cronbach's alpha of 0.95 and a 0.91 intraclass coefficient at 95 percent CI, n = 24. The Spanish-language site scores are listed in Table 2. Scores for English-language sites ranged from 25 to 100 and for Spanish-language sites from 15 to 98. Furthermore, 79 percent of the English-language and 38 percent of the Spanish-language sites had a score of 60 or higher. Disclosure and Design had the highest impact on the total score when the IQT scores were broken down into four parts: Disclosure constituted 42 percent of the score, Design 30 percent, Agenda 21 percent, and Links 7 percent. Of the four categories, the English-language sites achieved the highest average scores in the Design category and the lowest in the Agenda category. The Spanish-language sites achieved the highest average scores in the Links category and the lowest in the Agenda category.
Six organizations offered sites which appeared in both the English- and Spanish-language datasets, but in only one case, the National Library of Medicine's MedlinePlus, was the Spanish material a direct translation of the English site. IQT scores for the English and Spanish versions were almost identical (100 and 98, respectively), suggesting a reliable measurement.
Content evaluation
We removed six Spanish-language sites from the content assessment because they were only available intermittently during the evaluation period, making it difficult for the raters to access them. In addition, we omitted one English-language site and three Spanish-language sites because the scores between raters were highly divergent. Therefore, we report content evaluation results for 23 English-language and 15 Spanish-language sites.
English-language site scores are shown in Table 3. The content score ranged from 11 to 94 and averaged 52.1 (of a maximum of 100), with inter-rater reliabilities of 0.80 (Cronbach's alpha) and 0.67 (intraclass coefficient, 95 percent CI, n = 23). Sixty-one percent of the sites had a score of 50 or higher. As described in the methods section, the content score was calculated from subscores for coverage, correctness, and currency, with averages of 46.9 percent, 56.3 percent, and 60.3 percent, respectively. While no site provided complete coverage of all oral cancer topics, the top five sites scored between 73 percent and 89 percent. A qualitative assessment of the subscores shows that high coverage scores were typically associated with high correctness and currency scores. However, we did not test this finding statistically.
Table 3.
Average Content Evaluation Scores for English-language Web sites. Scores were Normalized to 0-100. Sites are Sorted by Highest Content Score. Apparent Calculation Errors are due to Rounding
| Organization | Content score | % Coverage* | % Correctness* | % Currency* |
|---|---|---|---|---|
| The Ohio State University Medical Center | 94 | 89 | 99 | 99 |
| National Cancer Institute | 90 | 82 | 98 | 99 |
| American Cancer Society | 86 | 80 | 93 | 96 |
| New York Online Access to Health | 82 | 73 | 91 | 93 |
| MedicineNet.com | 81 | 73 | 89 | 90 |
| The Cancer Information Network | 78 | 70 | 84 | 90 |
| National Library of Medicine | 77 | 68 | 85 | 90 |
| University of Maryland Medical Center | 67 | 60 | 73 | 81 |
| Tongue Cancer.com | 63 | 58 | 68 | 74 |
| MayoClinic.com | 61 | 52 | 67 | 76 |
| CDC | 56 | 53 | 59 | 63 |
| British Dental Health Foundation | 54 | 48 | 55 | 79 |
| Oral Cancer Foundation | 54 | 53 | 57 | 53 |
| Cleveland Clinic | 53 | 48 | 58 | 60 |
| University of Michigan Comprehensive Cancer Center | 38 | 29 | 47 | 50 |
| Mouth Cancer Foundation | 33 | 28 | 37 | 39 |
| Cancer Treatment Centers of America | 29 | 22 | 34 | 43 |
| Floss.com | 24 | 25 | 22 | 26 |
| UConn Health Center | 21 | 13 | 28 | 31 |
| Cancer Research UK | 19 | 18 | 20 | 24 |
| American Dental Association | 12 | 14 | 9 | 13 |
| Caring Medical & Rehabilitation Services | 11 | 13 | 11 | 7 |
| Brigham and Women's Hospital | 11 | 11 | 11 | 14 |
| Averages | 52.1 | 46.9 | 56.3 | 60.3 |
* Weighted.
Spanish-language individual site scores can be found in Table 4. Scores ranged from 10 to 56 and averaged 25.6 (of a maximum of 100), with a Cronbach's alpha of 0.79 and a 0.63 intraclass coefficient at 95 percent CI, n = 15. Thirteen percent of the Spanish-language sites had a score of 50 or higher. Average subscores for coverage, correctness, and currency were 28.9 percent, 23.0 percent, and 20.7 percent, respectively. The top five sites scored between 39 percent and 64 percent in their content coverage. The same qualitative association of coverage with correctness and currency scores as seen with the English-language sites was evident.
Table 4.
Average Content Scores for Spanish-language Web sites. Scores were Normalized to 0-100. Sites are Sorted by Highest Content Score. Apparent Calculation Errors are due to Rounding
| Organization | Content score | % Coverage* | % Correctness* | % Currency* |
|---|---|---|---|---|
| American Society of Clinical Oncology | 56 | 64 | 49 | 43 |
| New York Online Access to Health | 54 | 52 | 55 | 54 |
| University of Utah | 40 | 46 | 38 | 26 |
| University of Virginia Health System | 36 | 43 | 29 | 29 |
| Elmundo.es | 31 | 39 | 26 | 15 |
| National Cancer Institute | 31 | 36 | 28 | 22 |
| NIDCR | 26 | 22 | 34 | 18 |
| Fisterra.com | 24 | 22 | 26 | 22 |
| American Cancer Society | 17 | 17 | 17 | 18 |
| Odontologia-online | 14 | 17 | 10 | 14 |
| Oral Cancer Consortium | 12 | 14 | 11 | 10 |
| Webodontologica.com | 12 | 16 | 8 | 4 |
| National Coalition for Cancer Survivorship | 10 | 16 | 5 | 4 |
| University of Texas Anderson Cancer Center | 10 | 11 | 10 | 10 |
| American Dental Association | 10 | 20 | 0 | 43 |
| Averages | 25.6 | 28.9 | 23.0 | 20.7 |
* Weighted.
Direct statistical comparisons between the English-language and Spanish-language content scores were, in our opinion, not meaningful because of the early stage of development of the instrument and rating process. However, it is useful to break down the content scores by category and make some qualitative assessments (see Figures 1 and 2).
Figure 1.
Average content category scores for English-language Web sites, for the five sites with the highest overall content score, and for the five sites with the lowest overall content score. Scores were normalized to 0-100 and sorted by average scores.
Figure 2.
Average content category scores for Spanish-language Web sites, for the five sites with the highest overall content score, and for the five sites with the lowest overall content score. Scores were normalized to 0-100 and sorted by average scores.
Average content category scores for English-language sites ranged from 6 to 79. Two categories, Clinical presentation and Etiology and risk factors, scored well above the average content score of 52.1. Most of the remaining content categories (all except Images) fell into a relatively narrow score range well below the average score, from 18 to 30. The proportional distance of scores for the top five and bottom five sites to the average score tended to increase as category scores decreased. For instance, while top five and bottom five scores for the two highest-scoring categories cluster relatively closely around the average, categories such as Follow-up care after treatment, Treatment, Support groups, Nutrition during treatment, Information on clinical trials, and Rehabilitation show large distances to the average score.
For Spanish-language sites, average content category scores ranged from 4 to 50. Clinical presentation and Etiology and risk factors, as in the English-language sites, scored highest. A similar pattern regarding the drop-off as well as the distribution of scores for other categories could be observed. Information categories with significant divergence of top five and bottom five site scores from the average included Support groups, Nutrition during treatment, Information on clinical trials, Rehabilitation, and Images.
An analysis using Pearson's correlation coefficient showed no significant correlation between the surface evaluation and content scores of English-language sites, r(23) = −0.25, P = 0.651. The same was true for the Spanish-language sites, r(15) = 0.069, P = 0.807.
Discussion
In this study, we developed an initial approach to evaluate surface quality and content of English- and Spanish-language Web sites about oral cancer, identified sites most likely frequented by consumers and patients, and conducted a pilot evaluation using our method. We now turn to our third objective, discussing the public health implications of our study.
Several studies (26–28) suggest that the Internet can be a useful source of health information, and assist patients and providers with clinical decision making. Because the Internet reaches a large part of the population today and individuals increasingly turn to it for health information, it plays an increasingly important role in public health (1). Several studies have found significant variation with regard to information on the Web about cancer (4,22) and other diseases, such as cleft lip and palate (29). Our study is no exception.
Several findings in our study are relevant to dental public health. First, we have produced a current in-depth quality assessment of major oral cancer Web sites that can guide patients and providers to useful information. Clinicians could recommend the sites that scored high on surface and/or content to patients and other individuals looking for oral cancer information. Moreover, such “information therapy” (30) could also extend to selected aspects of oral cancer, such as treatment, for which we produced subscores. Detailed ratings for all reviewed sites are available at http://www.dentalinformatics.org/tools/oralcancer to help clinicians and patients identify appropriate sites for their information needs.
Second, oral cancer Web sites appear to focus their content primarily on the clinical presentation and on the etiology and risk factors of the disease. Other aspects, such as prevention, treatment, and rehabilitation, receive relatively little attention, and variability of information coverage is quite high among sites. As a result, the sites may serve patients' information needs relatively well regarding what causes the disease and how it manifests itself, but not as well regarding other aspects. This may leave patients with information deficits that must be remedied by other means. A potential reason for the imbalance in content presentation may be the absence or dearth of basic information or research on various aspects of the disease. This hypothesis may be supported by our finding that few sites provided references to the literature [similar findings regarding references were made by other studies (4,22)]. Given that our content evaluation template was derived from two authoritative textbooks on oral cancer and validated by three oral cancer experts, it may be worthwhile to review – from a policy perspective – where deficits in research on oral cancer exist.
Third, while we cannot compare the scores for English- and Spanish-language sites in a statistically rigorous fashion, a qualitative analysis suggests that the Web serves English speakers much better than Spanish speakers regarding oral cancer information. Both surface and content scores for English-language Web sites were consistently and substantially higher than those for Spanish-language ones. In addition, we had to remove 25 percent of the Spanish-language sites from the content evaluation because they were only available intermittently. Thus, the Web may put Hispanic populations, in which males already suffer from higher oral cancer rates than whites (31), at a disadvantage regarding oral cancer information, possibly aggravating existing health disparities (32).
Fourth, even though we did not find any significant correlation between the two aspects of the evaluation, it is interesting to note that sites that scored high on surface quality did not necessarily score high on content and vice versa. For both languages, the top five sites in content and the top five sites in surface quality had no overlap. In fact, for both English- and Spanish-language sites, at least two sites in each of the top five lists (content and surface quality) appeared in the bottom half of the other. This suggests that good design of a site does not necessarily indicate quality content, or vice versa.
A last finding relevant to dental public health was that not many sites provided information on clinical trials. Given the challenges of enrolling representative groups of patients in cancer clinical trials (33), the dearth of information about clinical trials on oral cancer Web sites could be a concern.
This study has several limitations. First, we conducted only a preliminary evaluation using the initial version of a novel assessment tool with a limited number of raters. While our study showed that the tool can be used to evaluate oral cancer Web sites, no statements about the reliability, validity, or generalizability of the results can be made. Second, our study provides only a snapshot in time of information represented in a rapidly changing medium. We expect that changes to the Web sites that we evaluated would already alter some of our findings today. Third, cultural differences between English- and Spanish-language groups may influence the way information is presented on Web sites. Therefore, a tool designed to evaluate English-language Web site quality may bias results toward English cultural norms. Last, our pilot content evaluation tool covered many topics, with a high score only attainable by covering most or all topics. However, certain sites may not have been focused on covering oral cancer topics comprehensively, and thus scored lower. To alleviate misunderstandings, site developers should clearly indicate the goals of their site.
In future research, we plan to refine our tool through additional studies and conduct a formal assessment of reliability and validity. We hope that this study sparks interest among other researchers to evaluate, extend, and enhance our tool, and to address quality issues of oral cancer information on the Web. In addition, we will develop strategies to communicate our findings effectively to site developers and to integrate evaluation approaches such as ours into a continuous quality improvement cycle for oral health information on the Web.
Acknowledgments
This project was supported in part by grant 5T15LM007059-19 from the National Library of Medicine/National Institute of Dental and Craniofacial Research, grant 1KL2RR024154-03 from the National Center for Research Resources, and grant 1U54DE14257-01 from the National Institute of Dental and Craniofacial Research. We thank those who generously donated their time to assist in the testing of our content evaluation tool; Humberto Torres-Urquidy, Rebecca Crowley, and Ralph Katz for their input during the writing of the manuscript; and Michael Dziabiak for his help with formatting.
Footnotes
Conflict of interest: none.
References
- 1. Bennett GG, Glasgow RE. The delivery of public health interventions via the Internet: actualizing their potential. Annu Rev Public Health. 2009;30:273-92. doi: 10.1146/annurev.publhealth.031308.100235.
- 2. Fox S, Rainie L. The online health care revolution: how the web helps Americans take better care of themselves [Online]. 2000 [updated 2000 Nov 26; cited 2009 Apr 20]. Available from: http://www.pewinternet.org/reports/pdfs/PIP_Health_Report.pdf.
- 3. Eysenbach G, Powell J, Kuss O, Sa ER. Empirical studies assessing the quality of health information for consumers on the world wide web: a systematic review. JAMA. 2002;287(20):2691-700. doi: 10.1001/jama.287.20.2691.
- 4. Biermann JS, Golladay GJ, Greenfield ML, Baker LH. Evaluation of cancer information on the Internet. Cancer. 1999;86(3):381-90.
- 5. Eysenbach G, Diepgen TL. Towards quality management of medical information on the internet: evaluation, labelling, and filtering of information. BMJ. 1998;317(7171):1496-500. doi: 10.1136/bmj.317.7171.1496.
- 6. Floridi L. Brave.Net.World: the Internet as a disinformation superhighway? Electron Libr. 1996;14(6):509-14.
- 7. Health on the Net Foundation. Health on the net code of conduct (HONcode) [Online]. 1998 [updated 1998 Jul 15; cited 2009 Apr 20]. Available from: http://www.hon.ch/Conduct.html.
- 8. Silberg WM, Lundberg GD, Musacchio RA. Assessing, controlling, and assuring the quality of medical information on the Internet: caveant lector et viewor – Let the reader and viewer beware. JAMA. 1997;277(15):1244-5.
- 9. MedicalMatrix LLC. MedicalMatrix: ranked, peer-reviewed, annotated, updated clinical medicine resources [Online]. 2006 [cited 2009 Apr 20]. Available from: http://www.medmatrix.org. Archived at: http://www.webcitation.org/5gwCMoByW.
- 10. Jadad AR, Gagliardi A. Rating health information on the Internet: navigating to knowledge or to Babel? JAMA. 1998;279(8):611-14. doi: 10.1001/jama.279.8.611.
- 11. Eysenbach G, Kohler C. How do consumers search for and appraise health information on the world wide web? Qualitative study using focus groups, usability tests, and in-depth interviews. BMJ. 2002;324(7337):573-7. doi: 10.1136/bmj.324.7337.573.
- 12. Jemal A, Siegel R, Ward E, Hao Y, Xu J, Thun MJ. Cancer statistics, 2009. CA Cancer J Clin. 2009;59(4):225-49. doi: 10.3322/caac.20006.
- 13. Howe HL, Wu X, Ries LA, Cokkinides V, Ahmed F, Jemal A, Miller B, Williams M, Ward E, Wingo PA, Ramirez A, Edwards BK. Annual report to the nation on the status of cancer, 1975-2003, featuring cancer among U.S. Hispanic/Latino populations. Cancer. 2006;107(8):1711-42. doi: 10.1002/cncr.22193.
- 14. Birru MS, Monaco VM, Charles L, Drew H, Njie V, Bierria T, Detlefsen E, Steinman RA. Internet usage by low-literacy adults seeking health information: an observational analysis. J Med Internet Res. 2004;6(3):e25. doi: 10.2196/jmir.6.3.e25.
- 15. Leithner A, Maurer-Ertl W, Glehr M, Friesenbichler J, Leithner K, Windhager R. Wikipedia and osteosarcoma: a trustworthy patients' information? J Am Med Inform Assoc. 2010;17(4):373-4. doi: 10.1136/jamia.2010.004507.
- 16. MedlinePlus. About MedlinePlus. National Library of Medicine. 2008 [cited 2009 Apr 20]. Available from: http://www.nlm.nih.gov/medlineplus/aboutmedlineplus.html.
- 17. MedlinePlus. MedlinePlus survey results 2005. National Library of Medicine. 2006 [cited 2009 Apr 20]. Available from: http://www.nlm.nih.gov/medlineplus/survey2005/index.html.
- 18. Mitretek Systems. Information quality tool. 2001 [cited 2009 Apr 20]. Available from: http://hitiweb.mitretek.org/iq/default.asp. Archived at: http://www.webcitation.org/5BgjEOh30.
- 19. Assessing the quality of Internet health information: summary. AHRQ: Agency for Healthcare Research and Quality. 1999 [cited 2009 Apr 20]. Available from: http://www.ahrq.gov/data/infoqual.htm. Archived at: http://www.webcitation.org/5gwGCiYgO.
- 20. W3C Link Checker. World Wide Web Consortium (W3C). 2009 [cited 2009 Apr 20]. Available from: http://validator.w3.org/checklink. Archived at: http://www.webcitation.org/5gwGPZ8fA.
- 21. CAST, Inc. Bobby. 2009 [cited 2009 Apr 20]. Available from: http://www.cast.org/products/Bobby/index.html. Archived at: http://www.webcitation.org/5gwKx046a.
- 22. Black PC, Penson DF. Prostate cancer on the Internet – information or misinformation? J Urol. 2006;175(5):1836-42; discussion 42. doi: 10.1016/S0022-5347(05)00996-1.
- 23. Ord R, Blanchaert R, editors. Oral cancer: the dentist's role in diagnosis, management, rehabilitation, and prevention. Chicago: Quintessence; 2000.
- 24. American Cancer Society. Oral cancer. 4th ed. Hamilton, ON: BC Decker; 1998.
- 25. Friedman CP, Wyatt JC, Shortliffe EH. Evaluation methods in medical informatics. New York, NY: Springer; 2006.
- 26. Ream E, Blows E, Scanlon K, Richardson A. An investigation of the quality of breast cancer information provided on the internet by voluntary organisations in Great Britain. Patient Educ Couns. 2009;76(1):10-15. doi: 10.1016/j.pec.2008.11.019.
- 27. Wald HS, Dube CE, Anthony DC. Untangling the Web – the impact of Internet use on health care and the physician–patient relationship. Patient Educ Couns. 2007;68(3):218-24. doi: 10.1016/j.pec.2007.05.016.
- 28. Winker MA, Flanagin A, Chi-Lum B, White J, Andrews K, Kennett RL, DeAngelis CD, Musacchio RA. Guidelines for medical and health information sites on the internet: principles governing AMA web sites. American Medical Association. JAMA. 2000;283(12):1600-6. doi: 10.1001/jama.283.12.1600.
- 29. Antonarakis GS, Kiliaridis S. Internet-derived information on cleft lip and palate for families with affected children. Cleft Palate Craniofac J. 2009;46(1):75-80. doi: 10.1597/07-206.1.
- 30. Mitchell DJ. Toward a definition of information therapy. Proc Annu Symp Comput Appl Med Care. 1994:71-5.
- 31. Oral cancer incidence (new cases) by age, race, and gender. National Institute of Dental and Craniofacial Research; 2010 [cited 2009 Apr 20]. Available from: http://www.nidcr.nih.gov/DataStatistics/FindDataByTopic/OralCancer/OralCancerIncidence.htm.
- 32. Kressin NR. Racial/ethnic disparities in health care: lessons from medicine for dentistry. J Dent Educ. 2005;69(9):998-1002.
- 33. Murthy VH, Krumholz HM, Gross CP. Participation in cancer clinical trials: race-, sex-, and age-based disparities. JAMA. 2004;291(22):2720-6. doi: 10.1001/jama.291.22.2720.


