Author manuscript; available in PMC 2016 Jun 1. Published in final edited form as: J Dent Educ. 2015 Jun;79(6):686–696.

Does Use of an Electronic Health Record with Dental Diagnostic System Terminology Promote Dental Students’ Critical Thinking?

Susan G Reed 1, Shawn S Adibi 2, Mullen Coover 3, Robert G Gellin 4, Amy E Wahlquist 5, Anitha AbdulRahiman 6, Lindsey H Hamil 7, Muhammad F Walji 8, Paula O’Neill 9, Elsbeth Kalenderian 10
PMCID: PMC4593405  NIHMSID: NIHMS723440  PMID: 26034034

Abstract

The Consortium for Oral Health Research and Informatics (COHRI) is leading the way in use of the Dental Diagnostic System (DDS) terminology in the axiUm electronic health record (EHR). This collaborative pilot study had two aims: 1) to investigate whether use of the DDS terms positively impacted predoctoral dental students’ critical thinking skills as measured by the Health Sciences Reasoning Test (HSRT), and 2) to refine study protocols. The study design was a natural experiment with cross-sectional data collection using the HSRT for the Classes of 2013 through 2017 (15 class cohorts) at three dental schools. Characteristics of students who had been exposed to the DDS terms were compared with those of students who had not, and the differences were tested by t-tests or chi-square tests. Generalized linear models were used to evaluate the relationship between exposure and the overall critical thinking score. The results showed that exposure was significantly related to overall score (p=0.01), with not-exposed students having lower mean overall scores. This study thus demonstrated a positive impact of using the DDS terminology in an EHR on the critical thinking skills of predoctoral dental students in three COHRI schools, as measured by their overall scores on the HSRT. These preliminary findings support future research to further evaluate a proposed model of critical thinking in clinical dentistry.

Keywords: dental education, clinical education, electronic health record, diagnostic terminology, critical thinking


This report describes the findings from a collaborative pilot study on the impact of using the Dental Diagnostic System (DDS) terminology (formerly known as EZCodes) in an electronic health record on the critical thinking skills of predoctoral dental students. There are many benefits to using a diagnostic terminology, including facilitating epidemiologic research, communication, data sharing across providers and facilities, development of clinical outcomes measures, and development of diagnostic skills for students and faculty members.1–5 The investigators and dental students in this study were from universities affiliated with the Consortium for Oral Health Research and Informatics (COHRI).6 The academic dental institutions of COHRI share a common interest in and use of the axiUm electronic health record (axiUm, Exan Corporation, Vancouver, Canada). In addition to serving as the electronic health record (EHR) for many dental school patients, this EHR comprises modules that incorporate and facilitate clinical training for dental students via a series of interactive computer screens and electronic templates.

Recently, and for the first time in dentistry, a dental diagnostic terminology was developed and incorporated into the axiUm EHR as DDS terms.3,7,8 COHRI is leading the way for introduction of the DDS terminology into the treatment planning module of the axiUm EHR. In 2012, three dental schools were using the DDS terminology, and additional schools were preparing to introduce it to their students. Because of this evolving natural experiment of some schools using and some planning to use the DDS terms, we had the opportunity to pilot a study of their impact on students.

In spring of 2013, when this study was conducted, each of the three dental schools had up to three years of experience using the DDS terms, and each class had either no experience or some experience with using them. One aim of this pilot study was to investigate whether use of the EHR with the DDS terms positively impacted dental students’ critical thinking. We sought to determine this impact by comparing critical thinking scores on the Health Sciences Reasoning Test between dental students from classes exposed to the DDS terms and those with no exposure. We defined students not exposed to the DDS terms as those with minimal exposure to diagnosis, differential diagnosis, and assignment of diagnosis as the steps preceding preparation of treatment plan options. Our hypothesis was that the dental students who were exposed to using the DDS terms would have a higher average overall score than the dental students who were not exposed. A second major aim of the study was to use the experience to refine our measures and protocols for further study of the DDS terminology and the critical thinking skills of dental students.

DDS Terms and Critical Thinking

Though dental EHR development is ongoing, the construction and incorporation of DDS terms into the treatment planning process filled in the vital missing piece of diagnosis, which had not previously been a documented part of dental education and practice. Technically, the DDS terms were added to the Treatment Plans tab of the axiUm EHR, which enables students to select a diagnosis and provide documentation of that diagnosis. Use of the DDS terms highlights the use of diagnostic terminology as an integral step prior to treatment planning. Moreover, the DDS terms are an interface terminology,9 giving the user access to a practical, user-friendly terminology that is granular (e.g., it contains recurrent caries) as well as flexible: it contains entire complex terms (e.g., localized chronic moderate periodontitis) so the user does not need to build a term by clicking on various individual concepts. Formal terminologies such as ICD10 and SNODENT11 are less flexible for use at chair-side, mainly because they were developed for different reasons. ICD, a classification mostly used for billing, does not contain enough oral health terms. SNODENT is an extensive ontology that contains more than 7,700 oral health terms and relationships among them and is now integrated into SNOMED.12 As a reference terminology, SNOMED contains more than one million terms and relationships, in which the oral health terms are not separated out. As such, using an interface terminology like the DDS, which uses SNOMED only as a reference (like a dictionary), is much more practical for dentists and dental educators who use the EHR chair-side every day. With use of the DDS in axiUm, diagnosis is now included as a fundamental and documented step in the clinical workflow from patient examination to treatment in the dental school patient EHR.

We believe that the sequence of clinical steps beginning with examination through diagnosis to treatment uses the skills of critical thinking. To measure critical thinking skills in our study, we chose the Health Sciences Reasoning Test (HSRT).13 Our theoretical model of the proposed relationship is outlined in Figure 1. The shaded boxes represent the five skills of critical thinking assessed by the HSRT scale scores. These shaded boxes are placed between or at the rise of each step from Examination to Treatment, illustrating the critical thinking skill associated with that step. The specific axiUm EHR tabs (or resource) are listed to the right of each shaded box. The patient enters through the Initial Examination, and the direction of the sequence of steps from Examination to Treatment is indicated by arrows to the left of the shaded boxes. When Treatment is completed, the patient re-enters through the Periodic Examination. In clinical dental practice, this model is utilized repeatedly for a patient during the course of her or his lifetime.

Figure 1. Proposed model of critical thinking steps in clinical dentistry with process associated with critical thinking skills (in shaded boxes) and axiUm electronic health record (EHR) tabs

Note: Patient enters at top left arrow.

The sequenced templates in the axiUm EHR provide the framework for progression from examination to dental treatment that is reasoned from the evidence. This series of steps from examination to treatment is guided by using the ordered modules of the EHR. With repeated use of these sequenced steps, both within and among patients, we hypothesize that critical thinking skills can be practiced and strengthened.

Critical Thinking

Critical thinking has been defined as “the ability to think critically” and “involves three things: 1) an attitude of being disposed to consider in a thoughtful way the problems and subjects that come within the range of one’s experiences, 2) knowledge of the methods of logical inquiry and reasoning, and 3) some skill in applying those methods.”14 The American Philosophical Association’s (APA) multiyear Delphi study resulted in a description that included both critical thinking skills and the disposition of the critical thinker.15 The APA Delphi Consensus Statement regarding critical thinking and the ideal critical thinker noted: “We understand critical thinking to be purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based. [Critical thinking] is essential as a tool of inquiry…. The ideal critical thinker is habitually inquisitive, well informed, trustful of reason, open-minded, flexible, fair-minded in evaluation, honest in facing personal biases, prudent in making judgments, willing to reconsider, clear about issues, orderly in complex matters, diligent in seeking relevant information, reasonable in the selection of criteria, focused in inquiry, and persistent in seeking results that are as precise as the subject and the circumstances of inquiry permit.”

One description of how critical thinking relates to clinical judgment is that “critical thinking is the process we use to make a judgment about what to believe and what to do about the symptoms our patient is presenting for diagnosis and treatment.”16 Critical thinking is important to dentistry, as documented in the Commission on Dental Accreditation (CODA) Accreditation Standards for Dental Education Programs.17 In August 2010, the American Dental Association (ADA) adopted a resolution that dental school graduates must be competent in the use of critical thinking. In the accreditation standards, critical thinking is both one of the principles and part of Standard 2, Educational Program. The standards also state the intent that the “educational program should use teaching and learning methods that support the development of critical thinking and problem-solving skills.”17

In this study, we hypothesized that the EHR would support critical thinking by presenting or providing the framework and sequence of steps that can involve critical thinking. At each step of the sequence, options are available in the pull-down menu, and a student’s selection directs future options and choices. Making differential diagnoses, and ultimately assigning a diagnostic term, is a crucial new step that involves critical thinking. We consider that the dental student uses critical thinking skills as the way to progress from one step to the next.

Measuring Critical Thinking with the HSRT

The validated HSRT instrument is designed to measure critical thinking skills and is specifically calibrated for trainees in health sciences educational programs and for professional health science practitioners in all settings.13 The HSRT was developed from the findings of the APA Delphi study that resulted in descriptions of both critical thinking skills and the disposition of the critical thinker.15 The HSRT measures “high-stakes reasoning and decision-making processes.”13

Test items range in difficulty and complexity and provide a measure of overall critical thinking as well as measures of five separate critical thinking skills. The test consists of 33 items in a multiple-choice format, drawn from the item pool of the California Critical Thinking Skills Test (CCTST).18 This item pool has evolved over the past 40 years with extensive national and international testing. The HSRT is available in eight languages and has percentile norms for specific test populations, including dentistry. The graduate dentistry norms were derived from individuals tested at graduate dental programs (admitted but not yet graduated) in the United States. The specific dentistry norms are relatively new (first issued in 2012) and are re-examined every 12 to 24 months.19

HSRT results are reported as an overall score and five scale scores. The overall score describes overall strength in reasoning to form reflective judgments about what to believe or what to do and predicts the “capacity for success in educational or workplace settings which demand reasoned decision making and thoughtful problem solving.”13 The overall score is not the sum of the five scale scores. The five scale scores (Evaluation, Analysis, Deduction, Inference, and Induction) identify specific critical thinking skills. For definitions of the five skills in a popular dictionary compared with those in the HSRT test manual, see Table 1.

Table 1.

Definitions for the five scales of critical thinking on the Health Sciences Reasoning Test (HSRT)

Evaluation
Free Dictionary: 1. To ascertain or fix the value or worth of; 2. to examine and judge carefully; appraise.
HSRT Test Manual: Describes the skills that enable assessment of the credibility of sources of information and of the claims they make. By applying this skill one judges the “quality of analyses, interpretations, explanations, inferences, options, opinions, beliefs, ideas, proposals, and decisions.”

Analysis
Free Dictionary: a. The separation of an intellectual or material whole into its constituent parts for individual study; b. the study of such constituent parts and their interrelationships in making up a whole.
HSRT Test Manual: Analytical reasoning skills enable people to identify assumptions, reasons, and claims and to examine how they interact in the formation of arguments. People with strong analytical skills attend to patterns and to details.

Deduction
Free Dictionary: In logic: a. the process of reasoning in which a conclusion follows necessarily from the stated premises; inference by reasoning from the general to the specific; b. a conclusion reached by this process.
HSRT Test Manual: Decision making in precisely defined contexts; deductive reasoning moves with exacting precision from the assumed truth of a set of beliefs to a conclusion that cannot be false if those beliefs are true.

Inference
Free Dictionary: a. The act or process of deriving logical conclusions from premises known or assumed to be true; b. the act of reasoning from factual knowledge or evidence.
HSRT Test Manual: Inference skills enable us to draw conclusions from reasons and evidence. Inference skills indicate the necessary or the very probable consequences of a given set of facts and conditions.

Induction
Free Dictionary: In logic: a. the process of deriving general principles from particular facts or instances; b. a conclusion reached by this process.
HSRT Test Manual: Decision making in contexts of uncertainty relies on inductive reasoning. We use inductive reasoning skills when we draw inferences about what we think must probably be true based on analogies, case studies, prior experience, statistical analyses, simulations, hypotheticals, and familiar circumstances and patterns of behavior.

Sources: For Free Dictionary, see www.thefreedictionary.com; for HSRT Test Manual, see Health sciences reasoning test manual. San Jose, CA: California Academic Press, 2013:14.

Methods

The study was approved by the Institutional Review Board at each of the three institutions: Medical University of South Carolina, The University of Texas School of Dentistry at Houston, and Harvard School of Dental Medicine. The study design was a natural experiment with two groups for comparison (those with some exposure and those with no exposure to the DDS terms). The three dental schools involved represented a convenience sample of predoctoral dental students who had varying exposures to using the DDS terms. The dental students who constituted the study population were designated by year of graduation: Class of 2013 through Class of 2017. Based upon each school’s curriculum, these classes of students were assigned by the researchers as either exposed or not-exposed to the DDS terms. The newly matriculated Class of 2017 at each of the schools was included as a control group.

HSRT Administration and Data Management

Demographic questions were added to the HSRT to collect information on potential confounders of age, gender, English as a first language, race/ethnicity, and highest education level completed before dental school. For the Classes of 2013 through 2016, the HSRT was administered between March and June 2013; the newly matriculated classes (Class of 2017) took the same HSRT between June and September 2013.

Two schools administered the HSRT using Insight Assessment’s Online E-Testing software, while one school administered it using paper-and-pencil hard copy test booklets with the Insight Assessment CapScore response form.13 Of the two schools using online testing, one school allowed scratch paper for students’ use during the test, and one did not. All data were entered via Insight Assessment into an administrator-accessible database. The dataset was exported to Microsoft Excel and then analyzed using SAS software, Version 9.3 of the SAS System for Windows (SAS Institute Inc., Cary, NC, USA).

Data Analyses

Response rates were calculated overall and separately by class and by school using the total count of predoctoral dental students registered for each class as the respective denominator. Students who answered fewer than 60% of the 33 multiple-choice questions (n=4) and those who spent less than 15 minutes on test-taking (n=21) were excluded from further analyses because their records were considered invalid.13 Unless otherwise noted, the sample size used for analyses of the classes was N=361 (386 minus 25 who did not meet the inclusion criteria).
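
The inclusion rule above is easy to express as a data filtering step. The study itself exported records to Microsoft Excel and analyzed them in SAS 9.3; the fragment below is only a minimal Python/pandas sketch of the same rule, with hypothetical column names (items_answered, minutes_spent) standing in for the actual export fields.

```python
import pandas as pd

# Toy stand-in for the exported HSRT records; the column names are
# illustrative, not the actual Insight Assessment export fields.
records = pd.DataFrame({
    "student_id":     [101, 102, 103, 104],
    "items_answered": [33, 18, 30, 33],       # out of 33 multiple-choice items
    "minutes_spent":  [42.0, 35.0, 12.0, 50.0],
})

TOTAL_ITEMS = 33

# Keep only valid records: at least 60% of items answered and at least
# 15 minutes spent on test-taking.
valid = records[
    (records["items_answered"] >= 0.60 * TOTAL_ITEMS)
    & (records["minutes_spent"] >= 15)
]

print(f"{len(valid)} of {len(records)} records retained "
      f"({len(records) - len(valid)} excluded as invalid)")
```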

Characteristics of the students who were exposed and not-exposed were compared via t-tests and chi-square tests, as appropriate. To test the major hypothesis, the exposed and not-exposed dental classes were compared on the outcome of overall critical thinking score. Generalized linear models were used to evaluate the relationship between exposure and overall critical thinking score as well as the five scale scores (Evaluation, Analysis, Deduction, Inference, and Induction). Each model included other covariates (potential confounders) that were individually removed from the model in a backwards stepwise approach until only statistically significant covariates remained in the models at a p<0.10 level (due to the preliminary nature of this study). To adjust for the effect of the school, two options were considered: option 1 used school as a covariate and did not include the covariates for the option to take notes (notes variable) or for online versus hard copy administration (test mode variable); option 2 excluded the school covariate and used the Class of 2017 mean score (for each outcome, per school) as a covariate along with the notes and test mode variables. Because the ability to take notes and the test mode were consistent within a given school, these two variables uniquely identified school, so there was no need to also include school in option 2.
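
To make this modeling step concrete, the sketch below shows one way the backward elimination could be implemented in Python with statsmodels: the exposure term is kept in every model, and the least significant covariate is dropped until all remaining covariates meet p<0.10. The study itself used SAS 9.3, so the file name, the column names (overall_score, class2017_mean, and so on), and the use of an ordinary least squares fit (equivalent to a Gaussian generalized linear model for this continuous outcome) are assumptions made only for illustration; the candidate confounders passed to the function are those listed in the next paragraph.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

def backward_select(df, outcome, exposure, candidates, alpha=0.10):
    """Fit outcome ~ exposure + covariates, then repeatedly drop the covariate
    with the largest type-II ANOVA p-value until every remaining covariate
    has p < alpha.  The exposure term is always retained."""
    covars = list(candidates)
    while True:
        rhs = " + ".join([exposure] + covars)
        model = smf.ols(f"{outcome} ~ {rhs}", data=df).fit()
        if not covars:
            return model
        pvals = sm.stats.anova_lm(model, typ=2)["PR(>F)"]
        worst = max(covars, key=lambda c: pvals[c])
        if pvals[worst] < alpha:
            return model
        covars.remove(worst)

# Hypothetical analysis file and column names (illustrative only).
df = pd.read_csv("hsrt_records.csv")

# Candidate confounders common to both options (see the next paragraph).
shared = ["C(class_year)", "C(gender)", "C(ethnicity)", "C(masters)",
          "C(english_first)", "age", "C(ethnicity):C(english_first)"]

# Option 1: adjust for school directly; the notes and test-mode variables
# are omitted because each is constant within a school.
m1 = backward_select(df, "overall_score", "C(exposed)", shared + ["C(school)"])

# Option 2: drop school and instead use the school-specific Class of 2017
# mean score plus the notes and test-mode variables.
m2 = backward_select(df, "overall_score", "C(exposed)",
                     shared + ["class2017_mean", "C(notes)", "C(test_mode)"])
```

The same selection would then be repeated with each of the five scale scores substituted for the overall score as the outcome.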

Other covariates included class year, gender, race/ethnicity, having a master’s degree prior to dental school, English as a first language, and age. We also included an interaction term between ethnicity and English as a first language as this was identified as a significant predictor in a previous study of dental students using the HSRT.20 A sensitivity analysis was performed using only data from the Classes of 2013–15 to evaluate the fit of the final models. This sensitivity analysis modeled each of the outcomes using general linear models and the covariates previously determined to be significant predictors (2013–16 data) to evaluate the stability of the beta/standard error estimates. Similar beta/standard error estimates would imply the model was relatively stable, even after removing a potentially naïve group that could be skewing the model estimates.
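
Continuing the hypothetical Python fragment above, the sensitivity analysis amounts to refitting the final model on the Classes of 2013–15 and placing the beta and standard error estimates side by side; the class_year column name is again an assumption.

```python
# Refit the final 2013-16 model on the Classes of 2013-15 only (dropping the
# potentially naive Class of 2016) and compare beta / standard-error estimates.
subset = df[df["class_year"].isin([2013, 2014, 2015])]
m1_subset = smf.ols(m1.model.formula, data=subset).fit()

comparison = pd.DataFrame({
    "beta_2013_16": m1.params, "se_2013_16": m1.bse,
    "beta_2013_15": m1_subset.params, "se_2013_15": m1_subset.bse,
})
# Similar, same-direction estimates across the two cohorts suggest the model
# is relatively stable.
print(comparison.round(2))
```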

Results

The overall response rate of the dental students for Classes of 2013 through 2017 at the three dental schools was 39.3% (386/982). For Classes of 2013 through 2016, the overall response rate was 40% (309/772), and for the newly matriculated Class of 2017, the response rate was 36.6% (77/210). The overall response rates by dental school were 24.0% (44/183), 41.4% (189/457), and 44.7% (153/342). Table 2 shows the exposure status and HSRT response rates for the 361 students by school and by class who were used for further analyses. Response rates for the Classes of 2013–16 by exposure were 36.0% (160/444) for the not-exposed and 38.4% (126/328) for the exposed. Response rate for the Class of 2017 was 35.7% (75/210).

Table 2.

Exposure status and Health Sciences Reasoning Test response rates for students by dental school and by class (N=361)

Dental School 1: Class of 2017, not-exposed, 47/74 (63.5%); Class of 2016, not-exposed, 17/71 (23.9%); Class of 2015, not-exposed, 42/70 (60.0%); Class of 2014, not-exposed, 39/71 (54.9%); Class of 2013, not-exposed, 7/56 (12.5%)

Dental School 2: Class of 2017, not-exposed, 10/36 (27.8%); Class of 2016, not-exposed, 6/36 (16.7%); Class of 2015, not-exposed, 10/37 (27.0%); Class of 2014, exposed, 11/37 (29.7%); Class of 2013, exposed, 4/37 (10.8%)

Dental School 3: Class of 2017, not-exposed, 18/100 (18.0%); Class of 2016, not-exposed, 39/103 (37.9%); Class of 2015, exposed, 44/84 (52.4%); Class of 2014, exposed, 39/84 (46.4%); Class of 2013, exposed, 28/86 (32.6%)

Participants from the Classes of 2013–17 were 48% female, and the mean (± standard deviation) age was 27.1±2.5 years for the graduating Class of 2013 and 23.8±3.2 years for the incoming Class of 2017. Overall, the population was 63% White, 20% Asian, 10% Hispanic, and 7% Black/American Indian/other (with 14 missing values). English was the first language for 83% of these students (with seven missing values). The highest education level achieved before entering dental school was a bachelor’s degree for 92%, a master’s degree for 6%, and either an associate or a doctoral degree for 2% (with seven missing values). We divided highest education level into having a master’s degree versus not having a master’s degree (bachelor’s, associate, or doctorate). This was done to control for potential confounding from the master’s programs designed to prepare students for dental school. There was a general trend toward a higher mean overall score for each class from the Class of 2016 to the Class of 2013, with an overall mean score difference of 1.67 points between the Class of 2013 and the Class of 2016 (data not shown).

We also compared the exposed and not-exposed students for the Classes of 2013–16 (Table 3). For these classes, there was no statistically significant difference between the exposed and not-exposed groups by gender. As would be expected, there was a difference in age between the two groups because the not-exposed students were more likely to be in the beginning years of the four-year program. There were differences between the exposed and not-exposed groups by race/ethnicity, English as a first language, and whether the students had a master’s degree. The variables “Test Mode” and “Notes” in Table 3 provide evidence of a significant school effect that is not accounted for in the comparison of exposed to not-exposed students.

Table 3.

Characteristics of participating students by exposure status (Classes of 2013–16)

Variable  Exposed (n=126)  Not-Exposed (n=160)  p-value
Gender (number) >0.9
  Female 61 77
  Male 63 80
Age (mean±SD) <0.0001
  Years 26.7±3.4 25.1±2.7
Race/ethnicity (number) <0.0001
  Asian 37 21
  Hispanic 19 9
  Black, Am. Indian, other 5 14
  White 61 108
English 1st language (number) 0.04
  Yes 95 130
  No 31 23
Highest education (number) 0.3
  Master’s degree 9 7
  Not having master’s 117 149
Test mode (number) <0.0001
  Online 126 55
  Hard copy 0 105
Notes (number) <0.0001
  Yes 15 121
  No 111 39
HSRT score (mean±SD)
  Overall 20.0±4.1 22.6±4.8 <0.0001
  Evaluation 4.7±1.3 5.0±1.2 0.1
  Analysis 4.3±1.2 4.5±1.2 0.2
  Deduction 5.9±2.0 7.2±2.3 <0.0001
  Inference 2.6±1.0 3.7±1.4 <0.0001
  Induction 7.3±1.7 7.7±1.7 0.1

Generalized linear models were used to evaluate the relationship between exposure and overall score on the HSRT. Table 4 shows the results of using school as a covariate (and not including the notes or test mode variables). Table 5 shows the results when excluding school as a covariate and instead using the mean score for the Class of 2017 (per school) as a covariate in each model, along with the notes variable. In general, the dental students who were not-exposed scored lower than the students who were exposed. After adjusting for other significant covariates, exposure was significantly related to overall score (p=0.01), with those who were not-exposed having lower mean overall scores than those who were exposed.

Table 4.

Students’ overall HSRT score with school as covariate

Classes of 2013–16 Classes of 2013–15

Variable Beta SE p-value Beta SE p-value
Exposure Not exposed −1.64 0.66 0.01 −2.73 1.55 0.08
Ethnicity Asian −2.57 0.86 >0.9 −3.19 0.86 0.4
Hispanic −1.47 0.96 −1.21 1.03
Other −0.34 1.12 1.32 1.27
Master’s No master’s 2.16 1.07 0.04 3.07 1.22 0.01
Language Non-English −4.02 1.34 0.003 −2.29 1.37 0.1
School 2 3.75 0.83 <0.0001 3.89 1.03 <0.0001
1 5.46 0.74 6.87 1.60
Ethnicity*English 0.05 0.0004

Note: 2013–15 comparisons were based on sensitivity analyses.

Table 5.

Students’ overall score with mean of Class of 2017 and notes as covariates

Classes of 2013–16 Classes of 2013–15

Variable Beta SE p-value Beta SE p-value
Exposure Not exposed −1.64 0.66 0.01 −2.73 1.55 0.08
Ethnicity Asian −2.57 0.86 >0.9 −3.19 0.86 0.4
Hispanic −1.47 0.96 −1.21 1.03
Other −0.34 1.12 1.32 1.27
Master’s No master’s 2.16 1.07 0.04 3.07 1.22 0.01
Language Non-English −4.02 1.34 0.003 −2.29 1.37 0.001
Class of 2017 mean −6.51 3.50 0.06 −11.39 4.73 0.02
Notes Not available −21.90 9.20 0.02 −35.67 13.22 0.008
Ethnicity*English 0.05 0.0004

Note: 2013–15 comparisons were based on sensitivity analyses.

Table 6 shows the results for significance for models of the overall and five scale scores for option 2 (similar to Table 5). Though the beta estimates are not presented, after adjusting for other covariates, exposure was significantly related to the Induction scale score (p<0.01), with those who were not-exposed having lower mean Induction scale scores than those who were exposed. The relationships between exposure and the following scales were not significant: Evaluation scale score (p>0.9), Analysis scale score (p=0.3), and Deduction scale score (p=0.07). Though the relationships between exposure and the Evaluation, Analysis, and Deduction scale scores were not significant, those exposed had higher mean scale scores than the not-exposed (data not shown), and the differences were in the same direction as those for the overall and Induction scores.

Table 6.

Overall significance (p-values) from all models using mean of Class of 2017 and notes as covariates

Classes of 2013–16

Variable Level Reference Overall Evaluation Analysis Deduction Inference Induction
Exposure Not-exposed Exposed 0.01 >0.9 0.3 0.07 0.08 <0.01
Ethnicity White >0.9 n/a n/a 0.8 0.06 <0.01
Master’s No master’s Master’s 0.04 n/a n/a n/a n/a n/a
Language Non-English English <0.01 n/a 0.02 0.01 n/a n/a
Class 2016 n/a 0.06 n/a n/a n/a n/a
Class of 2017 mean 0.06 >0.9 >0.9 0.02 <0.01 0.8
Notes No Yes 0.02 0.09 0.3 <0.01 <0.01 0.2
Age n/a 0.05 n/a n/a n/a n/a
Ethnicity*English 0.05 n/a n/a 0.07 n/a n/a

n/a=variable not included in final model

For the Inference scale scores (using either option for adjusting for school), after adjusting for other covariates, exposure was not significantly related (p=0.08). The Inference models were not robust when performing the sensitivity analysis, as the estimates for the exposure variable changed direction, so those results should be viewed with caution. For all sensitivity models in which exposure was significantly related to outcome, the beta estimates for the variables remained relatively stable (and in the same direction), indicating fairly robust estimates for these models. When exposure was not significantly related to the outcome, the beta estimates varied, and in the case of the Inference scale score, the estimate also changed direction. Options 1 and 2 for adjusting for school effect showed similar results.

Discussion

One purpose of this pilot study was to test our hypothesis that the use of the EHR with DDS terms would positively impact dental students’ critical thinking. Our finding was that the students who were exposed to using the DDS terminology in the treatment planning module of the axiUm EHR had a significantly higher mean HSRT overall score for critical thinking than the students who had not been exposed. To address whether these differences in scores have clinical significance, we relate our findings to a published report comparing novices and experts with the HSRT. That study was designed to demonstrate the construct validity of the HSRT, and those researchers’ major finding was a difference in the mean overall scores of 1.57 between novices and experts.21 The score difference we found between the not-exposed and exposed groups exceeded that finding; however, further study is needed before assigning clinical significance.

Our preliminary findings, though of great interest, must be tempered with the limitations of the pilot study. Our anticipation of such limitations led us to the second, and perhaps more important, purpose of the study: to provide evidence to refine protocols for future study of the relationship among critical thinking skills, clinical dentistry from examination to treatment plan selection, and the EHR in dental education. A limitation to the level of evidence provided by this pilot study was the natural experiment design and our inability to compensate for the lack of random assignment of exposure.22 Our results were also affected by response bias, as the student participation rates were lower than expected at all three dental schools. At the outset, we felt confident that we would achieve relatively high response rates with our varied invitation methodology and food and beverage incentives. However, we finished in the middle when compared to response rates in similar studies, which ranged from a low of 22% for U.S. dental students20 to a high of 56% for U.K. dental students.23 A finding unique to this study, conveyed to us by Insight Assessment after the HSRT had been administered, was that students’ use of scratch paper or ability to make notes during test-taking could significantly affect their scores. At that time we had no choice but to control for this in the analyses. Future studies can preclude this situation by providing equal opportunity for students to make notes during test-taking.

There are two additional sources of confounding we likewise had to control for in the analyses. The first was students’ exposure to critical thinking skills both before and during dental school. As would be expected, we found that the overall and scale scores did reflect the year of schooling: there was a general trend for a higher mean overall score from the Class of 2016 to the Class of 2013. The second source was the students’ exposure, both before and during dental school, to diagnosis methodology and especially to assigning a diagnosis. We controlled for these potentially confounding effects in the multivariable analyses by using the dental school and then the Class of 2017 (newly matriculated) as covariates. In both of those analyses, the dental students exposed to use of the DDS terminology had significantly higher overall scores (Tables 4 and 5). We also used different cohort classes (2013–16 and 2013–15) to show the stability of the models when excluding the potentially naïve Class of 2016. We did not, however, control for variability in the amount of exposure to use of the DDS terms. All students in a given class at a particular school were assigned as exposed or not-exposed; exposure was assumed to be equal for all students in a class based upon whether or not the DDS terminology was turned on for use in the EHR. In ad hoc discussions with some of the exposed students, variations in individual students’ exposure time to and use of the DDS terminology within an exposed class became apparent.

Strengths of this pilot study included the cooperation and administrative support from the three dental schools. Another strength was the demonstrated ease of administration of the HSRT either in hard copy or online. There is the opportunity for dental schools to administer the HSRT at the class or at the individual level on a periodic basis. An additional strength of this pilot study is the timeliness of the topic, with critical thinking now an evaluated item for dental school accreditation. More and more dental students will soon be using the treatment planning module with DDS terminology in the axiUm EHR. These sequenced modules provide a structure for the development of students’ diagnostic abilities. As shown in our proposed model of critical thinking in clinical dentistry with process-associated critical thinking skills, the HSRT may be a useful metric to monitor the growth and development of specific critical thinking skills for dentistry.

Conclusion

Our collaborative pilot study demonstrated a positive impact of using the DDS terminology in the axiUm electronic health record on the critical thinking skills of predoctoral dental students in three COHRI schools as measured by overall scores on the HSRT. These preliminary findings support future research to further evaluate our proposed model of critical thinking in clinical dentistry. Lessons learned from this pilot study will also strengthen future research. Our suggestions include the following: 1) to utilize an experimental design with longitudinal data collection to evaluate the impact on critical thinking skills by using the DDS terminology in the axiUm EHR; 2) to change the HSRT administration to mandatory rather than voluntary as a way to improve response rates for greater representativeness of the results; 3) to provide scratch paper to all students during HSRT administration, regardless of mode of test (hard copy or online); 4) to refine (both qualitative and quantitative) evaluation of an individual student’s exposure to critical thinking skills prior to and during dental school; and 5) to refine (both qualitative and quantitative) evaluation of students’ exposure to diagnostic methodology and DDS terminology prior to and during dental school.

Acknowledgments

Funding in part for this study was provided by a grant from the ADEAGies Foundation. This publication was supported by the South Carolina Clinical & Translational Research (SCTR) Institute, with an academic home at the Medical University of South Carolina, through NIH Grant Number UL1 TR000062.

Contributor Information

Susan G. Reed, Department of Pediatrics-Neonatology, College of Medicine, and Department of Stomatology, James B. Edwards College of Dental Medicine, Medical University of South Carolina.

Shawn S. Adibi, Department of General Practice and Dental Public Health, The University of Texas School of Dentistry at Houston.

Mullen Coover, Department of Oral Rehabilitation, James B. Edwards College of Dental Medicine, Medical University of South Carolina.

Robert G. Gellin, Department of Stomatology, James B. Edwards College of Dental Medicine, Medical University of South Carolina.

Amy E. Wahlquist, Department of Public Health Sciences, Medical University of South Carolina.

Anitha AbdulRahiman, Department of Oral Health Policy and Epidemiology and student, Harvard School of Dental Medicine at the time of this study.

Lindsey H. Hamil, Department of Stomatology, James B. Edwards College of Dental Medicine, Medical University of South Carolina.

Muhammad F. Walji, The University of Texas School of Dentistry at Houston.

Paula O’Neill, The University of Texas School of Dentistry at Houston.

Elsbeth Kalenderian, Department of Oral Health Policy and Epidemiology, Harvard School of Dental Medicine.

REFERENCES

1. Leake JL. Diagnostic codes in dentistry: definition, utility, and developments to date. J Can Dent Assoc. 2002;68(7):403–406.
2. O’Malley KJ, Cook KF, Price MD, et al. Measuring diagnoses: ICD code accuracy. Health Serv Res. 2005;40(5 Pt 2):1620–1639. doi: 10.1111/j.1475-6773.2005.00444.x.
3. Kalenderian E, Ramoni RB, White JM, et al. The importance of using diagnostic codes. Oral Surg Oral Med Oral Pathol Oral Radiol Endod. 2011;112(1):4–5; author reply 5. doi: 10.1016/j.tripleo.2011.01.047.
4. Leonard M, Bonacum D, Graham S. SBAR technique for communication: a situational briefing model. Kaiser Permanente of Colorado, Institute for Healthcare Improvement. At: www.ihi.org/resources/Pages/Tools/SBARTechniqueforCommunicationASituationalBriefingModel.aspx. Accessed 13 Oct. 2014.
5. Agency for Healthcare Research and Quality. At: www.ahrq.gov. Accessed 22 Apr. 2013.
6. Stark PC, Kalenderian E, White JM, et al. Consortium for oral health-related informatics: improving dental research, education, and treatment. J Dent Educ. 2010;74(10):1051–1065.
7. White JM, Kalenderian E, Stark PC, et al. Evaluating a dental diagnostic terminology in an electronic health record. J Dent Educ. 2011;75(5):605–615.
8. Kalenderian E, Ramoni RL, White JM, et al. The development of a dental diagnostic terminology. J Dent Educ. 2011;75(1):68–76.
9. Rosenbloom ST, Miller RA, Johnson KB, et al. Interface terminologies: facilitating direct entry of clinical data into electronic health record systems. J Am Med Inform Assoc. 2006;13(3):277–288. doi: 10.1197/jamia.M1957.
10. World Health Organization. International classification of diseases (ICD). At: www.who.int/classifications/icd/en/. Accessed 27 March 2013.
11. American Dental Association. SNODENT value and benefit: SNODENT systematized nomenclature of dentistry. At: www.ada.org/en/member-center/member-benefits/practice-resources/dental-informatics/snodent/snodent-value-and-benefits. Accessed 7 Oct. 2014.
12. International Health Terminology Standards Development Organization. SNOMED CT. At: www.ihtsdo.org/snomed-ct/. Accessed 30 Apr. 2013.
13. Health sciences reasoning test manual. San Jose, CA: California Academic Press; 2013.
14. Glaser EM. An experiment in the development of critical thinking. New York: Teachers College, Columbia University; 1941.
15. American Philosophical Association. Critical thinking: a statement of expert consensus for purposes of educational assessment and instruction: the Delphi report. Washington, DC: American Philosophical Association; 1990.
16. Facione NC, Facione PA. Critical thinking and clinical judgment. In: Facione NC, Facione PA, editors. Critical thinking and clinical reasoning in the health sciences: an international multidisciplinary teaching anthology. Millbrae, CA: California Academic Press LLC; 2008. pp. 1–13.
17. Commission on Dental Accreditation. Accreditation standards for dental education programs. Chicago: American Dental Association; 2010.
18. California critical thinking skills test. 2013. At: www.insightassessment.com/Products/Products-Summary/Critical-Thinking-Skills-Tests/California-Critical-Thinking-Skills-Test-CCTST. Accessed 22 July 2014.
19. HSRT & HSRT-N user guide and technical manual. San Jose, CA: Insight Assessment/California Academic Press; 2014.
20. Pardamean B. Measuring change in critical thinking skills of dental students educated in a PBL curriculum. J Dent Educ. 2012;76(4):443–453.
21. Huhn K, Black L, Jensen GM, et al. Construct validity of the health science reasoning test. J Allied Health. 2011;40(4):181–186.
22. Dunning T. Improving causal inference: strengths and limitations of natural experiments. Polit Res Quart. 2008;61(2):282–293.
23. Ali K, McHarg J, Kay E, et al. Academic environment in a newly established dental school with an enquiry-based curriculum: perceptions of students from the inaugural cohorts. Eur J Dent Educ. 2012;16(2):102–109. doi: 10.1111/j.1600-0579.2011.00728.x.
