Abstract
Background:
Students in the Faculty of Dentistry at the University of British Columbia have articulated challenges in understanding learning objectives in their oral epidemiology and statistics course. This study aimed to measure the impact of a course renewal intended to enhance student learning. Examples of educational interventions included providing more time for activities, increasing student interactivity, and integrating more hands-on applicable exercises using statistical software.
Methods:
An online mixed-methods survey using a 5-point Likert scale and open-ended questions was distributed to 43 dental hygiene students before the course renewal and again to a second cohort of 43 students after course revisions. The survey asked students to rank their levels of challenge and self-confidence in learning 23 of the course objectives throughout each academic year. Four semi-structured interviews were also conducted with faculty and staff members involved in teaching or coordinating this course to understand their experiences after the course revisions.
Results:
Response rates ranged from 32% to 57%. After the course renewal, the extent to which students in the entry-to-practice cohort felt extremely challenged to learn each objective was significantly reduced (25% vs. 3%, p < 0.001), and students’ self-confidence scores significantly increased (12% vs. 30%, p < 0.001).
The changes in the challenge and confidence scores in the degree-completion cohort were not statistically significant (23% vs. 24% and 31% vs. 36%, respectively). Student satisfaction levels increased in all 6 categories measured.
Conclusion:
Providing students with more time to absorb their learning, increasing interactivity, offering timely feedback, and integrating applicable exercises using statistical software resulted in an enhanced learning environment.
Keywords: biostatistics, computer-assisted instruction, dental hygiene, education, educational assessment, epidemiology
PRACTICAL IMPLICATIONS OF THIS RESEARCH
- Baccalaureate dental hygiene students are required to demonstrate an ability to understand and apply research methods and statistics in professional practice.
- Statistics and epidemiology topics are difficult for undergraduate students to master, particularly in an online learning environment.
- Incorporating active-learning strategies, opportunities for peer interaction and assessment, and timely, regular instructor feedback increases student self-confidence and decreases the challenges of the online learning process.
INTRODUCTION
Background
Oral Epidemiology, covering epidemiology and statistics topics, is a required 6-credit course in the Dental Hygiene Degree Program (DHDP) in the Faculty of Dentistry at the University of British Columbia (UBC). The course is offered annually to approximately 50 dental hygiene students within 2 distinct academic pathways: degree-completion (DC) and entry-to-practice (ETP).
UBC’s DHDP has offered DC education since 1992. It is intended for practising dental hygienists who have previously earned a dental hygiene diploma but who desire to return to university to earn a bachelor’s degree. DC students can enroll in either full- or part-time study and have up to 5 years to complete their degree exclusively through a distance education platform. Since 2007, the DHDP has offered Canada’s first 4-year ETP degree in which students enroll with no previous dental hygiene education. The ETP curriculum is offered through a blended format, with approximately one-quarter of the program’s course credits delivered online. Students in the ETP pathway enter the program either directly from secondary school or after some postsecondary education in other areas of study.
In 2015, the Canadian Dental Hygienists Association (CDHA) published the Canadian Competencies for Baccalaureate Dental Hygiene Programs report, which articulates a standard of educational outcomes for graduates of a 4-year degree program.1 This report communicates the knowledge and abilities required to meet the increasingly complex needs of the public in the 21st century in a variety of practice settings. In 2016, UBC’s DHDP was the first dental hygiene program in Canada to integrate these competencies into its curriculum.
In 2016, the Canadian National Dental Hygiene Certification Board revised its competency framework for the national dental hygiene certification examination.2 The new framework requires dental hygiene graduates to demonstrate the ability to 1) use knowledge of the principles of research methods and statistics in dental hygiene practice and 2) assess epidemiological data.
Dental Hygiene (DHYG) 401 Oral Epidemiology has been delivered online at UBC to third-year students since 2008. The course consists of 24 blocks of epidemiological and statistical content offered over 2 semesters (24 weeks total) and prepares students for their subsequent literature review courses, in which they are required to critique and synthesize literature into a comprehensive review paper. The course also helps prepare students to translate epidemiological and statistical knowledge into practice for their clients, enhancing their critical thinking skills. The competencies assessed in DHYG 401 include professionalism, communication, collaboration, coordination, leadership, and research use. The research use competency is defined as the ability to “use scientific knowledge to support evidence- and theory-based autonomous judgments and services.”1 Several of the subcompetencies in this domain require baccalaureate graduates to demonstrate their ability to 1) analyse the strengths and limitations of different research approaches; 2) examine the appropriateness of statistical tests based on the theories underpinning those tests; 3) critique the study methodology and conclusions for relevance; 4) weigh various biases and assumptions; and 5) differentiate between more and less credible types of information.1
Study rationale
Studies have demonstrated that statistics can be challenging for undergraduate students to learn,3,4 and students in UBC’s DHDP were no exception. Over the past several years, through course evaluations offered by the university, dental hygiene students have expressed challenges in their achievement of the research use subcompetencies in DHYG 401. In these course evaluations, students have expressed: “The material in certain weeks was much too overwhelming” and “The foundations should be revisited before the stats testing strategies and methods are learned.” They also articulated concerns about the heavy workload and complexity of the course material.
To respond to students’ concerns and to better align with the DHDP’s newly integrated competency framework as well as the revised National Dental Hygiene Certification Board competency framework, the course instructor (BS) implemented a course renewal. The changes are categorized in 4 large groups: content and ordering, course materials, course activities, and course assessment. Table 1 summarizes the revisions made.
The course instructor (BS) employed a non-repeated measures design before and after the completion of this course renewal to measure the impact of these revisions. The objectives of this study were to assess the impact of the course revisions on students’ learning outcomes and on their self-confidence in statistical and epidemiological topics included in the course. The strategy proposed for teaching statistics and oral epidemiology, as well as the assessment method applied in this study, could be used as an exemplar for teaching a challenging subject in an online environment.
Table 1.
Summary of changes to the Oral Epidemiology course

| Category | Item | Before revision (2015–2016) | After revision (2016–2017) |
|---|---|---|---|
| Content and ordering | Block ordering | A mixture of statistical and epidemiology topics, starting with statistical ones | Resequenced: the first 4 blocks in term 1 are allocated to epidemiology topics that are easier to absorb, followed by statistical topics in which the knowledge students gain in the first blocks is applied |
| | Block objectives | 1. Block objectives were not aligned with the course objectives. 2. Block objectives were general and not measurable; for example, regarding the t-test, the objective was “learn how to do a t-test.” | 1. Block objectives are aligned with the course objectives. 2. Block objectives are measurable and more aligned with the block activities and assessments.5 For example, regarding the t-test, the objectives are broken down into “explain the relative advantages of different types of t-tests” or “explain assumptions of a t-test.” |
| | Reflective thoughts | None | Added to the introduction section of each block. For example, in block 6, titled “Transition from Descriptive to Inferential Statistics,” the following reflective thoughts have been added: As you work through Block 6 think about: What are the properties of a normal curve? and What is the relationship between variance and standard deviation? |
| | Block reviews and self-assessments | Were general, and no answers were provided | Now aligned with the block objectives, with detailed answers provided |
| | Overlap of the course objectives with other courses | Overlapped with DHYG 461 Literature Review I (concurrent Year 3 course) | Redundant learning objectives related to writing a review paper have been removed |
| Course material | The main statistics textbook (eBook) | Was published in 2008 | Updated to the 2014 edition |
| | Educational media | None | Includes “hands-on” activities through the integration of the Statistical Package for the Social Sciences (SPSS) |
| | Required readings | 4 to 8 readings per block5 | 2 to 3 readings per block5 |
| | Educational videos | Approximately 10 | Increased to 36 |
| Course activities | Block activities | 1. 17 activities (1 per block5). 2. Teamwork activities in which team members would split the questions and combine answers to develop the team summary report | 1. 10 activities (1 per 1 to 3 topic-related blocks5). 2. Each student completes the activities individually and then posts them to the teamwork area for discussion and development of the team summary report |
| | Student–student interactions | Limited interaction due to the workload, structure, and types of activities | Sufficient time for students’ block activities created through: 1. defining one set of activities per 1 to 3 topic-related blocks;5 2. allocating the whole block time (i.e., 1 week) to block activities |
| | Team organization | 6 to 7 students per team | 4 to 5 students per team |
| Course assessment | Peer-assessment activities | Only the instructor would assess the team’s assignment report | Peer-assessment activities introduced to increase student engagement. For example, in the second assignment the teams develop a case analysis report in the first week; in the second week, each team anonymously assesses and grades another team’s report; and in the third week, teams grade their own reports against a key report provided by the instructor as well as the peer team’s assessment. (Peer assessment and self-assessment are common ways of involving students in the assessment process; research shows that this involvement encourages students to be more active in, and responsible for, their own learning.6) The instructor gives the final assignment grade. |
| | Assignments | 4 assignments (2 per term) | 2 assignments (1 per term), with a restructured assessment format |
| | Instructor’s feedback on assignments and block activities | No specific timeline | Block activities: feedback provided within 1 to 2 days after class discussions end. Assignments: feedback provided within 1 to 2 weeks after submission |
METHODS
This project received an exemption from UBC’s Behavioural Research Ethics Board review as it was deemed to be quality assurance as part of a course and program evaluation. UBC dental hygiene ETP and DC students in the academic years 2015–2016 and 2016–2017, taking a full-year online course (DHYG 401), participated in this study. The evaluation used pre- and post-revision observations made on 2 successive cohorts of students to assess the effectiveness of course revisions. Online self-assessment surveys were distributed by program assistants to all students enrolled in the course through UBC FluidSurveys upon completion of the first semester (January) and again upon completion of the course (April). The first survey consisted of 13 course objectives in term 1; the second survey consisted of 10 items covering term 2 content. Surveys were completed individually and anonymously.
The surveys consisted of 2 main questions asking students to rank their level of challenge and confidence on 23 selected course objectives. These were the main objectives of the course both before and after the revision, which made the comparison of the 2 cohorts possible. Students were asked to answer the questions: “How challenging was each skill/knowledge to acquire?” and “How confident are you in your current skill/knowledge in this area?” A 5-point Likert scale was used, including not, slightly, moderately, very, and extremely challenging or confident. Owing to the small sample size, Likert scale categories were grouped together to form 3 categories for data analysis and presentation: not/slightly, moderately, and very/extremely challenging or confident. The questionnaires appear in Appendices A and B.
Descriptive statistics were used to identify students’ challenge level as well as their confidence in the 2 pre- and post-course revision cohorts. Fisher’s Exact Test was used to compare categorical data. All data were analysed using SPSS for Windows, Version 24.0 (IBM Corp., Armonk, NY, USA).
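To make this analysis concrete, the sketch below shows how collapsing 5-point Likert responses into 3 categories and running a Fisher’s exact comparison might look in Python. This is an illustration only: the study ran its analyses in SPSS and likely compared the full 3-category distributions, whereas SciPy’s fisher_exact accepts only 2 × 2 tables, so this sketch dichotomizes responses instead. The counts are borrowed from one objective in Table 3.

```python
# Illustrative sketch (not the authors' code): collapsing 5-point Likert
# responses into the 3 analysis categories used in the paper, then comparing
# two cohorts with Fisher's exact test.
from scipy.stats import fisher_exact

# Map the 5 Likert labels onto the 3 grouped categories.
COLLAPSE = {
    "not": "not/slightly", "slightly": "not/slightly",
    "moderately": "moderately",
    "very": "very/extremely", "extremely": "very/extremely",
}

def collapse(responses):
    """Count collapsed categories for a list of raw Likert labels."""
    counts = {"not/slightly": 0, "moderately": 0, "very/extremely": 0}
    for r in responses:
        counts[COLLAPSE[r]] += 1
    return counts

print(collapse(["not", "slightly", "moderately", "very", "extremely"]))

# 2 x 2 table for one objective ("Describe the structure associated with
# hypothesis testing", Table 3): rows = cohort (pre/post revision),
# columns = not/slightly challenging vs. moderately-to-extremely challenging.
table = [[3, 9],   # pre-revision cohort: 3 of 12 found it not/slightly challenging
         [9, 1]]   # post-revision cohort: 9 of 10 found it not/slightly challenging
odds_ratio, p_value = fisher_exact(table)
print(f"Fisher's exact test: p = {p_value:.3f}")
```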
Qualitative data stemmed from open-ended questions asked at the end of the survey about students’ satisfaction with the course workload, pace, depth of knowledge, interactivity with peers, focus on student-centred learning, and course resources/materials as well as what they liked most and least about the course. In addition, semi-structured individual interviews were conducted with 4 faculty and staff members in order to understand experiences of the impact of the course revisions in greater depth. Interviews were audiorecorded with consent, transcribed verbatim, and coded for thematic analysis. Interview participants were purposefully selected based on their knowledge of the course and interactions with students in both cohorts.
RESULTS
Study population
Course enrollment consisted of 43 students each year. Response rates varied by year, cohort, and survey time, ranging from 32% to 57%. Table 2 summarizes the number of completed surveys and response rates by year.
Survey results
Students’ level of challenge and confidence on all course objectives
A summary of students’ confidence levels and perceived levels of challenge for all 23 selected course objectives is presented in Figure 1.
ETP students who enrolled in the revised course in 2016–2017 provided significantly lower Likert scores on all 23 course objectives regarding how challenging the content was to learn compared with the 2015–2016 pre-course-renewal cohort (p < 0.001): they selected the not/slightly challenging option for 81% of objectives, versus 45% before the renewal.
In the DC group, the changes in the level of challenge required to learn course objectives after course revisions were also considerable, with a 14% increase in the not/slightly challenging category (from 37% to 51%; p < 0.05) and a 15% decrease in the moderately challenging category (from 40% to 25%; p < 0.05). There was almost no change in the very/extremely challenging category (23% and 24%, respectively).
As presented in Figure 1, the proportion of ETP students who felt very/extremely confident in learning all objectives increased more than two-fold after course revision (from 12% to 30%; p < 0.001). Concurrently, the DC students’ confidence rate increased slightly but not significantly after course revisions (from 31% to 36%).
Figure 1.
The proportion of students’ ratings for levels of challenge and confidence for all 23 selected course objectives
Table 2.
Entry-to-practice and degree-completion students’ response rate by year

| Cohort | Before revision (2015–2016) (N = 43): January survey | Before revision: April survey | After revision (2016–2017) (N = 43): January survey | After revision: April survey |
|---|---|---|---|---|
| Entry to practice | 12 of 22 students (55%) | 7 of 22 students (32%) | 10 of 25 students (40%) | 13 of 25 students (52%) |
| Degree completion | 12 of 21 students (57%) | 9 of 21 students (43%) | 7 of 18 students (39%) | 6 of 18 students (33%) |
| Total | 24 (56%) | 16 (37%) | 17 (40%) | 18 (44%) |
Students’ level of challenge by individual course objectives
Comparisons of the distributions of students’ responses before and after the course renewal identified a decrease in the level of challenge associated with learning several objectives among ETP students after the course revisions. Of the 23 objectives studied, 11 (48%) demonstrated significant (<0.05) or borderline (<0.1) p values after revision (Table 3). The percentage of students who expressed slight/no challenge in learning the “explain skewness and kurtosis” objective increased from 50% to 100%. For the “explain the application of common non-parametric tests” objective there was a 67% increase (from 25% to 92%) in the slight/no challenge learning experience.
For DC students, a significant reduction in the challenge level occurred in one objective (4.3%) after revisions (<0.05), while in 3 objectives the p value was borderline (<0.1) (Table 3).
Students’ level of confidence by individual course objectives
Comparisons of the distributions of students’ responses before and after the course renewal identified an increase in confidence among ETP students for a smaller number of course objectives. Of the 23 objectives studied, 8 (35%) showed significant (<0.05) or borderline (<0.1) p values after revision (Table 4). There were significant increases in the number of students who expressed high/very high confidence scores in 3 of the 8 objectives presented, including “Distinguish between descriptive and inferential statistics,” which showed a 52% increase (from 8% to 60%); “Classify data based on typologies of data,” which showed a 45% increase (from 0% to 45%); and “Describe the structure associated with hypothesis testing,” which showed a 40% increase (from 0% to 40%).
Before the course revisions, DC students identified 3 objectives in which none of them felt highly confident: “explain circumstances that permit correlation analysis,” “explain circumstances that permit regression analysis,” and “explain how diagnostic test data are assessed.” After the revisions, students’ confidence on the last objective remained low, but for the first 2 the students indicated they were more confident. Nevertheless, there were 2 other objectives in which students continued to lack confidence after revisions: “describe the importance of conducting a power calculation” and “explain basic concepts in epidemiology related to causation.” None of the changes was significant in DC students; therefore, no objectives for this cohort are presented in Table 4.
Figure 2.
Proportion of students who reported being satisfied/very satisfied with different aspects of the course before (year 1) and after (year 2) course revisions
Impact on students’ satisfaction
Students were asked to rate their levels of satisfaction with the course workload, pace, depth of knowledge, student-to-student interactivity, focus on student-centred learning, and course resources/materials. As Figure 2 illustrates, a significantly greater proportion of students were satisfied with all aspects of the course after the course renewal. Most notably, 58% of students reported satisfaction with the course workload after the course renewal compared with only 6% before course revisions (p < 0.001). Student satisfaction scores increased in all other categories (53% to 74%) after the course renewal compared with before course revisions (38% to 50%).
Table 3.
Frequency of students’ responses for selected objectives by level of challenge, cohort, and year
Columns 2–4: before revision (2015–2016); columns 5–7: after revision (2016–2017). Values are % (n/total).

| Course objectivesa | Not/slightly challenging | Moderately challenging | Very/extremely challenging | Not/slightly challenging | Moderately challenging | Very/extremely challenging | P valueb |
|---|---|---|---|---|---|---|---|
| Entry to practice | | | | | | | |
| Explain skewness and kurtosis | 50% (6/12) | 33% (4/12) | 17% (2/12) | 100% (10/10) | 0% (0/10) | 0% (0/10) | 0.028 |
| Describe the structure associated with hypothesis testing | 25% (3/12) | 25% (3/12) | 50% (6/12) | 90% (9/10) | 10% (1/10) | 0% (0/10) | 0.004 |
| Explain basic epidemiological study types (both experimental and non-experimental) | 33% (4/12) | 50% (6/12) | 17% (2/12) | 90% (9/10) | 10% (1/10) | 0% (0/10) | 0.028 |
| Explain strengths and limitations of study designs | 42% (3/7) | 29% (2/7) | 29% (2/7) | 57% (4/7) | 10% (1/10) | 0% (0/10) | 0.085 |
| Describe the importance of conducting a power calculation | 25% (3/12) | 17% (2/12) | 58% (7/12) | 50% (5/10) | 40% (4/10) | 10% (1/10) | 0.061 |
| Explain circumstances that permit correlation analysis | 14% (1/7) | 29% (2/7) | 57% (4/7) | 67% (8/12) | 25% (3/12) | 8% (1/12) | 0.058 |
| Explain circumstances that permit regression analysis | 14% (1/7) | 29% (2/7) | 57% (4/7) | 75% (9/12) | 0% (0/12) | 25% (3/12) | 0.016 |
| Identify the non-parametric tests which compare to the parametric tests | 29% (2/7) | 71% (5/7) | 0% (0/7) | 92% (12/13) | 8% (1/13) | 0% (0/13) | 0.007 |
| Explain the application of common non-parametric tests | 25% (3/12) | 42% (5/12) | 33% (4/12) | 92% (11/12) | 8% (1/12) | 0% (0/12) | 0.003 |
| Recognize possible misapplications of statistical tests | 14% (1/7) | 29% (2/7) | 57% (4/7) | 70% (9/13) | 30% (4/13) | 0% (0/13) | 0.007 |
| Explain how diagnostic test data are assessed | 0% (0/7) | 71% (5/7) | 29% (2/7) | 77% (10/13) | 23% (3/13) | 0% (0/13) | 0.002 |
| Degree completion | | | | | | | |
| Explain measures of central tendency and dispersion | 42% (5/12) | 50% (6/12) | 8% (1/12) | 86% (6/7) | 0% (0/7) | 14% (1/7) | 0.047 |
| Explain skewness and kurtosis | 42% (5/12) | 58% (7/12) | 0% (0/12) | 72% (5/7) | 14% (1/7) | 14% (1/7) | 0.097 |
| Explain tests which would be appropriate for comparisons of more than 2 groups | 8% (1/12) | 50% (6/12) | 42% (5/12) | 66% (4/6) | 17% (1/6) | 17% (1/6) | 0.057 |
| Explain the application of common non-parametric tests | 17% (2/12) | 33% (4/12) | 50% (6/12) | 80% (4/5) | 0% (0/5) | 20% (1/5) | 0.056 |

aThe level of challenge was investigated on 23 objectives. However, due to space limitations, only objectives having significant (<0.05) or borderline (<0.1) p values are presented. A sample of all 23 objectives in the questionnaire is provided in Appendix A.
bFisher’s Exact Test
Qualitative feedback
After the course revisions, both ETP and DC students felt the assignments were well spaced and enjoyed collaborating on teamwork. One student said, “The way the group work was done was the best I’ve ever experienced in all of my previous education. I love how we had to do all the assignments and learn it ourselves, then come together as a group to put it together.” The general themes that emerged from the qualitative comments on the surveys indicated that students valued having more time to absorb their learning and found meaning in working with SPSS software as part of the revised course activities. Both groups of students expressed increased satisfaction with course expectations and course load manageability post-revisions, while several students commented that they liked the pace of the course, which was a change from the previous year’s comments. After revisions, some students still expressed that they were overwhelmed by the course material, but there was far less confusion about specific aspects of the course. Some ETP students’ comments after the course revisions pertained to use of the SPSS software, which will be revised based on their feedback. Comments from students in response to the questions “What did you like most?” and “What did you like least?” about the course are presented in Table 5.
Prior to the course renewal, 29% of ETP students (2 of 7) reported that “they would revisit course materials in the future.” This percentage increased to 69% (9 of 13) after the course renewal, but the difference was not significant. In DC students, before revision of the course all but one of the students (89%) said they were “likely to revisit the course materials.” This percentage increased to 100% after revisions. The increase was not statistically significant. Before the revisions more than half of the ETP and DC students said the course would change the way they engage in relevant activities (57% and 56%, respectively). After revision, this changed to 54% in ETP students and 83% in DC students. The changes were not statistically significant.
Four faculty and staff members were interviewed after the course revisions. Their feedback confirmed that there was a notable improvement in the capability of students who were taking the revised version of the course. One program coordinator noted a difference in the amount of support students required over the course of the term:
As far as feedback goes, at least anecdotally, I was asked to do less follow-up with their statistics. I do a weekly integration course where I kind of tie everything together and how it applies to clinic, etc. And I would often get students coming in and I would do a session where I’d just try to bring stats down to a real basic, use this, use that, bit more of a recipe approach. I’d often get asked “can we do this again,” and then they’d come in and ask me questions. I had less of that this year.
Table 4.
Frequency of students’ responses on selected objectives by level of confidence, cohort of students, and year
Columns 2–4: before revision (2015–2016); columns 5–7: after revision (2016–2017). Values are % (n/total).

| Course objectivesa | Not/slightly confident | Moderately confident | Very/extremely confident | Not/slightly confident | Moderately confident | Very/extremely confident | P valueb |
|---|---|---|---|---|---|---|---|
| Entry to practicec | | | | | | | |
| Distinguish between descriptive and inferential statistics | 8% (1/12) | 84% (10/12) | 8% (1/12) | 0% (0/10) | 40% (4/10) | 60% (6/10) | 0.02 |
| Classify data based on typologies of data | 42% (5/12) | 58% (7/12) | 0% (0/12) | 22% (2/9) | 33% (3/9) | 45% (4/9) | 0.048 |
| Describe approaches to random sampling and its strengths and challenges | 25% (3/12) | 67% (8/12) | 8% (1/12) | 30% (3/10) | 20% (2/10) | 50% (5/10) | 0.052 |
| Describe the structure associated with hypothesis testing | 67% (8/12) | 33% (4/12) | 0% (0/12) | 20% (2/10) | 40% (4/10) | 40% (4/10) | 0.02 |
| Explain basic epidemiological study types (both experimental and non-experimental) | 58% (7/12) | 25% (3/12) | 17% (2/12) | 10% (1/10) | 50% (5/10) | 40% (4/10) | 0.07 |
| Describe the importance of conducting a power calculation | 92% (11/12) | 8% (1/12) | 0% (0/12) | 50% (5/10) | 40% (4/10) | 10% (1/10) | 0.07 |
| Explain the application of common non-parametric tests | 50% (6/12) | 50% (6/12) | 0% (0/12) | 23% (3/13) | 39% (5/13) | 39% (5/13) | 0.05 |
| Explain potential sources of error in studies: bias and confounding | 72% (5/7) | 14% (1/7) | 14% (1/7) | 15% (2/13) | 62% (8/13) | 23% (3/13) | 0.06 |

aConfidence levels were investigated in 23 objectives. However, due to space limitations, only the objectives exhibiting significant (<0.05) or borderline (<0.1) p values are presented.
bFisher’s Exact Test
cP values were non-significant in all objectives in the degree-completion cohort.
Table 5.
Students’ feedback by question, cohort, and year

| Question | Entry to practice: before revisions (2015–2016) (n) | Entry to practice: after revisions (2016–2017) (n) | Degree completion: before revisions (2015–2016) (n) | Degree completion: after revisions (2016–2017) (n) |
|---|---|---|---|---|
| What did you like most about the course? | | | | |
| What did you like least about the course? | | | | |

aSPSS has been provided free of charge to UBC students since summer 2020.
Staff also felt there were fewer students needing follow-up with course content and expressing challenges about course difficulty after renewal of the course. Another staff member described the challenges particularly for degree completion students: “I remember that first term was heavily laden with statistics and students would really suffer…I remember degree completion students that were contemplating dropping the course because it was so difficult, and I would tell them term 2 is going to be so much easier. If you can, hang in there, keep going.” She went on to say, “I really think there have been less concerns, less complaints [since the course was revised].” A program advisor said tutorials were beneficial for the students, and that overall she “heard less complaining and there was definitely less e-mails of ‘can you explain this to me.’”
Additionally, there was consensus across interviews that course changes also helped reduce student anxiety as the changes helped students manage their academic schedule and other concurrent academic demands. A staff member explained how students “would sometimes contact me and say I’m really struggling in this course. What should I do?” While this staff member did not deal with course content, she did assist students with external struggles and was therefore familiar with the many challenges that students face, particularly in third year, which she described as a tough year. “My perception is [course revision] was very successful reducing the intensity of the third year and the [intensity] of the course as well,” she noted. The instructor (BS) identified remarkable progress in the quality of assignment reports in the course:
When you start a program, the knowledge and the skills build up on each other and critical thinking is the abstract or summary of every knowledge and skill that you have learned to put together, to be able to think critically. So I think that if [the students] use the knowledge that they learn in this course and the skills in the next courses, even in the clinic, or the publications that they read, their critical thinking skills will improve a lot because this is a new thing that they learned in this course.
DISCUSSION
Oral epidemiology and associated statistics have historically been challenging subjects for students in UBC’s DHDP. Owing to significant demographic differences between UBC’s ETP and DC students in terms of their background knowledge, age, and practice experiences, results between these 2 student cohorts were not compared.
To the best of our knowledge, few studies have assessed the impact of course revisions in epidemiology and statistics courses, particularly in a dental education context.
After the course revisions, there was a significant reduction in the number of individual learning objectives that both ETP and DC students scored as very/extremely challenging to learn. In addition, students’ confidence that they had acquired the skill/knowledge required to meet the course objectives increased significantly among ETP students. Consistent practice using SPSS could be one reason for these outcomes. After the revision, students were also given additional time to absorb each “week’s” or “block’s” content first and were subsequently asked to participate in online group activities. To provide this time, one set of activities covering the learning objectives was designed to span 2 to 3 consecutive weeks, compared with only 1 week prior to the course revisions. Together, these revisions may explain the differences in the levels of challenge and confidence before and after the modification.
One way to help students develop their statistical reasoning and meet the “research use” course competencies is to incorporate active-learning strategies that allow students to supplement what they have heard and read about statistics by actually doing something, e.g., designing studies, collecting data, analysing their results, preparing written reports, and giving presentations.7,8 In addition, the Guidelines for Assessment and Instruction in Statistics Education recommend that teachers apply statistical software using real data on relevant topics to motivate students to think about statistical concepts.9 Therefore, the revised course integrated regular online exercises using SPSS, which provided students with frequent opportunities to apply their knowledge of statistical tests to different cases. Similarly, Gonzalez et al.10 integrated a web-based statistical learning tool (e-status) to improve dental student performance in statistics; they found that frequent exercises using this tool resulted in higher academic scores and greater student satisfaction. In another study, Basturk11 assessed the effectiveness of computer-assisted instruction (CAI) on students’ learning in a quasi-experimental study of 205 graduate-level students, comparing an introductory statistics course taught with lecture plus CAI (SPSS) against lecture-only methods. He concluded that students’ learning improved markedly when CAI supplemented the teaching method.11
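As an illustration of what such a hands-on block exercise might involve, the sketch below checks the standard t-test assumptions named in the revised block objectives (Table 1) and then chooses between Student’s and Welch’s t-test. It uses Python/SciPy rather than SPSS, the course’s actual tool, and the data and variable names are invented for demonstration.

```python
# Illustrative sketch of a "hands-on" exercise on t-test assumptions,
# using Python/SciPy in place of SPSS. Sample data are invented.
import numpy as np
from scipy.stats import shapiro, levene, ttest_ind

rng = np.random.default_rng(1)
group_a = rng.normal(loc=2.1, scale=0.5, size=25)  # e.g., plaque index, clinic A
group_b = rng.normal(loc=1.8, scale=0.5, size=25)  # e.g., plaque index, clinic B

# Assumption 1: approximate normality within each group (Shapiro-Wilk).
for name, g in [("A", group_a), ("B", group_b)]:
    stat, p = shapiro(g)
    print(f"Group {name} Shapiro-Wilk p = {p:.3f}")

# Assumption 2: homogeneity of variances (Levene's test).
_, p_levene = levene(group_a, group_b)
print(f"Levene's test p = {p_levene:.3f}")

# Choose the test variant based on the variance check:
# Student's t-test if variances look equal, Welch's t-test otherwise.
t, p = ttest_ind(group_a, group_b, equal_var=p_levene > 0.05)
print(f"t = {t:.2f}, p = {p:.3f}")
```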
Furthermore, student learning improvements could be due to the restructuring of student interactions in the renewed version of the course. Students were asked to complete the block activities individually, and then 2 consecutive online discussion areas (among team members and then the whole class) were provided. In 2013, Salmi12 argued that, in an online course, direct interaction between instructor and students is not as important as appropriately planning students’ activities and work on assignments; students in her study showed a positive attitude towards peer interaction.12 The design of the second course assignment may also have contributed: the instructor (BS) implemented a peer-assessment model to support learners in developing their roles as health care professionals and, in particular, their decision-making abilities, including taking responsibility for one’s own learning, providing constructive feedback, contributing to the learning of others, and reflecting on self and peer performance.
Continuous peer-assessment activities throughout the revised course provided an important mechanism for students to demonstrate acquisition of skills commensurate with those of a dental hygiene professional and helped close the gap between current and desired competencies.13 The second case analysis assignment in the revised course was developed with a peer-assessment focus. In this assignment, students become assessors within the context of participation in practice and become actively engaged in their own learning, in developing metacognitive skills, and in a dialogical, collaborative model of teaching and learning.14 Rayens and Ellis15 shared their experiences in creating an online statistics course at the University of Kentucky and described rubric-driven peer grading as a successful way to enhance student learning: “By using rubric-driven peer grading, we are able to gently force an encounter between the student and a rubric and, hence, more effectively communicate what was valued about the assignment.”15 Peer-assessment activities of this kind were incorporated into the revised course (Table 1).
A significant increase in the students’ satisfaction scores pertaining to the course workload was noted. In the revised course, students were provided with ample time to first absorb the block contents and then subsequently to participate in online group activities. In order to provide time for students, one set of activities was designed for 2 to 3 topic-related blocks offered over 2 to 3 consecutive weeks, compared with weekly activities offered prior to the course revisions. Time is an essential requirement for studying and learning. According to Karjalainen et al., “workload is appropriate when students are provided with enough time for completing learning tasks and learner capacity is taken into account. An overly packed schedule does not enable effective learning but results in student overload and superficial learning.”16
Finally, in the revised course, the instructor offered feedback to clarify ambiguities students expressed during team and class discussions and posted these notes promptly upon completion of the discussions for each set of block activities. Gibbs and Simpson17 identify feedback as central among the conditions under which assessment supports student learning. In addition, Krause et al.18 concluded that feedback is especially beneficial for students with little prior knowledge of statistics. The Guidelines for Assessment and Instruction in Statistics Education College Report likewise confirms the importance of feedback on students’ learning of statistics, stating: “Useful and timely feedback is essential for assessments to lead to learning.”9 The report does not, however, define “timely.”
The increase in the number of objectives in which students felt very/extremely confident was not significant in the DC cohort. Despite this finding, the literature demonstrates that dental hygienists who have earned a degree integrate their knowledge in this subject area into their professional practice more confidently. Research on the outcomes of dental hygiene baccalaureate education shows that learners’ critical thinking and research use skills are enhanced compared with the outcomes of diploma-level education for dental hygienists. Dental hygienists who first earned a diploma and then returned to university to earn their dental hygiene degree have commented on their strengthened evidence-based decision-making abilities as well as their proficiency in finding research and assessing its credibility and applicability to practice.19-21
Study limitations
One limitation was the timing of the study in the academic year. Access to students was limited at the end of the semester, and students’ availability and willingness to participate in a focus group discussion were low despite the incentives provided. In addition, comparing the impact of revisions on ETP and DC students was not possible because their levels of knowledge and confidence on statistical topics before the study timeframe were not accessible. Moreover, the pre- and post-revision grades were not comparable since the exam questions were not identical. The small sample size, as well as the low response rates, may also have introduced non-response bias; in other words, the students who were more challenged or felt less confident might not have responded. More studies with larger sample sizes are needed to evaluate the impact of revisions to statistics courses, particularly in dental education.
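As a rough illustration of the sample size question raised here, the sketch below estimates the per-group sample a future study would need to detect a difference of the magnitude observed in this study (e.g., 25% vs. 3% of responses in the very/extremely challenging category) with conventional power. The tool (statsmodels) and all design parameters are assumptions for demonstration, not part of the original study’s methods.

```python
# Illustrative sketch: required sample size per group to detect a difference
# between two proportions (25% vs. 3%) at alpha = 0.05 with 80% power.
# Uses statsmodels; these parameters are assumptions for demonstration.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

effect = proportion_effectsize(0.25, 0.03)   # Cohen's h for the two proportions
n_per_group = NormalIndPower().solve_power(
    effect_size=effect, alpha=0.05, power=0.80, alternative="two-sided"
)
print(f"Required sample size per group: {n_per_group:.0f}")
```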
CONCLUSION
For several years, students in the Faculty of Dentistry at UBC have found the dental hygiene program’s oral epidemiology and statistics course, and their ability to demonstrate the associated learning objectives, to be challenging. This study reported the impact of a course renewal on student learning. Course revisions included additional time to absorb learning, increased student interactivity through online discussions, more rubric-driven peer-assessment opportunities, timely feedback, and frequent applicable statistical exercises using SPSS. Surveys measuring the effect of these revisions on student learning assessed the level of challenge involved in demonstrating course objectives as well as the students’ confidence in their grasp of the subject. Students in the revised course expressed greater confidence in demonstrating course objectives and felt less challenged to learn the content. They also valued greater opportunities to engage in online discussions, to assess their peers, and to apply their knowledge. The delivery format of this course and the methodology used to assess its renewal may serve as a framework to inform the assessment of curricula in other health programs.
CONFLICT OF INTEREST
All authors declare no conflict of interest.
APPENDIX
Acknowledgments
This study was supported by a Scholarship of Teaching and Learning (SoTL) grant offered by the UBC Institute for the Scholarship of Teaching and Learning and UBC’s Centre for Teaching, Learning, and Technology (CTLT). The authors would like to acknowledge Firas Moosvi’s assistance in graph development. Also, the authors would like to acknowledge Heike Kilian and Siobhan Ryan, UBC Faculty of Dentistry program assistants, for their assistance with data collection.
Footnotes
CDHA Research Agenda category: capacity building of the profession
References
- 1. Canadian Dental Hygienists Association (CDHA). Canadian competencies for baccalaureate dental hygiene programs. Ottawa: CDHA; 2015 [cited 2019 May 21]. Available from: https://files.cdha.ca/profession/CCBDHP_report.pdf
- 2. National Dental Hygiene Certification Board (NDHCB). Blueprint for the national dental hygiene certification examination. Ottawa: NDHCB; 2016 [cited 2019 May 21]. Available from: https://www.ndhcb.ca/preparing-for-the-exam
- 3. Parashar D. Challenges in mathematics and statistics teaching underpinned by student–lecturer expectations. Eur J Sci Math Educ. 2014;2(4):202–219. Available from: https://files.eric.ed.gov/fulltext/EJ1107657.pdf
- 4. Sowey E, Petocz P. A panorama of statistics: perspectives, puzzles and paradoxes in statistics. Oxford, UK: John Wiley & Sons; 2017.
- 5. Biggs J. Aligning teaching for constructing learning [White paper]. Heslington, UK: The Higher Education Academy; 2003. Available from: https://www.heacademy.ac.uk/sites/default/files/resources/id477_aligning_teaching_for_constructing_learning.pdf
- 6. Falchikov N. Involving students in assessment. Psychol Learn Teach. 2004;3(2):102–108.
- 7. Chance BL. Components of statistical thinking and implications for instruction and assessment. J Stat Educ. 2002;10(3).
- 8. Smith G. Learning statistics by doing statistics. J Stat Educ. 1998;6(3).
- 9. GAISE College Report ASA Revision Committee. Guidelines for assessment and instruction in statistics education: college report 2016. Alexandria, VA: American Statistical Association; 2016 [cited 2019 April 8]. Available from: http://www.amstat.org/education/gaise
- 10. Gonzalez JA, Jover L, Cobo E, Munoz P. A web-based learning tool improves student performance in statistics: a randomized masked trial. Comp Educ. 2010;55(2):704–713.
- 11. Basturk R. The effectiveness of computer-assisted instruction in teaching introductory statistics. Educ Tech Soc. 2005;8(2):170–178.
- 12. Salmi L. Student experiences on interaction in an online learning environment as part of a blended learning implementation: what is essential? In: Nunes MB, McPherson M, eds. Proceedings of the IADIS International Conference e-Learning 2013, Prague, Czech Republic, 22–26 July 2013:356–60.
- 13. McCracken J, Cho S, Sharif A, et al. Principled assessment strategy design for online courses and program. The Electronic Journal of e-Learning. 2012;10(3):107–119.
- 14. Boud D, Falchikov N. Aligning assessment with long-term learning. Assess Eval High Educ. 2006;31(4):399–413.
- 15. Rayens W, Ellis A. Creating a student-centered learning environment online. J Stat Educ. 2018;26(2):92–102.
- 16. Karjalainen A, Alha K, Jutila S. Give me time to think: determining student workload in higher education. Oulu: Oulu University Press; 2006.
- 17. Gibbs G, Simpson C. Conditions under which assessment supports students’ learning. Learn Teach Higher Educ. 2005;1:3–31.
- 18. Krause U-M, Stark R, Mandl H. The effects of cooperative learning and feedback on e-learning in statistics. Learn Instr. 2009;19(2):158–170.
- 19. Kanji Z, Sunell S, Boschma B, et al. Outcomes of dental hygiene baccalaureate degree education in Canada. J Dent Educ. 2011;75(3):310–320.
- 20. Sunell S, McFarlane RD, Biggar HC. Differences between diploma and baccalaureate dental hygiene education in British Columbia: a qualitative perspective. Int J Dent Hyg. 2016;15:236–248.
- 21. Kanji Z, Laronde DM. Motivating influences and ability-based outcomes of dental hygiene baccalaureate education in Canada. Int J Dent Hyg. 2018;16(3):329–339.