Abstract
Introduction
Personalized learning has been shown to improve learning outcomes. The aim of this pilot was to test a tool embedded in the Canvas© learning platform designed to improve personalization and to collect data on whether the level of personalization it provides improved learning outcomes.
Methods
A nursing pathophysiology and pharmacology course was redesigned using the Canvas© Mastery Paths feature to provide personalized learning content to students. Post-class quiz grades triggered Canvas© to conditionally release content review materials to students who did poorly on the quiz and to provide a second quiz to test the efficacy of the supplementary review materials. Data from the redesigned course were compared to the previous semester's course data.
Results
Use of Canvas© Mastery Paths to conditionally release supplementary material to poorly performing students resulted in a significant improvement in course grades, and certain activity scores correlated with improved course and ATI© quiz means.
Conclusion
The degree of personalization of course content available with Mastery Paths has the potential to positively impact learning outcomes.
Keywords: personalization, nursing education, Canvas© mastery, learning management system, learning outcome
Introduction
Most classes consist of students with diverse skills and experience, making it important to personalize the delivery of course content in ways designed to meet individual student needs. Personalization of content and content delivery is not a new idea (Cronbach, 1957). In face-to-face classes, instructors often receive visual or verbal cues from students indicating their level of understanding, and good instructors adapt their content delivery based on these cues. However, it can be difficult to meet the needs of every individual student rather than only the median student, and students needing more or different content often miss learning opportunities when a course does not include personalization. Technological advances enable increasingly sophisticated methods of personalizing content delivery that can be leveraged to overcome these challenges.
For this pilot study, a nursing pathophysiology and pharmacology course was redesigned using a tool embedded within Canvas©, the learning management system (LMS) used at the author's university to manage course content organization, content distribution, testing, and grading. The aim of this pilot was to test this tool (named Mastery Paths) and collect data to see if the level of personalization it provides improved learning outcomes. Grades on all activities in the course and measures of course behavior (page clicks, assignment submissions, etc.) as measured by Canvas© were collected. Data regarding the types and use of study materials, to help improve personalization in future classes, were also collected.
Brief Review of Topic
Personalized learning (PL) is a method of instruction in which “learning objectives, instructional approaches, and instructional content (and its sequencing) may all vary based on learner needs” (U.S. Department of Education, 2017). Studies indicate that students seem to prefer PL teaching methods in all modes of course delivery, including face-to-face (Clark & Kaw, 2020; Hinkle et al., 2020), and that PL is useful in improving learning outcomes (Geng et al., 2021). Other studies demonstrate that PL improved learning outcomes such as course completion or grades (Kellman & Krasne, 2018; Liu et al., 2017; Presti & Sanko, 2019). While PL can also refer to a method of content delivery that uses technology to create this more tailored learning experience, adaptive learning (AL) and personalized adaptive learning (PAL) are terms more specific to that method.
In AL/PAL, data about a student's performance on various activities are used to adjust the content delivered to the student (Fariani et al., 2022). How sophisticated this adjustment can be depends mainly on the complexity of the technology used (Mikic et al., 2022). Many AL platforms use complex algorithms, including machine learning algorithms, to enable this personalization; however, these can be expensive and/or difficult for instructors to implement (Mikic et al., 2022; Van Schoors et al., 2022). Simpler tools built into LMSs such as Canvas© may rely on algorithms or decision trees that do not allow as complex a personalization of the learning experience, but they are increasingly included in the LMSs that many colleges and universities already use and do allow some degree of personalization (Heng et al., 2021; Kraleva et al., 2019).
Canvas© uses a rule-based algorithm in its Mastery Paths or “conditional release” feature, allowing an instructor to integrate some personalization based on quiz or assignment scores (Corris, 2020). Instructors can use this feature to create one or more paths through content, with grades on specific quizzes or assignments triggering the release of content based on a predetermined range of scores earned (Paradiso & Chen, 2021). Content that can be released includes information contained within a page in Canvas©, a quiz, an assignment, or a discussion board. The paths created may be as simple as one or two steps of additional content or may guide a student through multiple steps spanning an entire course.
Analytics of student behavior are also being included in many LMS platforms, including Canvas©. Canvas© calls this feature “New Analytics” and uses student behaviors in the LMS along with assessment scores to help instructors identify students struggling in a course. The student behavior measured in Canvas is called Weekly Online Activity. This measure displays student “page views,” or how many times a student clicks on a particular page within the course. It also measures “participations” when a user performs a task within a course, such as taking or submitting a quiz, submitting an assignment, or submitting a discussion post. An instructor can then view this activity and compare it to the course average, individual student assignments, or quiz grades to identify at-risk students (Mcmillan, 2021).
Methods
The goal of this study was to evaluate the Canvas© Mastery Paths feature to see if it could feasibly work as a method to add some personalization to coursework and to collect data to see if this led to improved learning outcomes. In the 2-month-long preparatory phase of this study, an undergraduate nursing pathophysiology and pharmacology course (part II) was redesigned using Canvas© Mastery Paths as outlined in Figure 1 and detailed more specifically below. The course was designed such that each week, an online quiz (the Post Class Quiz) was due three days after students had a face-to-face class. If students earned less than 90% on this Post Class Quiz, supplemental content and a second quiz (the Second Chance Quiz) became available, due by the end of the week. Students could earn points toward a grade item called Demonstrate Knowledge from their scores on the Post Class Quiz or the Second Chance Quiz. Scores of 90% or higher on the Post Class Quiz earned full points toward the Demonstrate Knowledge grade, while lower scores earned increasingly fewer points until the score fell below 60%, at which point no points were earned toward the Demonstrate Knowledge grade. Students could improve their Demonstrate Knowledge grade by taking the Second Chance Quiz. The intent of the conditional release of content was to provide supplemental learning materials to students who did not demonstrate mastery of content based on the Post Class Quiz grade, as well as a second opportunity, via the Second Chance Quiz, to earn up to full points on the Demonstrate Knowledge grade. The supplementary material was delivered using the Mastery Paths feature in Canvas©, which allows faculty to deliver conditionally released content in the form of “Pages” based on, in this case, quiz grades.
Graded assignments or discussion boards can also be used to trigger conditionally released content in Canvas© but were not used in this course.
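The conditional-release and scoring rule described above can be sketched in Python. This is a hypothetical illustration, not the course's actual grading configuration: the 90% release threshold and 60% floor come from the course design, but the linear point scaling between them, the point total, and all names are assumptions.

```python
# Hypothetical sketch of the Demonstrate Knowledge scoring rule. The 90% and
# 60% cutoffs come from the course design; the linear scaling between them,
# the 10-point total, and the names are assumptions for illustration only.

def demonstrate_knowledge_points(post_class_pct, second_chance_pct=None, max_points=10):
    """Return points earned toward the Demonstrate Knowledge grade item."""
    def points_for(pct):
        if pct >= 90:
            return max_points          # mastery demonstrated: full points
        if pct < 60:
            return 0                   # below the floor: no points
        # assumed linear scaling between the 60% floor and the 90% cutoff
        return round(max_points * (pct - 60) / 30, 2)

    earned = points_for(post_class_pct)
    # A Post Class Quiz score below 90% releases the review Page and the
    # Second Chance Quiz, which can raise the grade up to full points.
    if post_class_pct < 90 and second_chance_pct is not None:
        earned = max(earned, points_for(second_chance_pct))
    return earned
```

For example, a student scoring 50% on the Post Class Quiz would initially earn nothing, but a 92% on the released Second Chance Quiz would restore full points.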
Figure 1.
Pilot of course redesign: personalization using Mastery Paths.
The next phase of the study was deployment of the course redesign into the Canvas© learning platform utilizing the Mastery Paths feature in the Spring 2022 semester. This was a relatively straightforward process once the course was designed, taking only a few weeks. The content review materials were delivered as a Page within Canvas©, since this is currently the only item that can be conditionally released in Canvas© based on quiz grades. “Pages” are a static content delivery method but can include text, video, and links to both internal course content and external websites. Pages were created for each module using the same layout each week for consistency. Each Page contained links to the course content for that module and essential study tips available to all students for the duration of the course. The remainder of the Page included resources available only to students who received the Page as part of the Mastery Paths conditional release of material. This content included brief instructor-designed videos reviewing physiology concepts important to that module's content, a curated selection of links to freely available online videos, links to websites with mnemonics or images potentially helpful to students, and links to other websites with relevant content, such as pages from the Centers for Disease Control and Prevention or the American Diabetes Association.
The data collection phase began with the start of the course and ran over a standard Spring semester. At the end of each Second Chance Quiz, students were asked to identify which resources they used to prepare for that quiz, including reading the textbooks, reviewing handouts/class notes, using student/instructor study guide materials, reviewing the content review page material (and specifics about what material was used, such as the videos versus text-based information), and “other” (with a text box provided for a narrative response). Data were analyzed at the conclusion of the course and compared to data from the previous fall semester's course.
To examine how adding the PL element described above affected learning outcomes, course grades from the newly designed course were compared to those from the previous semester's course; overall group course grade and exam grade means were compared. To examine Canvas©'s New Analytics feature, correlations were examined between course grades and New Analytics measures (course activity, missing assignments, and late assignments). Course activity is determined by “page views” (how many times a student clicks on a page within the course) and “participations” (how many times a student clicks on an assignment). Correlations were also examined among different learning outcome parameters (Post Class Quiz grades, Second Chance Quiz grades, etc.). Data regarding all materials students used to study were gathered using a multiple-choice question and a free-text question after every Second Chance Quiz.
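The pairwise relationships above were assessed with Pearson correlations. A minimal pure-Python sketch of that computation follows; the function name is ours, and in practice statistical software was presumably used.

```python
# Minimal sketch of the Pearson correlation coefficient used throughout the
# analysis (e.g., relating New Analytics activity measures to grades).
def pearson_r(x, y):
    """Pearson's r between two equal-length sequences of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))   # covariance term
    sx = sum((a - mx) ** 2 for a in x) ** 0.5              # sqrt of SS_x
    sy = sum((b - my) ** 2 for b in y) ** 0.5              # sqrt of SS_y
    return cov / (sx * sy)
```

Perfectly linearly related scores yield r = 1 (or −1 for an inverse relationship), while unrelated scores give r near 0.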
Analysis
Comparisons Between Fall 2021 and Spring 2022 Course Grades
There was a significant difference in course grade means between fall (no PL) and spring (with PL) based on a t-test (t(58) = −2.42, p = .018) (see Table 1).
Table 1.
Two-Tailed Paired Samples t-Test for the Difference Between Fall 2021 and Spring 2022 Course Grades.
| Fall 2021 (n = 57) M | Fall 2021 SD | Spring 2022 (n = 59) M | Spring 2022 SD | t | p | d |
|---|---|---|---|---|---|---|
| 86.73 | 4.31 | 88.93 | 4.56 | −2.42 | .018 | 0.32 |

Note. Degrees of freedom for the t-statistic = 58. d represents Cohen's d.
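As a quick consistency check on Table 1: for a paired-samples t-test, Cohen's d can be recovered from the t-statistic as d = |t| / √n, where n is the number of pairs. A two-line sketch (variable names are ours):

```python
import math

# Recover Cohen's d from a paired-samples t-statistic: d = |t| / sqrt(n).
t_stat, n_pairs = -2.42, 59  # values reported in Table 1 (df = 58 implies 59 pairs)
d = abs(t_stat) / math.sqrt(n_pairs)
print(round(d, 2))  # reproduces the reported d = 0.32
```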
Assignment Grade Correlations Spring 2022
A Pearson correlation analysis was conducted among learning outcomes listed below in Table 2. Cohen's standard was used to evaluate the strength of the relationships, where coefficients between 0.10 and 0.29 represent a small effect size, coefficients between 0.30 and 0.49 represent a moderate effect size, and coefficients above 0.50 indicate a large effect size (Cohen, 1988). The result of the correlations was examined using the Holm correction to adjust for multiple comparisons based on an alpha value of 0.05.
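The Holm correction referenced above is a step-down procedure: raw p-values are sorted, each is multiplied by its step-down factor, and monotonicity is enforced. A minimal sketch (function name is ours; statistical software presumably performed this in practice):

```python
# Holm step-down adjustment of raw p-values, as used for Table 2.
def holm_adjust(p_values):
    """Return Holm-adjusted p-values in the original order."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])  # ascending raw p
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, idx in enumerate(order):
        candidate = (m - rank) * p_values[idx]   # step-down multiplier
        running_max = max(running_max, candidate)  # enforce monotonicity
        adjusted[idx] = min(1.0, running_max)      # cap at 1.0
    return adjusted
```

For example, raw p-values of 0.001, 0.02, and 0.04 adjust to 0.003, 0.04, and 0.04; a raw p that is significant at alpha = 0.05 may no longer be after adjustment.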
Table 2.
Pearson Correlation Results Among PCQM, SCQM, DKM, ExamM, CourseM, and ATIM.
Combination | r | 95.00% CI | n | p |
---|---|---|---|---|
PCQM-SCQM | −.14 | [−0.38, 0.12] | 59 | .860 |
PCQM-ExamM | .47 | [0.24, 0.64] | 59 | .002 |
PCQM-CourseM | .66 | [0.48, 0.78] | 59 | < .001 |
PCQM-ATIM | .27 | [0.01, 0.49] | 59 | .246 |
SCQM-DKM | .20 | [−0.06, 0.43] | 59 | .642 |
SCQM-ExamM | −.28 | [−0.50, −0.03] | 59 | .246 |
SCQM-CourseM | −.28 | [−0.50, −0.02] | 59 | .246 |
SCQM-ATIM | .12 | [−0.14, 0.37] | 59 | .860 |
DKM-ExamM | .29 | [0.04, 0.51] | 59 | .210 |
DKM-CourseM | .46 | [0.24, 0.64] | 59 | .002 |
DKM-ATIM | .37 | [0.13, 0.57] | 59 | .036 |
ExamM-CourseM | .95 | [0.91, 0.97] | 59 | < .001 |
ExamM-ATIM | .09 | [−0.17, 0.34] | 59 | .860 |
CourseM-ATIM | .20 | [−0.06, 0.43] | 59 | .642 |
Note: Data from all 59 students were included in this analysis, but not all students contributed equally to the SCQM data as this was only generated for students when they had a low PCQ score. This is why the DK metric was created.
PCQM = Post-Class Quiz Mean; SCQM = Second Chance Quiz Mean; ExamM = Exam Mean; CourseM = Course Mean; ATIM = ATI Pharmacology Exam Mean; DKM = Demonstrate Knowledge Mean.
p-values adjusted using the Holm correction. Displayed are Pearson correlation coefficients and 95% confidence intervals for multiple pairwise data category comparisons.
Significant correlations were observed between the Post Class Quiz means (PCQM) and both the exam means (ExamM; r = .47, p = .002, 95% CI [0.24, 0.64]) and the numeric course grade means (CourseM; r = .66, p < .001, 95% CI [0.48, 0.78]). This suggests that as Post Class Quiz scores increase, exam scores and course grades tend to increase.
Significant correlations were also observed between the Demonstrate Knowledge grade means (DKM) and both CourseM (r = .46, p = .002, 95% CI [0.24, 0.64]) and the Assessment Technologies Institute (ATI©) standardized quiz means (ATIM; r = .37, p = .036, 95% CI [0.13, 0.57]). This suggests that as Demonstrate Knowledge grades increase, both mean course grades and ATI© quiz scores tend to increase. Therefore, using PL to focus on improving the Demonstrate Knowledge grades makes sense in the context of the goal of improving final evaluation endpoints.
A significant positive correlation was observed between ExamM and CourseM (r = .95, p < .001, 95% CI [0.91, 0.97]). This suggests that as exam scores increase, course grades tend to increase. No other significant correlations were found.
New Analytics Correlations Spring 2022
No correlations were found between the Canvas© New Analytics “Weekly Activities” measures (page views or participations) or the late or missing assignment measures and any assignment scores (PCQM, SCQM, DKM, ExamM, CourseM, or ATIM). This suggests that none of those measures is a good reflection of student performance in this context.
Self-Report of Materials Used by Students for Studying
Every Second Chance Quiz contained questions about the resources students used to study. This quiz was available only to those who scored less than 90% on the Post Class Quiz. On average, approximately 30% of the class took the Second Chance Quiz (high of 85% and a low of 26.67%), and all students took at least one Second Chance Quiz. The most common resource students reported using to study for the quiz was class handouts/notes, averaging 96.63% of quiz takers. The least commonly reported resources were another peer's notes (10.25% of quiz takers) and the textbook chapters (15.85% of quiz takers). A moderate proportion of students reported viewing the instructor-designed brief videos (38.75%), and roughly half stated they used the content review page (49.43%) (see Table 3).
Table 3.
Number (Percentage) of Second Chance Quiz Takers Reporting Use of Each Learning Resource.
| Quiz | Class handouts | Video | Peer's notes | Book chapter | Nothing | Content review page | Other | No answer |
|---|---|---|---|---|---|---|---|---|
| Quiz 1 | 37 (97%) | 26 (68%) | 4 (11%) | 5 (13%) | 1 (3%) | N/A | 11 (29%) | |
| Quiz 2 | 45 (100%) | 23 (51%) | 3 (7%) | 9 (20%) | 0 | 34 (76%) | 12 (27%) | |
| Quiz 3 | 51 (96%) | 28 (53%) | 7 (13%) | 12 (23%) | 0 | 28 (53%) | 14 (26%) | |
| Quiz 4 | 16 (100%) | 9 (56%) | 2 (13%) | 6 (38%) | 0 | 10 (63%) | 1 (6%) | |
| Quiz 5 | 25 (100%) | 8 (32%) | 4 (16%) | 1 (4%) | 0 | 11 (44%) | 2 (8%) | |
| Quiz 6 | 30 (97%) | 8 (26%) | 3 (10%) | 4 (13%) | 1 (3%) | 19 (61%) | 4 (13%) | 1 (3%) |
| Quiz 7 | 40 (93%) | 3 (7%) | 2 (5%) | 4 (9%) | 0 | 12 (28%) | 11 (26%) | 1 (2%) |
| Quiz 8 | 38 (90%) | 7 (17%) | 3 (7%) | 3 (7%) | 2 (5%) | 9 (21%) | 5 (12%) | |
| Average % | 96.63% | 38.75% | 10.25% | 15.85% | 1.375% | 49.43% | 18.34% | 0.71% |
Conclusion/Importance to Nursing Profession
We have shown in this pilot study that a course redesign adding the learning personalization available in the Canvas© learning platform to a face-to-face pathophysiology and pharmacology nursing course improved course grade means. This matters to nursing because the development of more efficient, comprehensive, and targeted teaching methods will improve remediation of specific areas of student knowledge deficit. Encouraging nursing students to engage in independent learning is a crucial aspect of their education. This involves employing effective teaching strategies that enhance students' capacity to learn and to acquire the competencies and skills needed for competent nursing practice. Such strategies should also foster nursing students' ability to independently seek out and apply evidence-based practices. Going forward, greater individual fine-tuning of the content provided to students is needed; this will produce richer data sets that will hopefully be even more specifically predictive of student success.
Footnotes
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding: The author(s) received no financial support for the research, authorship, and/or publication of this article.
ORCID iD: Julie F. Hinkle https://orcid.org/0000-0001-9897-410X
References
- Clark R. M., Kaw A. (2020). Adaptive learning in a numerical methods course for engineers: Evaluation in blended and flipped classrooms. Computer Applications in Engineering Education, 28(1), 62–79. 10.1002/cae.22175
- Cohen J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Routledge. 10.4324/9780203771587
- Corris R. (2020). Personalizing learning in Canvas. https://community.canvaslms.com/t5/K12-Canvas-Users/Personalizing-Learning-in-Canvas-1-pdf/ba-p/262532
- Cronbach L. J. (1957). The two disciplines of scientific psychology. American Psychologist, 12(11), 671–684. 10.1037/h0043943
- Fariani R. I., Junus K., Santoso H. B. (2022). A systematic literature review on personalised learning in the higher education context. Technology, Knowledge and Learning, 28(2), 449–476. 10.1007/s10758-022-09628-4
- Geng Y., Huang P. S., Huang Y. M. (2021). Crowdsourcing in nursing education: A possibility of creating a personalized online learning environment for student nurses in the post-COVID era. Sustainability, 13(6), Article 3413. 10.3390/su13063413
- Heng L. E., Voon W. P., Jalil N. A., Kwun C. L., Chieh T. C., Subri N. F. (2021). Personalization of learning content in learning management system. 2021 10th International Conference on Software and Computer Applications, Kuala Lumpur, Malaysia. https://doi-org.liblink.uncw.edu/10.1145/3457784.3457819
- Hinkle J. F., Jones C. A., Saccomano S. (2020). Pilot of an adaptive learning platform in a graduate nursing education pathophysiology course. Journal of Nursing Education, 59(6), 327–330. 10.3928/01484834-20200520-05
- Kellman P. J., Krasne S. (2018). Accelerating expertise: Perceptual and adaptive learning technology in medical learning. Medical Teacher, 40(8), 797–802. 10.1080/0142159x.2018.1484897
- Kraleva R., Sabani M., Kralev V. (2019). An analysis of some learning management systems. International Journal on Advanced Science, Engineering and Information Technology, 9(4), 1190. 10.18517/ijaseit.9.4.9437
- Liu M., McKelroy E., Corliss S. B., Carrigan J. (2017). Investigating the effect of an adaptive learning intervention on students' learning. Educational Technology, Research and Development, 65(6), 1605–1625. 10.1007/s11423-017-9542-1
- Mcmillan E. H. (2021). Canvas Community: New Analytics. Retrieved March 8, 2023, from https://community.canvaslms.com/t5/New-Analytics-Users/Analytics-Page-Views-and-Participations/ta-p/262828
- Mikic V., Ilic M., Kopanja L., Vesin B. (2022). Personalisation methods in e-learning: A literature review. Computer Applications in Engineering Education, 30(6), 1931–1958. 10.1002/cae.22566
- Paradiso J., Chen B. (2021). Personalized learning design with Canvas MasteryPaths. In deNoyelles A. A., Bauer S., Wyatt S. (Eds.), Teaching online pedagogical repository. University of Central Florida Center for Distributed Learning. https://topr.online.ucf.edu/personalized-learning-design-with-canvas-masterypaths/
- Presti C. R., Sanko J. S. (2019). Adaptive quizzing improves end-of-program exit examination scores. Nurse Educator, 44(3), 151–153. 10.1097/nne.0000000000000566
- U.S. Department of Education, Office of Educational Technology. (2017). Reimagining the role of technology in education: 2017 National Education Technology Plan update.
- Van Schoors R., Elen J., Raes A., Vanbecelaere S., Depaepe F. (2022). The charm or chasm of digital personalized learning in education: Teachers' reported use, perceptions and expectations. TechTrends, 67(2), 315–330. 10.1007/s11528-022-00802-0