Journal of Graduate Medical Education. 2020 Jun;12(3):303–311. doi: 10.4300/JGME-D-19-00493.1

Next Steps in the Implementation of Learning Analytics in Medical Education: Consensus From an International Cohort of Medical Educators

Brent Thoma, Eric Warm, Stanley J Hamstra, Rodrigo Cavalcanti, Martin Pusic, Tim Shaw, Amol Verma, Jason R Frank, Karen E Hauer
PMCID: PMC7301933  PMID: 32595850

Abstract

Background

With the implementation of competency-based assessment systems, education programs are collecting increasing amounts of data about medical learners. However, learning analytics are rarely employed to use these data to improve medical education.

Objective

We identified outstanding issues that are limiting the effective adoption of learning analytics in medical education.

Methods

Participants at an international summit on learning analytics in medical education generated key questions that need to be addressed to move the field forward. Small groups formulated questions related to data stewardship, learner perspectives, and program perspectives. Three investigators conducted an inductive qualitative content analysis on the participant questions, coding the data by consensus and organizing them into themes. One investigator used the themes to formulate representative questions that were refined by the other investigators.

Results

Sixty-seven participants from 6 countries submitted 195 questions. From them, we identified 3 major themes: implementation challenges (related to changing current practices to collect data and utilize learning analytics); data (related to data collection, security, governance, access, and analysis); and outcomes (related to the use of learning analytics for assessing learners and faculty as well as evaluating programs and systems). We present the representative questions and their implications.

Conclusions

Our analysis highlights themes regarding implementation, data management, and outcomes related to the use of learning analytics in medical education. These results can be used as a framework to guide stakeholder education, research, and policy development that delineates the benefits and challenges of using learning analytics in medical education.


What was known and gap

Medical education programs are collecting increasing amounts of data about medical learners because of the implementation of competency-based assessment systems. These data have the potential to provide a more holistic view of each learner's progress than traditional assessments, but learning analytics are rarely employed to use these data.

What is new

A summit on learning analytics engaged participants in discussions about questions that need to be addressed to move the field forward.

Limitations

Participants were a self-selected group, and were more likely to be engaged and interested in learning analytics than other medical educators.

Bottom line

Issues related to implementation, data management, and outcomes may limit the adoption of learning analytics in medical education.

Introduction

The availability of increasing amounts of data about learners and their performance presents new challenges and opportunities in medical education around the world.1 In North America, the transition toward competency-based medical education (CBME) by the Accreditation Council for Graduate Medical Education (ACGME) Milestones and the Royal College of Physicians and Surgeons of Canada's Competence by Design is leading to an increase in the amount and diversity of trainee performance data.2–5 A programmatic approach to assessment in CBME supports the use of this assessment data of, and for, learning.6 These data have the potential to provide a more holistic view of each learner's progress than traditional assessments; facilitate individualized teaching, coaching, and assessment; inform remediation planning; predict future performance; establish learning trajectories for various competencies; contribute to faculty development and program evaluation; and potentially tie educational assessments to patient care outcomes.7 However, sophisticated techniques to analyze and display data for these purposes have not been widely applied in medical education to date.8

The field of analytics involves the collection and analysis of data, often through statistical modeling, to develop actionable insights.9 Analytics support a variety of decision-making activities across many fields, including business and sports.8,10 The use of learning analytics, a subtype of analytics that interprets educational data to describe, characterize, support, and predict the behaviors of learners in higher education, has recently proliferated.8,10 A lack of awareness and application of learning analytics methodologies has been cited as a barrier preventing the widespread use of these techniques in medical education.7

We sought to characterize barriers to the use of learning analytics techniques in medical education by identifying the questions of educators interested in this field. We anticipate that the elucidation of these questions will better characterize current gaps in knowledge and policy that need to be addressed to potentiate the effective use of learning analytics.

Methods

The Summit on Learning Analytics in Medical Education (the Summit), hosted during the 2017 Royal College of Physicians and Surgeons of Canada's International Conference on Residency Education (ICRE), served as a venue to explore this study question with an international sample of interested educators.

Participants and Setting

Attendees of the Summit at ICRE were recruited to participate. The Summit was a 2-day (October 18–19, 2017) preconference event with required preregistration, organized by the authors of this article and supported by the Royal College of Physicians and Surgeons of Canada. The target audience was medical educators and medical education scholars with an interest in learning analytics in medical education. Workshop sessions were presented on the use of learning analytics at various learner and program levels, potential sources of data for learning analytics, challenges in using learning analytics to facilitate learning, promises and pitfalls in the use of learning analytics, the use of clinical outcomes for assessing learning performance, the use of dashboards to provide formative and summative feedback, privacy and data legacy issues, and key questions in learning analytics.

Data Collection

Data collection occurred in the afternoon of the second day of the Summit during a 1-hour session focused on key questions in learning analytics that happened just prior to the summary and closing events. The research purpose of this session was clearly explained to the participants, and consent was presumed based on their voluntary participation.

Learning analytics was operationally defined at the Summit as the interpretation of educational data to describe, characterize, support, and predict the behaviors of learners.8,10 Prior to the Summit, we identified 3 broad dimensions related to learning analytics (learner perspectives, program perspectives, and data stewardship) through a review of key literature7,8,11,12 and discussion. At the beginning of the Summit session, one author (B.T.) described each dimension to all participants using the discussion prompts outlined in Table 1. We then requested that the Summit participants divide themselves into 3 groups of relatively equal size to discuss these topics in relation to learning analytics. Paper copies of the topic statement and guiding questions were available to each group.

Table 1.

Discussion Prompts Used to Facilitate Conversation During Key Questions in Learning Analytics Session

Topic: Learner Perspectives
Discussion statement: Learning analytics are intended to help learners.
Prompting questions: What don't you know about how to use learning analytics? What do you need to know more about to use learning analytics effectively? What questions surrounding learning analytics need further study?

Topic: Program Perspectives
Discussion statement: Educators, tutors, coaches, and program directors will use learning analytics to guide their teaching and assessment.
Prompting questions: What don't you know about how to use learning analytics? What do you need to know more about to use learning analytics effectively? What questions surrounding learning analytics need further study?

Topic: Data Stewardship
Discussion statement: There are concerns about data stewardship in terms of privacy, legacy, and access.
Prompting questions: What don't you know about data stewardship for learning analytics? What do you need to know about data stewardship to use learning analytics effectively? What questions surrounding the data stewardship of learning analytics need further study?

Each session lasted approximately 15 minutes and was repeated a second time (6 discussions in total). Participants self-selected which 2 of the 3 discussion groups to attend based on their interests. The goals of the small group sessions were to (1) facilitate conversation surrounding key questions related to the topic and the use of learning analytics in medical education, and (2) capture these key questions for further analysis. Three discussion groups were facilitated by faculty with expertise on the topics: learner perspectives (K.E.H.), program perspectives (S.H.), and data stewardship (B.T.).

The format of the small group discussions was designed to optimize the number of questions submitted and was based on the first 4 steps of the nominal group technique (introduction, silent generation of ideas, sharing ideas, and group discussion).13,14 To ensure that participants' initial thoughts were captured, the facilitators instructed them to submit key questions related to their topic as soon as they arrived in the room. Participants then engaged in small group discussion until they were paused by their facilitator and given time to submit additional questions. Participants submitted questions anonymously from their devices using Poll Everywhere (San Francisco, CA) audience response software. They could choose not to participate in the question generation exercise, and submissions were not tracked by the submitter.

This research received an exemption from the Institutional Review Board of the American Institutes for Research.

Qualitative Analysis

Following the Summit, we collated the submitted participant questions on a Microsoft Excel spreadsheet and conducted an inductive content analysis.15 When multiple participant questions were entered as part of a single submission, we separated them into individual questions.

Three investigators (B.T., E.W., K.E.H.) independently reviewed all questions, generated potential codes, and met 4 times over 5 months to discuss and combine key ideas into a codebook through consensus. Once the codebook framework was established, one author (B.T.) recoded all data while 2 others (E.W., K.E.H.) each coded half of the data independently. The 3 reviewers then met again to discuss discrepancies; when possible, these discrepancies were resolved through consensus of the disagreeing raters. When there was disagreement between the 2 coding investigators, the third (E.W. or K.E.H.) adjudicated. When necessary, questions were classified under 2 codes. These 3 investigators reviewed the coded data to group the findings into larger themes and subthemes. Finally, one author (B.T.) drafted representative questions within each subtheme with the intention of accurately representing the essential constructs contained within the raw data. These questions were reviewed and revised by the 2 other investigators (E.W. and K.E.H.) to ensure that no important ideas were missed. The full authorship team, all of whom participated in the Summit, then reviewed results for clarity, cohesiveness, and completeness.
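To make the mechanics of this process concrete, the sketch below shows one way the bookkeeping could be handled in code: splitting multi-question submissions into individual items and flagging items on which 2 coders disagree for consensus discussion. It is only an illustration under assumed data; the column names, topics, and codebook labels are hypothetical and do not reproduce the study's actual spreadsheet or codebook.

```python
import pandas as pd

# Hypothetical export of anonymous submissions: one row per submission,
# where a single submission may contain several questions.
submissions = pd.DataFrame({
    "topic": ["Data Stewardship", "Learner Perspectives"],
    "text": [
        "Who owns the data? How long must it be kept?",
        "How can dashboards promote metacognition?",
    ],
})

# Split multi-question submissions into individual questions (one per row).
questions = (
    submissions
    .assign(question=submissions["text"].str.split(r"(?<=\?)\s+", regex=True))
    .explode("question")
    .drop(columns="text")
    .reset_index(drop=True)
)
questions["item_id"] = questions.index + 1

# Independent code assignments from 2 raters (hypothetical codebook labels).
codes = pd.DataFrame({
    "item_id": [1, 2, 3],
    "rater_A": ["governance", "governance", "learner support"],
    "rater_B": ["governance", "security", "learner support"],
})

# Items on which the raters disagree are listed for consensus discussion,
# with a third rater adjudicating when consensus cannot be reached.
merged = questions.merge(codes, on="item_id")
discrepancies = merged[merged["rater_A"] != merged["rater_B"]]
print(discrepancies[["item_id", "question", "rater_A", "rater_B"]])
```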

The 3 reviewers considered reflexivity in their work. They included the program director of a large internal medicine residency program with a background in quality improvement (E.W.), the dean of assessment of a medical school (K.E.H.), and a junior faculty member with training in medical education (B.T.). Throughout the coding process, they challenged one another regarding the applicability of the questions from the dataset in different settings and discussed their own experiences and perspectives.

Results

Participants

A total of 67 participants (39 male, 28 female) attended the ICRE Learning Analytics Summit from 6 countries, including Canada (43), the United States (14), and Australia/Europe (10). The primary affiliation was listed as an academic institution (eg, a university) for 39, a national medical organization (eg, Royal College of Physicians and Surgeons of Canada) for 17, a clinical institution (eg, a hospital) for 7, and a corporation for one. Three participants did not provide an affiliation. Most of the Summit participants attended the “Key Questions in Learning Analytics” session, but exact attendance was not recorded.

Question Submission

During the 6 small group sessions, participants submitted 195 questions. Three questions were deleted because they lacked sufficient information to code. The final dataset comprised 192 participant questions: 72 from the learner perspectives groups, 76 from the program perspectives groups, and 44 from the data stewardship groups.

Qualitative Content Analysis

Table 2 outlines the themes resulting from the content analysis with representative questions listed under each theme. The analysis revealed 3 main themes (implementation challenges, data, and outcomes) and 18 subthemes. During coding, 25 items received 2 codes, and 9 items were coded as a theme without a subtheme: 1 as implementation challenges, 3 as data, and 5 as outcomes. Minor modifications to the themes, subthemes, and representative questions resulted from feedback from the authorship team, but no new questions or codes were added following their review.

Table 2.

Themes, Subthemes, and Representative Questions From Content Analysis of Participant Questions

Theme 1: Implementation Challenges (92 items)
Subtheme (No. of Items Submitted) Representative Questions
Learner support/development (56)
  • What are learners' expectations for learning analytics?

  • How can learning analytics be used to provide feedback?

  • What do learners need to make sense of learning analytics data?

  • Who will prepare learners to make use of learning analytics data?

  • What are the best practices for leveraging coaching along with learning analytics?

Change process (16)
  • How can we get learner buy-in for the use of learning analytics?

  • What are the change management implications for the adoption of learning analytics?

  • Can learning analytics be used by smaller programs?

  • How should learning analytics be used to build trust with stakeholders?

  • What role should learners play in the change process?

  • How can the needs and desires of all stakeholders be balanced?

Faculty development (14)
  • What faculty development programs will need to be provided to ensure that learning analytics are used effectively?

  • Who within programs needs to be fluent with the use of learning analytics?

  • How can learning analytics data be used to support faculty development?

  • What are the best practices for faculty coaching learners to use learning analytics data?

Resources (5)
  • Where should resources be allocated to develop a system of learning analytics?

  • What technical skills are required to effectively utilize learning analytics?

Theme 2: Data (94 items)
Subtheme (No. of Items Submitted) Representative Questions
Security (26)
  • What security precautions should be taken to protect learning analytics data?

  • Do security precautions for aggregate or deidentified learning analytics data differ from identifiable data?

  • Where should learning analytics data be stored?

  • How can innovation be facilitated while ensuring the security of learning analytics data?

  • Can aggregate or deidentified data be shared?

  • Can data ever really be deidentified?

  • How can individual privacy be balanced with the social benefit of sharing data?

  • What learning analytics data can be shared with external institutions?

  • How can learners be reassured that their data will be secure?

Governance (16)
  • Who owns learning analytics data?

  • What laws apply to the governance of learning analytics data?

  • How should learners be consulted regarding the use of their learning analytics data?

  • How will access to data for other purposes (eg, research) be overseen?

  • What are best practices for data stewardship with educational data?

  • How long does data need to be maintained?

Analysis (14)
  • What are best practices for collating and analyzing data?

  • How will we determine what information is critical and what information is noise?

  • How should learning analytics data be translated into actionable knowledge?

  • Can analytic systems be developed that predict a learner's need for intervention?

  • How can learning analytics data be used to start deeper conversations?

  • How can qualitative data be used along with quantitative learning analytics data?

Access (13)
  • What norm-referenced data should be shared with learners?

  • Should learners have access to all information collected about them?

  • Who should have access to a learner's data?

  • Who should control access to a learner's data?

Validity (9)
  • How do we ensure that learning analytics data represent the desired constructs?

  • How do we ensure that learning analytics data are being used to answer the correct questions?

  • What factors determine the internal and external validity of learning analytics?

  • How can decisions reached using learning analytics be validated?

  • How can we prevent “gaming” of a learning analytics assessment system?

Presentation/visualization/format (7)
  • What should a user interface look like for reviewing learning analytics data?

  • How can learning analytics data be presented to learners in a way that promotes metacognition?

  • How will learning analytics data need to be presented differently to various groups (eg, learners, educators, coaches, program directors)?

  • What are best practices in data presentation?

Collection (6)
  • What kind of data should be collected?

  • From what sources should learning analytics data be collected?

Theme 3: Outcomes (59 items)
Subtheme (No. of Items Submitted) Representative Questions
Learner assessment (23)
  • How will learners benefit from learning analytics?

  • How can learning analytics data be used to detect struggling learners early?

  • How many successful observations are required to determine that a learner is competent in an entrustable professional activity?

  • What are the best practices for the use of learning analytics by competency committees to inform promotion decisions?

  • What learning analytics data will facilitate learning?

Purpose/impact (12)
  • How will learning analytics add value to education?

  • What is the purpose(s) of learning analytics?

  • How will learning analytics help learners?

Program evaluation (7)
  • How can learning analytics be used for quality improvement within a program?

  • Should standards regarding learning analytics be incorporated into the program accreditation process?

  • How can the impact of learning analytics on a program be evaluated?

Faculty assessment (4)
  • How can trainee feedback on learning analytics data be used to assess faculty?

Systems evaluation (4)
  • How will learner data be used to assess institutional performance?

  • How will learning analytics impact patient outcomes?

  • How can we link learning analytics with patient process and outcomes data?

Additional consequences (4)
  • How will the use of learning analytics impact patient care?

  • How will the use of learning analytics impact trainee interactions with patients, staff, and supervisors?

  • How will the use of learning analytics impact supervisor evaluations?

Theme 1–Implementation Challenges:

This theme was featured prominently in the analysis, suggesting that implementation challenges may be as much of a barrier to the use of learning analytics as technical challenges. The subthemes indicate that the use of learning analytics is also a complex change management problem. Participants raised the issues of gaining buy-in, building trust, providing support, and overcoming resistance.

Theme 2–Data:

This theme raised questions regarding all aspects of learner data, including its collection, security, governance, and access. Subthemes addressed the analytic techniques used, how the data should be presented to various parties, and the validity of evidence for using learning analytics to make decisions regarding trainees. These challenges highlight the technical capabilities and data policies that must be addressed at individual sites.

Theme 3–Outcomes:

This theme outlined issues regarding how learning analytics could quantify the outcomes of learners and faculty and inform the evaluation of programs and systems. Subthemes regarding the purpose of learning analytics and its potential effects on educational and clinical outcomes suggest that work is needed to better define the objectives of learning analytics, which may help inform why they are, or are not, being used.

Discussion

This qualitative analysis of data collected from an international cohort of medical educators identified 3 major themes pertaining to the application of learning analytics in medical education. The scope of the subthemes and numerous representative questions underscores the confusion regarding learning analytics that existed in an interested group of stakeholders, even at the end of a Summit on this topic. Addressing these issues is likely necessary for learning analytics to be appropriately and effectively implemented in medical education.

Across the 3 identified themes, there was a strong focus on needing to better understand how learning analytics might impact learning and learners. Within higher education, a learning analytics cycle has been described that underscores the importance of “closing the loop” with learners and ensuring that the data collected about them are fed back to them.16 This process presents data as actionable analytics for the purpose of more rapid and efficient educational intervention.17 It could include displaying their learning data (relative to peers, historical cohorts, or predetermined performance standards), having faculty initiate personal contact when these data indicate that learners are struggling, or even demonstrating how the information collected resulted in changes for future cohorts.16,18 Subthemes in this study that related to how learning analytics will change assessment are similar to calls for transparency in the use of assessment data appearing in the higher education literature.12,16 Overall, our results suggest that further study of the impact of learning analytics on medical learners is needed and underscore the need for learner involvement in the local development and adoption of learning analytics initiatives.
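As a minimal sketch of what “closing the loop” could look like computationally, the code below compares a learner's mean score with a peer cohort and suggests faculty outreach when the learner falls below a chosen percentile. The scores, the 15th-percentile threshold, and the function names are illustrative assumptions, not standards drawn from the Summit or the literature; any such threshold would need local validation and, consistent with the themes above, learner involvement in its selection.

```python
from statistics import mean

def percentile_rank(value: float, cohort: list[float]) -> float:
    """Percentage of cohort scores at or below the given value."""
    at_or_below = sum(1 for score in cohort if score <= value)
    return 100 * at_or_below / len(cohort)

def flag_for_outreach(learner_scores: list[float],
                      cohort_scores: list[float],
                      threshold_percentile: float = 15.0) -> dict:
    """Summarize a learner's standing relative to peers and flag if struggling.

    The threshold is an illustrative assumption; each program would need to
    choose and justify its own standard.
    """
    learner_mean = mean(learner_scores)
    rank = percentile_rank(learner_mean, cohort_scores)
    return {
        "learner_mean": round(learner_mean, 2),
        "cohort_percentile": round(rank, 1),
        "suggest_faculty_contact": rank < threshold_percentile,
    }

# Example: entrustment ratings on a 1-5 scale for one learner and their peers.
learner = [2.0, 2.5, 2.0, 3.0]
cohort = [3.5, 4.0, 3.0, 2.5, 3.8, 4.2, 3.2, 2.8, 3.9, 3.4]
print(flag_for_outreach(learner, cohort))
```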

Implementation challenges were identified as a primary theme. Beyond the need for supporting learners and faculty to adapt to an environment in which more data are generated about and for them, the change process itself was a concern. This recognition is prescient because the effective use of learning analytics can alter how data are used and analyzed by the organization, faculty members, learners, and programs in undetermined ways. For implementation to be successful, significant work must be undertaken to engage the stakeholders, communicate effectively, provide accessible ways to use the data, and integrate the implementation in ways that are closely tied to institutional priorities.19

After data collection begins, it will be important to determine how change management processes and implementation plans can be used to optimize the use of learning analytics. Within the education literature, the RAPID Outcome Mapping Approach (ROMA) is a promising example.19 Developed for complex institutional contexts such as health care, ROMA serves as a holistic 7-step framework to understand and develop strategies for a challenge like learning analytics implementation.19 Given the prominence of implementation challenges within our analysis, local use of learning analytics will likely benefit from consideration of the literature addressing leadership and implementation science to overcome these barriers.20–22

Themes concerning the security and governance of learner data were more numerous than those related to its analysis, highlighting the prominence of these concerns within this group of stakeholders.7 While the digitization of information has made learning analytics possible, it also introduces the possibility of data breaches, which have the potential to expose data and create significant consequences for learners.12,23 Close attention to security and governance is essential to ensure that key stakeholders trust the system with their data.12 Fortunately, these challenges have already been addressed in other fields. The Society for Learning Analytics Research describes various analysis techniques that protect learner data.11 Best practices for the maintenance and protection of patient data have been published and could inform the stewardship of learner data.24,25 An important question within the medical education sphere is who should have access to various types of learner data. Program directors, rotation supervisors, competence committees, academic advisers, and other educational leaders may require access to some aspects of learner data, but local policies are needed to determine who should have access to what type of data, when, and for what purpose. Clearly and transparently addressing security and governance concerns is likely to facilitate the implementation of learning analytics at the local level.
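As one concrete illustration of the technical safeguards implied by the security and governance subthemes, the sketch below replaces a direct learner identifier with a keyed hash and drops other identifying fields before a record is shared. The field names and keying approach are hypothetical assumptions, and, as participants themselves asked, pseudonymization of this kind does not by itself guarantee that data can never be reidentified.

```python
import hmac
import hashlib

# Secret key held by the data steward; never shared with data recipients.
# (Hypothetical; in practice this would come from a secrets manager.)
STEWARD_KEY = b"replace-with-a-long-random-secret"

def pseudonymize(learner_id: str) -> str:
    """Replace a learner identifier with a stable keyed hash (pseudonym)."""
    return hmac.new(STEWARD_KEY, learner_id.encode("utf-8"),
                    hashlib.sha256).hexdigest()[:16]

def prepare_for_sharing(record: dict) -> dict:
    """Drop direct identifiers and keep only the fields needed for analysis."""
    return {
        "learner": pseudonymize(record["learner_id"]),
        "rotation": record["rotation"],
        "epa_rating": record["epa_rating"],
    }

record = {
    "learner_id": "jdoe@university.example",
    "name": "Jane Doe",           # direct identifier, dropped before sharing
    "rotation": "Internal Medicine",
    "epa_rating": 3,
}
print(prepare_for_sharing(record))
```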

The outcomes theme focused on assessment and evaluation and suggested that learning analytics can facilitate programmatic assessment and support learners.16,26 Whereas traditional assessment occurred within discrete courses or clinical rotations, learning analytics make it possible to perform sophisticated analyses of a learner's developmental trajectory over time and across programs.27–29 In addition, programs can use learning analytics data predictively to identify risks for poor performance and intervene to change a learner's trajectory.7,30,31 Learner assessment and program evaluation in medical education draw on models and theories that are well described in the literature.32–34 However, the use of more sophisticated techniques for the analysis and visualization of learner data for assessment and evaluation is still relatively novel11,18,35 and can support the use of these theories in practice in a targeted way. From the individual learner perspective, learning analytics offers insights that contrast effective and ineffective learning behaviors.10 The opportunity to consider learning analytics in the context of learning sciences literature suggests opportunities to promote adaptive approaches to learning that support conceptual understanding and long-term retention.36 With local learning analytics implementation, it will be important to clearly articulate the purpose and specific outcomes desired while also studying their implementation to demonstrate that these goals are being achieved without unanticipated negative consequences.
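To illustrate what a simple trajectory analysis might look like, the sketch below fits a linear trend to a learner's milestone ratings over time and projects whether the learner is on track to reach a target level by a target month. The data, target, and linear model are illustrative assumptions; real developmental trajectories are unlikely to be linear and would require locally validated models.

```python
import numpy as np

def on_track(months: list[float], ratings: list[float],
             target_rating: float, target_month: float) -> dict:
    """Fit a linear trend to milestone ratings and project forward.

    A deliberately simple model for illustration; it assumes roughly linear
    growth, which real developmental trajectories need not follow.
    """
    slope, intercept = np.polyfit(months, ratings, deg=1)
    projected = slope * target_month + intercept
    return {
        "slope_per_month": round(float(slope), 3),
        "projected_rating": round(float(projected), 2),
        "on_track": bool(projected >= target_rating),
    }

# Example: milestone ratings (1-5 scale) observed at 3-month intervals,
# with a target of level 4 by month 24 of residency.
observed_months = [3, 6, 9, 12]
observed_ratings = [1.5, 1.8, 2.0, 2.3]
print(on_track(observed_months, observed_ratings,
               target_rating=4.0, target_month=24))
```

In this hypothetical example the projected rating at month 24 falls short of the target, which is the kind of signal a program might use to trigger early review rather than as a definitive judgment about the learner.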

This study has limitations. Participants were a self-selected group, and were more likely to be engaged and interested in learning analytics than other medical educators. While this is advantageous in identifying important questions, some important subgroups such as learners may have different perspectives and were underrepresented in the group. We cannot confirm how many of the Summit participants submitted questions or how many questions were submitted by each attendee. Although the question submission process was anonymous, it is possible that participants had additional questions that they chose not to submit. More questions may have been submitted had all participants been able to attend all 3 sessions; however, time was not available for a third small group discussion. The positioning of the session at the end of the Summit ensured that the participants had recent exposure to current issues in medical education learning analytics, but the information presented could have influenced their perspectives on the topic. Data were collected in October 2017; thus, it is possible that educators' sentiments have changed. Finally, we did not have a mechanism to prioritize the questions or verify our results with the participants following the conference.

Conclusions

Our analysis determined that issues related to implementation, data management, and outcomes may limit the adoption of learning analytics in medical education. These results may provide educators with a framework to address these critical issues broadly and in their own context through stakeholder education, research, and policy development.

Footnotes

Funding: The authors report no external funding source for this study.

Conflict of interest: The authors declare they have no competing interests.

The authors would like to thank the Royal College of Physicians and Surgeons of Canada for hosting the Learning Analytics Summit at the 2017 International Conference on Residency Education.

References

1. Association of Faculties of Medicine in Canada. The Future of Medical Education in Canada (FMEC): A Collective Vision for MD Education. https://cou.ca/wp-content/uploads/2010/01/COU-Future-of-Medical-Education-in-Canada-A-Collective-Vision.pdf Accessed April 20, 2020.
2. Choe JH, Knight CL, Stiling R, Corning K, Lock K, Steinberg KP. Shortening the miles to the milestones: connecting EPA-based evaluations to ACGME milestone reports for internal medicine residency programs. Acad Med. 2016;91(7):943–950. doi: 10.1097/ACM.0000000000001161.
3. Englander R, Frank JR, Carraccio C, Sherbino J, Ross S, Snell L. Toward a shared language for competency-based medical education. Med Teach. 2017;39(6):582–587. doi: 10.1080/0142159X.2017.1315066.
4. Smirnova A, Sebok-Syer SS, Chahine S, Kalet AL, Tamblyn R, Lombarts KMJMH, et al. Defining and adopting clinical performance measures in graduate medical education: where are we now and where are we going? Acad Med. 2019;94(5):671–677. doi: 10.1097/ACM.0000000000002620.
5. Accreditation Council for Graduate Medical Education. Milestones National Report. 2018. https://www.acgme.org/What-We-Do/Accreditation/Milestones/Resources Accessed April 20, 2020.
6. van der Vleuten CPM, Schuwirth LWT, Driessen EW, Dijkstra J, Tigelaar D, Baartman LK, et al. A model for programmatic assessment fit for purpose. Med Teach. 2012;34(3):205–214. doi: 10.3109/0142159X.2012.652239.
7. Ellaway RH, Pusic MV, Galbraith RM, Cameron T. Developing the role of big data and analytics in health professional education. Med Teach. 2014;36(3):216–222. doi: 10.3109/0142159X.2014.874553.
8. Chan T, Sebok-Syer S, Thoma B, Wise A, Sherbino J, Pusic M. Learning analytics in medical education assessment: the past, the present, and the future. AEM Educ Train. 2018;2(2):178–187. doi: 10.1002/aet2.10087.
9. Cooper A. What is analytics? Definition and essential characteristics. CETIS Anal Ser. 2012;1(5):1–10.
10. Van Barneveld A, Arnold KE, Campbell JP. Analytics in higher education: establishing a common language. Educ Learn Initiat. 2012;1:1–11. doi: 10.1111/j.1468-2273.2009.00438.x.
11. Lang C, Siemens G, Wise A, Gasevic D. Handbook of Learning Analytics: First Edition. Society for Learning Analytics Research; 2017.
12. Pardo A, Siemens G. Ethical and privacy principles for learning analytics. Br J Educ Technol. 2014;45(3):438–450. doi: 10.1111/bjet.12152.
13. Potter M, Gordon S, Hamer P. The Nominal Group Technique: a useful consensus methodology in physiotherapy research. New Zeal J Physiother. 2004;32(3):126–130.
14. Humphrey-Murto S, Varpio L, Gonsalves C, Wood TJ. Using consensus group methods such as Delphi and Nominal Group in medical education research. Med Teach. 2017;39(1):14–19. doi: 10.1080/0142159X.2017.1245856.
15. Elo S, Kyngäs H. The qualitative content analysis process. J Adv Nurs. 2008;62(1):107–115. doi: 10.1111/j.1365-2648.2007.04569.x.
16. Clow D. The learning analytics cycle: closing the loop effectively. The Open University. http://oro.open.ac.uk/34330/1/LAK12-DougClow-personalcopy.pdf Accessed April 20, 2020.
17. Elias T. Learning Analytics: Definitions, Processes and Potential. 2011. https://pdfs.semanticscholar.org/732e/452659685fe3950b0e515a28ce89d9c5592a.pdf Accessed April 20, 2020.
18. Boscardin C, Fergus KB, Hellevig B, Hauer KE. Twelve tips to promote successful development of a learner performance dashboard within a medical education program. Med Teach. 2018;40(8):855–861. doi: 10.1080/0142159X.2017.1396306.
19. Ferguson R, Macfadyen LP, Clow D, Tynan B, Alexander S, Dawson S. Setting learning analytics in context: overcoming the barriers to large-scale adoption. J Learn Anal. 2014;1(3):120–144. doi: 10.18608/jla.2014.13.7.
20. Avolio BJ, Walumbwa FO, Weber TJ. Leadership: current theories, research, and future directions. Annu Rev Psychol. 2009;60(1):421–449. doi: 10.1146/annurev.psych.60.110707.163621.
21. Kotter J. Leading Change. Boston, MA: Harvard Business School Press; 1996.
22. Fisher ES, Shortell SM, Savitz LA. Implementation science: a potential catalyst for delivery system reform. JAMA. 2016;315(4):339–340. doi: 10.1001/jama.2015.17949.
23. Prinsloo P, Slade S. Student privacy self-management: implications for learning analytics. The Open University. http://oro.open.ac.uk/42395/ Accessed April 20, 2020.
24. Rosenbaum S. Data governance and stewardship: designing data stewardship entities and advancing data access. Health Serv Res. 2010;45(5 pt 2):1442–1455. doi: 10.1111/j.1475-6773.2010.01140.x.
25. Cavoukian A. Privacy by Design: The 7 Foundational Principles. https://iab.org/wp-content/IAB-uploads/2011/03/fred_carter.pdf Accessed April 20, 2020.
26. van der Vleuten CPM, Schuwirth LWT, Driessen EW, Govaerts MJB. Twelve tips for programmatic assessment. Med Teach. 2015;37(7):641–646. doi: 10.3109/0142159X.2014.973388.
27. Van Loon KA, Driessen EW, Teunissen PW, Scheele F. Experiences with EPAs, potential benefits and pitfalls. Med Teach. 2014;36(8):698–702. doi: 10.3109/0142159X.2014.909588.
28. Sztajn P, Confrey J, Wilson PH, Edgington C. Learning trajectory based instruction: toward a theory of teaching. Educ Res. 2012;41(5):147–156. doi: 10.3102/0013189X12442801.
29. Thoma B, Bandi V, Carey R, Mondal D, Woods R, Martin L, et al. Developing a dashboard to meet competence committee needs: a design-based research project. Can Med Educ J. 2020;11(1):e16–e34. doi: 10.36834/cmej.68903.
30. Green ML, Aagaard EM, Caverzagie KJ, Chick DA, Holmboe E, Kane G, et al. Charting the road to competence: developmental milestones for internal medicine residency training. J Grad Med Educ. 2009;1(1):5–20. doi: 10.4300/01.01.0003.
31. Holmboe ES, Yamazaki K, Nasca TJ, Hamstra SJ. Using longitudinal milestones data and learning analytics to facilitate the professional development of residents. Acad Med. 2019;95(1):97–103. doi: 10.1097/acm.0000000000002899.
32. Pangaro L, ten Cate O. Frameworks for learner assessment in medicine: AMEE Guide No. 78. Med Teach. 2013;35(6):e1197–e1210. doi: 10.3109/0142159X.2013.788789.
33. Frye AW, Hemmer PA. Program evaluation models and related theories: AMEE Guide No. 67. Med Teach. 2012;34(5):288–299. doi: 10.3109/0142159X.2012.668637.
34. Cook DA. Twelve tips for evaluating educational programs. Med Teach. 2010;32(4):296–301. doi: 10.3109/01421590903480121.
35. Chen C, Härdle W, Unwin A. Handbook of Data Visualization. Berlin, Germany: Springer Science & Business Media; 2008.
36. Ferguson R. Learning analytics: drivers, developments and challenges. Int J Technol Enhanc Learn. 2012;4(5–6):304–317. doi: 10.1504/IJTEL.2012.051816.
