Heliyon. 2024 Feb 9;10(4):e25767. doi: 10.1016/j.heliyon.2024.e25767

A review of learning analytics opportunities and challenges for K-12 education

Catherine Paolucci a, Sam Vancini a, Richard T Bex II a, Catherine Cavanaugh b, Christine Salama b, Zandra de Araujo b
PMCID: PMC10881331  PMID: 38390101

Abstract

Use of learning analytics to improve educational outcomes is a growing area of research. While learning analytics research has been more prevalent in higher education contexts, this study presents findings from a qualitative metasynthesis of 47 research publications related to opportunities and challenges relevant to learning analytics design, implementation, and research at the PK-12 level. Our findings indicate that, while many see the educational benefits of learning analytics (e.g., more equitable instruction, individualized learning, enhanced assessment for learning), others remain unconvinced, citing a lack of evidence of improved outcomes and raising concerns about persistent challenges and potentially harmful impacts (e.g., infringement on users’ privacy, misuse or misinterpretation of data by educators). We conclude by considering implications for mathematics education stemming from our analysis of learning analytics and posing questions to shape future research and key developments in mathematics education as learning analytics become more prevalent.

Keywords: Learning analytics, PK-12 education, Mathematics education, Equity

1. Introduction

Learning analytics includes continually advancing uses of data to understand and improve educational systems, policies, instruction, and learning. Varying learning analytics tools and systems enable education stakeholders to collect, analyze, and report data on learning behaviors, contexts, and outcomes [1], with the aim of better understanding and supporting student learning and transforming the way data are used at all levels of education to inform decision making, policy, and practice. While some researchers have established the potential for learning analytics to support more equitable opportunities for all students [2,3,4], others raise concerns about data-empowered accountability practices [5,6].

One area of consensus is that more research is needed to understand the most effective and responsible ways to use learning analytics, particularly in Pre-Kindergarten through grade 12 (PK-12) education. Here, we synthesize and discuss the opportunities and challenges identified across recent literature related to the use of learning analytics to support more effective and equitable education and instructional practices, with a particular focus on PK-12 education.

While higher education institutions were early adopters of learning analytics, PK-12 schools are increasingly using learning management systems and other digital technologies that support learning analytics. Although the technology is still not widely used in PK-12 settings, practical applications are emerging, particularly in dynamic assessment and individualized learning [7]. Many consider learning analytics a promising tool to understand, personalize, and optimize PK-12 learning and inform data-driven systems shaped by school administrators, district leaders, and policymakers [1]. However, careful consideration of both the opportunities and challenges related to issues such as feasibility, usability, bias, and privacy is critical to ensuring fair and equitable use at the PK-12 level [8,5,9,10].

While there have been several reviews of research on the nature and use of learning analytics, these primarily examine research on specific aspects or issues related to learning analytics in higher education, including dashboards, interventions, and research design [11,12,13,14]. They also include related areas of research such as educational data mining [15]. To date, we found no reviews focused on the use of learning analytics to improve PK-12 teaching and equitable access to rich, meaningful learning of the PK-12 curriculum, including critical foundational content (e.g., Algebra) and formative learning experiences for both affective and cognitive development.

Some higher education research has identified the potential for learning analytics to improve success in ways relevant to PK-12 education [16,17]. One example is identifying students at risk of failing or dropping out of mathematics courses [18]. This is a valuable focus for PK-12 mathematics, given that failed courses stall progress on learning concepts that provide a critical foundation for further studies of all STEM disciplines [19]. Another example is improving assessment of student thinking and engagement with problems and tasks rather than focusing on the final answer. Additionally, data mining has been used to understand mathematical behavior and thinking processes, identify factors affecting achievement, and provide automated assessment of written work [15]. These are all likely to offer equally valuable support for PK-12 mathematics teaching and learning.

Given the increased use of learning analytics in PK-12 education, it is important and timely to expand the focus of learning analytics research to explicitly examine the opportunities and challenges of using learning analytics to support effective and equitable PK-12 teaching and learning. This metasynthesis, conducted by a team of PK-12 researchers and practitioners, draws on current literature to identify opportunities and challenges with relevance to PK-12 education. We reviewed a collection of literature with two guiding questions.

  1) What opportunities do learning analytics offer that may have relevance to promoting more equitable and effective PK-12 teaching and learning?

  2) What challenges do learning analytics pose that may have relevance to promoting more equitable and effective PK-12 teaching and learning?

Researchers have distinguished between ‘learning analytics’, which aim to improve academic outcomes, and ‘learner analytics’, which focus on student engagement and retention. We interpret learning analytics as having both objectives [20]. With this in mind, our resulting metasynthesis has three unique and important features pertinent to progressing the scholarly discussion of learning analytics. First, it examines both the opportunities and challenges of learning analytics, rather than focusing on one specific aspect. This is important given claims that existing evidence related to implementation of learning analytics is more focused on positive outcomes than potentially negative results [21].

Second, we conducted this metasynthesis with the systems, structures, and practices that shape PK-12 education in mind. As a result, we aim to identify and discuss findings from research across a variety of contexts with relevance for PK-12 education. This will help create a shared understanding of the opportunities and challenges associated with learning analytics and support development of a PK-12 learning analytics research agenda. Finally, our search included a focus on equity to ensure consideration of the opportunities and challenges in current literature around using learning analytics in ethical and inclusive ways.

2. PK-12 focus on equity

When discussing equity in the context of PK-12 education, we include systems, policies, and practices that ensure access to instructional materials, content, and learning experiences that give all students the opportunity for full participation in high-quality, grade-level appropriate learning activities and achievement of positive outcomes. Equitable systems, policies, and practices explicitly address and eliminate the potential for aspects of students’ identities and demographic characteristics (e.g., race, ethnicity, gender, socioeconomic status) to predict outcomes, including engagement, achievement, and dispositions. This extends to all levels of influence on PK-12 education (e.g., classroom, school, district, state, federal).

Additionally, organizations and policy documents have identified the critical role of mathematics education in their focus on equity, citing the foundational role of mathematical learning for all STEM disciplines and linking successful completion of Algebra 1 by 9th grade with high school and college graduation rates and high-paying careers [22,23,24,25]. Thus, while there are many aspects of PK-12 education central to equity, our synthesis of the opportunities and challenges of learning analytics relevant to PK-12 education includes a focus on mathematics, both as part of the search process and in our discussion of findings and implications.

3. Methods

We used a qualitative metasynthesis process to analyze, synthesize, and interpret findings in a deliberately selected body of recent learning analytics literature [26]. A metasynthesis aims to identify common themes or compare and contrast findings and perspectives to provide insights relevant to established research questions. It can also be used to identify gaps or areas of need for future research [27]. This made it ideal for our efforts to consider the relevance of existing learning analytics research, primarily conducted in higher education contexts, for multiple dimensions of PK-12 education (e.g., classroom teaching and learning, systemic structures, and policymaking).

We aimed to synthesize findings from emerging research (published since 2015) to identify opportunities and challenges of learning analytics relevant to PK-12 education. Additionally, in synthesizing issues relevant for PK-12 education, we sought to establish a foundation from which to identify future directions for research on learning analytics with a specific focus on equity and relatedly, mathematics education.

Adapting established metasynthesis processes from multiple sources and studies [28,29,27], we progressed through the following six sequential steps.

  1. Define research questions and relevant selection criteria

  2. Conduct a comprehensive search of recent and relevant literature

  3. Use established inclusion-exclusion criteria to select initial relevant studies

  4. Assess the quality of the resulting literature

  5. Use qualitative methods to analyze text and synthesize findings

  6. Present findings across studies to answer research questions

We used content analysis [30], including deductive coding and counting techniques, to synthesize research findings, discussions, and recommendations from the selected literature. Counting the presence of thematic codes across the selected literature enabled us to provide a concise representation of the prevalence of themes related to both opportunities and challenges associated with learning analytics.

3.1. Search, selection, and inclusion-exclusion criteria

Two authors searched for publications with a focus on learning analytics applications in teaching and learning settings. The first stage included a search in Google Scholar for publications since 2015 reporting on learning analytics applications that were relevant to PK-12 education and the secondary mathematics context. The year 2015 was selected to encompass a period of major development in learning analytics, including cloud data, AI, machine learning, and visualization.

The search terms were “mathematics” OR “equity” AND “learning science” OR “data science” OR “learning analytics” OR “data analytics.” Inclusion of equity within the search reflected the researchers’ commitment to viewing prominent, contemporary PK-12 developments through an equity lens. Similar searches were performed in education and social science databases, including JSTOR, Academic Search Premier, EBSCO host Education Source, Social Science Premier, IEEE, Wiley Online Library, OECD, and Springer Online Journals.

After an initial review of publication titles and abstracts, full publications were read. Bibliography backtracking of publications resulted in additional publications for review. The authors reviewed 54 publications and identified 47 publications that addressed learning analytics applications with relevance to the PK-12 context. These included empirical studies and research reviews in peer-reviewed journal articles, books, and reports from government or other research agencies. Exclusion criteria applied to the initial sample were: a focus on aspects of postsecondary education that are uncommon in secondary education (e.g., highly-specialized, fully online, asynchronous courses), a focus on pre-service teacher education, and a focus on a narrow learning format, such as games.

3.2. Methods for thematic analysis

Once the selection process was completed, we analyzed the selected literature using basic qualitative coding procedures [31]. Three authors reviewed a subset of publications and coded each publication's findings, considerations, and recommendations using both inductive and deductive coding. Initially, the authors used inductive coding to identify emerging themes related to specific opportunities and challenges relevant to use of learning analytics at the PK-12 level. These themes were then discussed among the full research team and another subset of the articles was reviewed by additional team members to test and further refine the themes to ensure appropriate representation and a sufficiently high level of interrater reliability (above 80%). Each researcher presented text excerpts to support their coding choices to ensure consistent interpretation and application of thematic codes.

The resulting themes were ‘value of learning analytics’, ‘tools for decision making’, ‘tools for instruction’, and ‘creation and utilization’. Table 1 includes descriptions of these themes, which were subsequently used for thematic coding of all selected literature.

Table 1. Initial themes with descriptions.

Value of learning analytics: Potential for learning analytics to enable substantial change in teaching, learning, and educational systems and policy.
Tools for decision making: Influence of learning analytics on decisions such as policymaking, funding, state- and district-level instructional materials adoptions, introduction or continuation of programs and systems, and instructional design.
Tools for instruction: Use of learning analytics to enhance teaching and learning, including providing formative feedback and engaging students in their own learning.
Creation and utilization: Insight, considerations, and concerns regarding how learning analytics tools are and should be created and utilized to ensure outcomes are efficient, effective, and ethical.

Deductive coding was also used to identify whether content within each theme addressed opportunities and/or challenges and whether it related to classroom practice or a more systemic issue. For each application of a thematic code, we added two additional codes. The first characterized the related content as either an opportunity or a challenge, and the second distinguished whether it was relevant at the classroom level, meaning primarily for teachers, students, and parents/guardians, or at a systemic level, meaning for school administrators, district leaders, or policymakers. In this case, opportunities were seen as ways in which learning analytics can potentially improve PK-12 teaching and learning. Challenges were seen as risks or potentially harmful impacts of learning analytics or barriers to use in PK-12 education.

Additionally, the authors used codes to identify any content that explicitly addressed equity and/or mathematics teaching and learning. For the purposes of coding, equity was considered to be ways in which learning analytics can support and/or undermine efforts to move toward more equitable education systems and classroom practices.

With a finalized set of codes, research team members each reviewed a subset of the literature. Codes were counted only once per publication, regardless of how often they were applied in the text. Thus, code counts reflect the number of publications in which a theme was present as an opportunity and/or challenge relevant at the classroom or systemic level. However, all relevant excerpts from a single publication were synthesized into the qualitative findings.

As a final note, while a significant body of literature focuses on learning analytics in higher education, adult education, and professional learning, we focus specifically on learning analytics as a tool for PK-12 education. Use of ‘classroom-level’ or ‘systemic-level’ reflects our interpretation of the literature through the lens of how learning analytics can impact or inform policies, processes, and practices in PK-12 education. Our discussion assumes that learning analytics tools are used to supplement and enhance, not replace, the interactions and relationships that are a critical part of PK-12 classroom instruction.

4. Findings

Each of the four themes that emerged across the 47 selected publications (see Table 1) included at least one finding or recommendation relevant at a systemic level (for policymakers, district leaders, school administrators) and at a classroom level (for teachers, students, parents/guardians). This distinction is made in the more detailed discussion of findings that follows.

As we identified subthemes within each of these areas, we found that all text coded as either ‘tools for decision making’ or ‘tools for instruction’ was also coded as ‘value of learning analytics’. To capture the relationships between these themes, we synthesized the findings relating to tools for decision making and instruction as part of the broader ‘value of learning analytics’ theme. Table 2 provides counts for each theme; parentheses indicate that these two themes and their code counts are subsets of ‘value of learning analytics’.

Table 2. Publication counts for opportunities and challenges for each theme by level (n = 47).

Theme                          Opportunities                   Challenges
                               Systemic  Classroom  Total*     Systemic  Classroom  Total*
Value of Learning Analytics       19        24        29          15        23        29
(Tools for Decision Making)       10        12        16           6         7        10
(Tools for Instruction)            8        28        28           8         7        21
Creation and Utilization          14        19        24          15        15        22
*Totals represent the number of publications coded for an opportunity or challenge at either level; because a publication may be coded at both the systemic and the classroom level, a total can be smaller than the sum of its two columns. For example, for ‘Value of Learning Analytics’ opportunities, 19 + 24 - 29 = 14 publications were coded at both levels.

While related, there was less overlap between the findings and recommendations in the ‘creation and utilization’ theme and those in the other themes. This theme included considerations, concerns, and recommendations related to creating and utilizing learning analytics in ways that either support the opportunities or contribute to the challenges undermining the potential value of learning analytics. Fig. 1 presents a visual conceptualization of the relationship between our resulting themes, which frames the discussion of the findings that follows.

Fig. 1. Visual conceptualization of relationship among themes.

Overall, eleven publications focused on opportunities related to equity and nine focused on challenges. It is notable that all content coded as an opportunity related to equity was also coded as an opportunity for either tools for decision making or tools for instruction. For example, the potential for learning analytics to help teachers individualize instruction to better meet the needs of all students rather than teaching to the ‘average’ student [2] was coded as both ‘equity’ and ‘tools for instruction’. However, this was not the case for challenges related to equity. Of the nine publications addressing challenges related to equity, five raised issues that fell outside the ‘tools for decision making’ and ‘tools for instruction’ themes.

Another important finding is the presence of both opportunities and challenges across all themes. While the overarching ‘value of learning analytics’ theme included the same number of publications with challenges and opportunities, the other themes included slightly more literature addressing opportunities than challenges. Additionally, 30 of the 47 publications contained both an opportunity and a challenge for learning analytics. This suggests persistent uncertainty surrounding learning analytics and a need for researchers to continue to examine and discuss both the opportunities and challenges of learning analytics, particularly for PK-12 education.

The following sections offer a synthesis of the qualitative findings related to the opportunities and challenges relevant for PK-12 education for each theme and level. Fig. 2 provides an overview of these themes and the subthemes that emerged within them.

Fig. 2. Summary of qualitative findings.

5. Opportunities for using learning analytics to improve PK-12 education

Many researchers, practitioners, and policymakers believe in the potential for learning analytics to improve learning, if used thoughtfully and ethically [32]. This section will discuss the general perceived and established value of learning analytics, including its potential to improve educational equity and provide valuable tools for decision making and instruction. Finally, it will discuss opportunities related to creation and utilization of learning analytics that can maximize their value. Each of these areas will address opportunities relevant at a systemic level (for school administrators, district leaders, and policymakers) and a classroom level (for teachers, students, and parents or guardians).

5.1. Value of learning analytics

A growing body of literature identifies ways in which learning analytics can substantially improve education at both the systemic and classroom levels. The following sections present the features identified as offering the greatest promise at each level, including those discussed in literature related to equity, decision making, and instruction.

5.2. Systemic level

Many believe learning analytics can transform educational structures, systems, and practices [8,33]. Its value is grounded in the premise that strategically collecting and analyzing behavioral data can change the way educational stakeholders understand learning [2]. It is considered a way to bring together the learning sciences and data analytics to create possibilities for meaningful applications of data generated by the growing prevalence of online learning environments, further accelerated by the COVID-19 pandemic [34].

Informing Policies, Structures, and Critical Decisions. At the systemic level, learning analytics can provide information on the effectiveness of district-wide instructional materials adoptions, including teacher usage and impact on student achievement [18]. Identifying classroom, school, or population trends can inform or aid in understanding the impact of educational policies, structures, and investments, including both positive outcomes and unexpected consequences [2,8,32,35]. Learning analytics can also offer tools to more efficiently analyze resource distribution, learner achievement, administration, and financial trends to inform decision making at the national, state, district or school level [21,9]. This includes analysis of international assessments, such as the Program for International Student Assessment (PISA), with rich data that can be used to inform policy and decision making [36].

Equity. Many believe learning analytics can play an important role in improving equity at all levels of education. This can range from providing data to raise awareness of inequities to identifying individuals who would benefit most from individualized supports. At the systemic level, data revealing differences in learning outcomes between groups based on race, gender, socioeconomic status, or other demographics can motivate and inform decisions on institutional change [4]. One important research-supported strategy for advancing equitable teaching and learning in PK-12 schools is early identification of students at risk of not achieving grade-level proficiencies or of leaving PK-12 classrooms entirely. Learning analytics can identify various risk factors or early warning signs for students requiring additional, individualized support, enabling academic advisors to develop more equitable, individualized strategies and structures for student success [2].

5.3. Classroom level

Many see the value of learning analytics at the classroom level in its potential to provide real-time, individualized feedback that can inform decision making, shape instruction, and engage students in their own learning. As a tool for teacher reflection and efforts to enhance or refine instructional practices, it can also support equitable practices such as personalizing learning strategies and experiences, recognizing factors leading to success and underachievement and adapting instruction accordingly, and identifying and supporting students at risk of failing. The following opportunities were identified for the value of learning analytics at the classroom level, including those in literature coded as equity, tools for decision making, and tools for instruction:

Immediate, Individualized Feedback. Some researchers argue that the greatest promise of learning analytics is the ability to provide critical data through a variety of collection and reporting tools [37]. Particularly when generated by computer-based assessments and digital learning platforms, learning analytics provides formative feedback that can help both students and instructors better monitor progress towards achieving learning objectives [38,39,40,36,4,10]. As such, the value of learning analytics is also in the timing of the feedback [9]. Real-time data on learner behaviors, particularly when combined with observed behaviors and achievement data, enable timely responses, assistance, or interventions to address students’ needs and support their success in ways not possible with the delayed reporting of other assessment results [8,10]. This offers a positive alternative to standardized assessment results, which are often not available until the following academic year, when the resulting data can no longer be used to support student learning.

Particularly when teachers are challenged by large class sizes, learning analytics can recognize a variety of learning and behavior patterns, allowing for distinct learner profiles and better individualized feedback [20,36,10]. Tools such as learning analytics dashboards provide critical information on learner activity, interactions, and engagement. This type of feedback is concrete and actionable, identifying class and individual trends that can be addressed by teachers, students, parents or guardians, and others who support student learning [2,35]. Additionally, learning analytics can lessen teacher workloads by offering automated feedback on student responses [15] and quickly gathering information on occurrence rates for student misconceptions [41].

Designing and Adapting Instruction. ‘Personalization’ is a leading keyword in research on data-driven education [42], and researchers credit learning analytics with helping teachers design and adapt instruction to address the individual needs of every student, instead of targeting the ‘average student’ [2,43]. Thus, learning analytics can support teachers' efforts to provide more equitable learning opportunities, including individualized supports where needed [35,10]. This may include addressing gaps in prior knowledge or identifying preferred learning modalities. For example, learning analytics from a digital mathematics learning tool showed that students of color were more likely to use the tool and that this usage was significantly correlated with higher performance on the end-of-year assessment [18].

Learning analytics also offers teachers valuable information to reflect on and refine their instructional materials and practices [7,36,35]. For example, it can help teachers decide whether parts of an exam or assignment were too easy or challenging for their students [9]. Additionally, learning analytics can raise teachers’ awareness of disparities in their teaching. Despite efforts from teachers to engage all students, many are unaware when they fall short [4]. For example, research has shown that teaching staff were more likely to support high-performing students [7]. Using learning analytics, these teachers became aware of this imbalance and made changes to address it and move toward more equitable school practices. Reflection using learning analytics data can help teachers consider their behaviors, become aware of bias, and identify future actions to support more equitable practices [4]. In fact, teachers who spent more time reflecting on and making sense of dashboard data were shown to incorporate more diverse teaching practices in their classrooms [43].

Engaging Students in Their Own Learning Process. Learning analytics supports both assessment of and assessment for learning by providing individual feedback directly to students in ways that can involve them in setting and monitoring their own learning goals [44]. Research has shown that using dashboards to help students understand their learning patterns can increase self-reflection and motivate them to change their strategies [40]. This promotes student agency by giving them tools to understand their own learning behaviors and encouraging them to reflect on their own learning practices [2].

Identifying Students at Risk of Failing. Finally, learning analytics can offer teachers a tool for identifying students who are at risk of poor classroom outcomes, including assignment or assessment failure [9]. Student participation and performance data provide automated alerts regarding low engagement or achievement [2,20,45]. Visualizations can improve accessibility for teachers, affording them earlier opportunities to catch students who may be having difficulty [21,35]. Dashboards displaying key performance indicators have been shown to successfully alert teachers when students are not on track to meet targeted learning goals [40].

5.4. Creation and utilization

The potential for learning analytics to improve educational policies, structures, practices, and outcomes is strongly dependent on strategic creation and effective utilization of data collection and reporting tools [37,45]. This section highlights ways in which careful consideration of issues related to creation and utilization can improve the effectiveness of learning analytics and support opportunities related to the value of learning analytics.

5.5. Systemic level

Stakeholder Input and Collaboration. Learning analytics can serve a range of education stakeholders and provide valuable information for education policy, systemic-level structures, and decision making. For this to become a reality, the design of learning analytics data collection and reporting tools must take into account who will use the information, what they will use it for, when and how data will be collected and utilized, and potential implications for stakeholders at all levels [46]. The value of learning analytics is strongly dependent on careful design that reflects stakeholder input and aligns closely with their goals [2]. Thus, partnerships between district leaders and application designers are essential to ensure data collection is purposeful and usable, and that it directly informs and supports equitable practices.

Additionally, global efforts to share insight from early adoption models and policy recommendations are essential to wider adoption, a stronger evidence base, and sustained usage [47,48]. To date, early adopters of learning analytics tools have had limited opportunities to share examples of good practice and lessons learned [7]. The complexities of building a culture of data-informed decision making require collective efforts to share insight and practices for others to adapt and build on across contexts [49].

5.6. Classroom level

Researchers offer a range of recommendations for creation and utilization of learning analytics tools to ensure they are effective and valuable at the classroom level for teachers, students, and parents or guardians.

Assessment for Learning. As discussed, learning analytics can provide teachers a wealth of real-time information on students’ learning. In addition to using this information to assess student learning, learning analytics can be most impactful when used for continuous, formative assessment to shape student learning [4]. This requires teachers and students to see learning analytics as a learning tool rather than an evaluation tool, with class-level and individualized data utilized to adapt instruction and construct personalized feedback that promotes learning [38]. Sharing this feedback with those who play a role in student learning outside of school (e.g., parents, guardians, tutors) can extend targeted support for learning beyond the classroom.

Engaging Students as Partners in Their Own Learning. The value of learning analytics related to increasing student agency is reliant on learning analytics being designed and utilized in ways that involve students in setting and monitoring their own goals [44,50]. Researchers recommend designing learning analytics dashboards to empower students to examine their own patterns of learning and links between effort or engagement and outcomes [39]. Similarly, for utilization, others recommend treating students as partners in the interpretation and use of their data and empowering them to draw on learning analytics to identify, pursue, and reflect on strategies to improve outcomes [50]. This means involving students (and their partners in learning outside of school) in designing and monitoring data-informed strategies to meet self-identified goals for productive engagement and achievement.

Co-Creation and User-Informed Design. The value of learning analytics also depends on its ability to meet the needs of education stakeholders. This requires partnerships with continuous communication and collaboration between developers and future users of learning analytics tools for both data collection and reporting [47,37]. Currently, there are notable gaps between the potential roles researchers have identified for learning analytics, the ways in which tools are being designed by developers, and the ways in which they are being utilized by education stakeholders. Researchers recommend that more attention and resources be devoted to helping teachers, schools, and even students articulate what they are seeking from learning analytics [7,51]. In fact, some propose involving students and parents in the design process to improve their understanding of the utility and value of real-time data and potentially increase informed consent, which is critical to data quality and representation [50].

To facilitate this, research supports the use of co-creation frameworks that include a series of touchpoints between developers and users. These interactions, focused on communication, usage, and continued support or service, can result in more sustained use, an expanded user base, and improved functionality [47].

Features to Optimize Usability. In addition to co-creation, researchers also recommend specific features to improve usability of learning analytics tools for teachers, students, and families. For example, learning analytics dashboards that were visually attractive and easy to understand were more likely to be perceived as useful and ultimately impact learning behaviors [40]. The same was true for learning analytics platforms designed to be adapted by teachers for use in diverse contexts and capable of shaping and assessing the impact of data-informed strategies for varying student groups [39,52].

It is also essential for tools to provide visualizations that make complex, multi-layered data accessible and easily understood by users with varying levels of statistical literacy. In fact, research has shown that when formative feedback is provided visually, students are more likely to understand what is required to improve their outcomes and achievement [20]. Data reporting tools must also take into account current research on teachers’ statistical literacy to maximize use and potential to improve teaching and learning [37].

6. Challenges of using learning analytics to improve PK-12 education

Growing excitement about the transformative potential of learning analytics for key aspects of education has also been met with concerns and skepticism. This section will discuss perceived and established challenges that undermine the potential value of learning analytics for PK-12 education at both the systemic and classroom levels, including risks to educational equity and limitations on use for decision making and instruction. It will also discuss issues related to the creation and utilization of learning analytics tools and the systems that contribute to these challenges, with considerations and recommendations for mitigating them, where possible.

6.1. Value of learning analytics

Some researchers and education stakeholders remain unconvinced about the value of learning analytics, citing limited empirical evidence of improved educational outcomes, uncertainty about the value of the resulting information, and in some cases, risks of perpetuating structural bias and causing harm to students, particularly historically marginalized populations.

6.2. Systemic level

Concerns and challenges related to the value of learning analytics at a systemic level focus on evidence of impact sought by policymakers, district leaders, and school administrators and questions about the nature of the provided information, its potential to undermine efforts to build more equitable education systems, and the generalizability of findings and applications.

Lack of Empirical Evidence. While learning analytics has been an emerging area of research, policy, and practice for decades, researchers continue to raise concerns about a persistent lack of compelling evidence related to the value of large-scale data for efforts by education organizations and leaders to improve teaching and learning [49]. In addition to insufficient evidence linking use of learning analytics to effective interventions that support student success [52], the existing evidence seems skewed toward positive outcomes, with minimal focus on negative results or impacts [21]. Furthermore, while there has been substantial focus on learning analytics integration and effectiveness in higher education, there has been very little research on learning analytics in PK-12 contexts [48].

These concerns are primarily based on the lack of long-term or large-scale studies on the impact of learning analytics on improvement and success of educational organizations [7,48]. Moving beyond a series of case studies requires investments of time and resources by education organizations and state and federal agencies and officials that support education research through policy and funding. These investments are unlikely without improved measures and methods for ensuring the quality, validity, and reliability of learning analytics tools and sufficiently rigorous research findings [7,48,53].

Implicit Assumptions. One element underlying well-documented concerns about the value of learning analytics in education is that implicit assumptions and defaults are often allowed to shape the construction and interpretation of learning analytics tools without being sufficiently challenged [8,20]. The true value of learning analytics tools and data will remain constrained as long as their design and interpretation are shaped by the beliefs and biases of their creators, algorithms, and users. This is particularly true given that tools are typically designed based on the perceptions of learning and pedagogy held by the technical experts creating them, which may not reflect the expertise of educators [42].

The tendency of learning analytics to focus on measurable constructs and assumptions about teaching and learning can promote limiting conceptions of achievement or even cause harm to individuals or student groups. For instance, learning analytics tools are often designed to measure success as the absence of failure [8], and data can identify patterns of use or lack of use without providing information needed to understand the reasons for such patterns [18]. As a result, the methods and indicators used to analyze the resulting data can be a stronger factor in the accuracy of learning analytics than dataset features such as size or representation [3].

Implicit assumptions and biases in both creation and interpretation of learning analytics can have negative implications for equity. Algorithms with biases risk harming individuals or student groups exhibiting behaviors or characteristics that can be misinterpreted or assigned unwarranted causality. The resulting false identification or unwarranted labeling can negatively impact their opportunities for success [54]. Additionally, the potential for decisions to be made based on flawed analysis can also cause harm to students and teachers. In fact, research has shown that less than half, and as little as 11%, of educators’ interpretations of data are accurate [45].

Consequences of Labeling. Perhaps the most serious critique of learning analytics relates to limitations in data collection and how these limitations can fuel systemic inequity when they do not provide a complete picture of the learners or explanations for learner data. For example, learner groups may be classified as “at-risk” without explanation of the variables resulting in this label [8]. Often, narrow indicators of success make students from disadvantaged backgrounds more likely to be identified as being at-risk of failure or dropping out of school, even when additional factors suggest otherwise. Not only can these results perpetuate systemic and instructor biases, sharing such categorizations or labels with students can be damaging to their identity as learners. It can also lead to academic profiling, which can reinforce perceptions of disadvantage and promote self-fulfilling prophecies [8]. For example, being labeled as “at-risk” can create a self-perception of being likely to fail. This can lead to reduced effort or increased anxiety, both of which can negatively impact academic success.

Privacy and Consent. While scientific datasets have historically been central to fields such as physics and astronomy, learning analytics includes social datasets with information on individual behaviors combined with personal characteristics and demographics [54]. Thus, privacy and consent are issues that pose serious challenges to the value of learning analytics [13]. Given the potential for biases built into learning analytics algorithms and the harm of false or inappropriate labeling, privacy is particularly critical.

When combined with the potential for sustained use of student data, privacy concerns require careful consideration of policies and practices regarding privacy and consent [2]. Privacy and student data protection have historically required extensive protocols, review, and approval by institutional review boards at universities and other research organizations. The data collection and use involved with learning analytics require similar processes and compliance with legislation for PK-12 districts [7]. Students, parents, and teachers have a right to be informed of who will have access to their data, how long it will be accessible, and how it will be used [13].

As learning analytics becomes a more prominent tool for PK-12 organizations, policymakers throughout the world are struggling with how to best legislate and regulate use of student data and protection of student privacy [45]. This has led many to see privacy as an obstacle to fully embracing and exploring the value of learning analytics [32]. In addition, as privacy concerns continue to grow and place constraints on data use, resulting legislation will likely impact education research [54]. Growing concerns about the risk learning analytics may pose to student privacy are likely to limit consent for data use. This will undermine the value of learning analytics by weakening its potential to offer representative patterns and trends [2]. Thus, the value of learning analytics will depend on federal, state, and local policymakers’ ability to develop privacy laws that protect communities while still finding innovative ways to ensure the access and reporting to educational stakeholders needed to drive positive educational progress [32].

Generalizability. A final issue that poses a challenge to the value of learning analytics is the efficacy of learning analytics on a systemic level, including the generalizability of its models and findings [18]. Often, the data mining models developed for learning analytics are programmed and refined using data from a particular school, district, or state, while also being designed to offer information to inform broad educational policies, systems, and practices that extend beyond the context of their training dataset. These models have historically had limited ability to provide information that can be appropriately generalized to larger populations [3], and researchers have expressed concerns about using learning analytics dashboards for purposes other than those for which they were designed [14].

To address this issue, one option is to combine data from multiple localized models across a state, region, or district to more accurately examine larger-scale patterns and broader considerations. However, this raises issues regarding alignment and compatibility of datasets produced by differing models. There are multiple examples in which issues such as database integration, limited commitment of resources or political will to collecting data at the state or federal level, and time gaps between data collection and reporting undermined the generalizability of learning analytics [32].

6.3. Classroom level

Despite potential opportunities for learning analytics to improve teaching and learning, some remain cautious and unconvinced. The most significant challenges relate to limits on the information learning analytics can offer teachers and students, the actionable nature of this information, and the potential for this information to have harmful consequences.

Limited Information. One key challenge constraining the potential value of learning analytics is that it often fails to capture important dimensions of student learning [2]. For example, learning analytics dashboards are rarely designed to include affect and motivation, despite the critical role these play in student learning [14]. Additionally, data typically offer information on student learning behaviors and outcomes without providing information on what is behind these behaviors or other affective components influencing the outcomes. For example, data may show patterns of decreased usage without information on reasons for the decrease [18]. In some cases, particularly in higher education, decreased usage may signal lack of engagement, which can lead to negative outcomes. However, the link between engagement with learning analytics systems and outcomes may look different in PK-12 settings, where usage may decrease if students are working collaboratively or with a teacher or tutor during or after school. It may also be due to limited technology access outside of school, an issue the student cannot control.

Finally, the prominence of mixed methods research in education highlights the value of qualitative data to better understand the nature of quantitative data. Learning analytics research designs typically focus on quantifying impact without qualifying it, and researchers argue that to fully interpret and communicate effectiveness, learning analytics research must include student voices [20].

One-Dimensional Focus on Students. With a one-dimensional focus, there is little opportunity to better understand who the students are, their thinking processes, and what prior knowledge or experiences they may bring to their learning. Consequently, students’ cultural identities, school experiences, and funds of knowledge often have no weight in the data analysis or impact on plans for future instruction [5]. Making decisions based on narrow datasets can perpetuate racial inequities in education settings [55].

Researchers are concerned that the use of data analytics may lead to dehumanization, searching only for behavioral patterns among students with no concern for the “why” behind a behavior [10]. In particular, visualizations that do not provide context can lead to inappropriate interpretations of data. Thus, learning analytics is prone to a latent variable problem: a variable that is important to a student's education but not captured is likely to be left unaddressed despite its importance [2].

Data and Information May Not be Actionable. Learning analytics can provide a wealth of information on engagement and learning activity; however, the value of this information is limited if stakeholders do not have the training and expertise to know what actions should be taken to improve student learning. In fact, learning analytics tools are considered underdeveloped with regard to informing instruction [11]. A focus on what is happening without equal emphasis on how to respond often means availability of an extensive amount of potentially valuable information that is never acted upon by intended users [7]. Additionally, without guidance on how learning analytics should inform instruction, researchers have seen wide variation in how teachers use the data [5]. As a result, while many recognize the value of learning analytics for identifying issues and areas of concern, there is little evidence of value with regard to informing actions to address these issues and improve teaching and learning.

Undermining Creativity and Individualism. Finally, there is some concern as to whether the differentiated feedback considered a value of learning analytics can be harmful. For example, algorithmically-produced expectations of the ‘average student’ become built into systems, raising questions as to whether the resulting “individualized instruction” undermines value for diverse thinking and problem-solving approaches by molding students to fit external constructs [42]. Similarly, dashboard designs have been criticized for inspiring competition between learners rather than creativity and knowledge development [34].

6.4. Creation and utilization

Many concerns about the potential value of learning analytics are tied to challenges related to how tools and systems are created and utilized at both the systemic and classroom levels. Among these challenges are obstacles related to building structures and cultures that support data-driven decision making and ensuring learning analytics are designed and used with a growth mindset rather than a deficit orientation focused on accountability.

6.5. Systemic level

Building Implementation Structures. A fundamental challenge in the creation and utilization of learning analytics is the persistent need to create federal, state, and organizational structures that can support sustained, safe, equitable, and effective use of learning analytics to drive change [7]. For example, effective use of learning analytics scaled beyond individual schools and districts requires state and federal systems to store and manage data. It also requires standardization of data, compatibility across platforms, and a data sharing infrastructure. Reliance on political will for the resources needed to develop and manage such a system, integrate data from multiple systems, and establish consistent consent, collection, and reporting practices across districts and states is a major issue [32,9]. In the United States, collection and retention of educational data have created political divisions around privacy concerns. Progress in this area will require solutions that address these concerns and a culture that supports data-driven decision making.

More Transparency and Research on Algorithms. A main reason for privacy concerns is that learning analytics algorithms can reflect harmful biases, and thus far, data has been collected and utilized in ways that predominantly reflect a deficit approach to education and a focus on accountability [8]. Given that the accuracy of learning analytics depends on the indicators and analysis methods, building structures and cultures of data-driven decision making requires confidence that the algorithms used to collect and report data are equitable and safe from unintended consequences, particularly those that disproportionately affect groups distinguished by race, gender, or other demographics [3]. Many steps are still needed to question and investigate the implicit, default assumptions that shape both the creation and use of learning analytics [8,20].

One recommendation for this issue is full transparency and open-access publication of all algorithms that make a prediction, recommendation, or decision for or about a student, teacher, or school. This can ensure they are examined for bias or other features that might prioritize the interests of one group over another or developer interests over those of learners and teachers [3]. Another recommendation is to use socio-technical integration research (STIR), in which experts dig into the social and ethical aspects of both the design and designers of learning analytics, including dashboards and other utilization tools and systems, to ensure fair and equitable practices [56].

Ultimately, a key aspect of this issue comes down to disconnects between the education and pedagogical perspectives of the technological experts who design learning analytics and the teachers, students, and other educational stakeholders who use them [42]. There is a crucial need to distinguish between predictive and explanatory modeling and the role of each in education, particularly in efforts to combat deficit-orientations toward student learning and achievement [8]. In all cases where learning analytics are used to shape policy, actions, interventions, or system modifications, the predictors and the assumptions that shape these predictors must be regularly investigated for bias and fairness, especially by those who are part of the community of learners the predictions are targeting [3].

6.6. Classroom level

Co-Creation Challenges. While researchers agree on the need to involve users and other stakeholders in learning analytics development [7], there are challenges with the feasibility of this recommendation. The first is that participating in co-creation takes a significant amount of time. Teachers, students, and other stakeholders often do not have extra time to commit to involvement in the creation process [47]. Considering how the process could best utilize summers or other times when teachers and students are less busy may help mitigate this challenge.

In addition to time, learning analytics creation typically requires an advanced level of digital literacy that many relevant stakeholders do not have, especially because learning analytics is still an emerging field. The need for training can make the process even more time-consuming, costly, and less attractive to relevant stakeholders [46].

Digital and Statistical Literacy. The need for training also extends to utilization. Learning analytics tools typically require a basic level of digital and statistical literacy, and users may need initial training and ongoing support for both technical aspects of use and understanding the data [32]. Additionally, students, parents, and other users are likely to require some level of training to use tools and interpret resulting data and visualizations [7,57]. For instance, while dashboards aim to help students reflect on and modify their learning behaviors, students must first understand the information and recognize its usefulness [40]. Researchers recommend that future work focus on strategies to mitigate this challenge, such as differentiated versions of dashboards to scaffold understanding and interpretation of data [58].

Usability. Adoption and continued usage are central to the value of learning analytics, and the way learning analytics are designed can limit usability for specific purposes. Desirable and useful features in tools such as dashboards can vary across contexts and stakeholders, including level of granularity and visualization techniques. Certain users may find too much information overwhelming or have difficulty understanding and interpreting complex visualizations while others may find this amount of information critical to its usefulness [13].

There is also a need for continued development of metrics and measures that capture a more complete picture of student learning behaviors [2]. For example, techniques that rely on grouping methods such as clustering risk losing the ability to analyze individual behaviors, which is considered a distinct value of learning analytics.
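As a small illustration of this risk, the following sketch (Python with scikit-learn; the activity numbers, feature choice, and cluster count are all invented) shows how two students with meaningfully different engagement trajectories can receive the same cluster label, so any report built on cluster membership alone would hide the difference that matters.

```python
# Hypothetical example of clustering collapsing individual behavior.
# Data are invented; a real system would use logged activity features.
import numpy as np
from sklearn.cluster import KMeans

# Rows = students, columns = logins in weeks 1-4 (illustrative numbers).
activity = np.array([
    [9, 9, 9, 9],    # steady high engagement
    [12, 10, 8, 6],  # high but declining -- a pattern worth noticing
    [2, 2, 3, 2],    # steady low engagement
    [1, 3, 2, 2],    # steady low engagement
])

# With two clusters, the two "high" students typically share a label,
# so the second student's decline disappears from any cluster-level report.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(activity)
print(labels)
```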

One-Dimensional Data. Similarly, researchers raise the need for more human-centered designs that collect data on multidimensional human subjects and can be interpreted through the unique lenses of their users [47]. This connects back to the need for qualitative data to construct a more complete picture of student learning. However, this not only requires more complex data collection processes but also means that dashboards and other reporting tools would need to support cross-referencing and offer guidance on how to consider multiple factors or data sets when interpreting the data.

Support for Intervention or Pedagogical Innovation. A final challenge for adoption and usage is raising issues without providing tools to address them. Typically, learning analytics tools and systems are designed only to share information and prompt reflection. Thus, effective use may require supplementary tools to help users transform the information into action [34]. Teachers likely need support in recognizing how to address what they see in the data and what it means for their practice [38]. The same is true for students and for partners in student learning who lack the experience or expertise needed to engage with or facilitate data-informed goal setting, strategy development, and progress monitoring.

In addition, learning analytics design and utilization have mainly focused on reducing drop-out rates and identifying students in need of extra support or intervention [7]. This deficit orientation in the design and use of learning analytics raises alarms without offering actionable recommendations for how to address them [49]. This is particularly problematic given that, with learning analytics, false positives have been shown to be significantly more damaging than false negatives [8]. This is true for students inappropriately labeled as 'at risk' because of algorithmic assumptions or biases, and for teachers who may be labeled as failing when the metrics for success are not appropriate for their students and context. Rather than cycles of identification and 'reteaching', some researchers urge learning analytics designers and users to consider and leverage differences in students' thinking, resources, and funds of knowledge to enhance learning rather than treating them as predictors of academic success [5].
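Because error-rate disparities sit at the heart of this concern, one hedged illustration of how a research team might check for them is sketched below (Python; the cohort, group names, and outcomes are invented). It computes, per group, the false positive rate of an 'at risk' flag, that is, the share of students who were flagged but in fact succeeded, which is precisely the kind of harmful mislabeling described above.

```python
# Illustrative check of false positive rates per group for a
# hypothetical 'at risk' flag. All data below are invented.
def false_positive_rate(rows):
    """FPR = flagged-but-succeeded / all who succeeded."""
    succeeded = [r for r in rows if not r["struggled"]]
    if not succeeded:
        return float("nan")
    return sum(r["flagged"] for r in succeeded) / len(succeeded)

# Each record: group label, the model's flag, and the eventual outcome.
cohort = [
    {"group": "A", "flagged": 0, "struggled": False},
    {"group": "A", "flagged": 0, "struggled": False},
    {"group": "A", "flagged": 1, "struggled": True},
    {"group": "B", "flagged": 1, "struggled": False},  # harmful false positive
    {"group": "B", "flagged": 0, "struggled": False},
    {"group": "B", "flagged": 1, "struggled": True},
]

for group in ("A", "B"):
    rows = [r for r in cohort if r["group"] == group]
    print(group, round(false_positive_rate(rows), 2))
```

A gap between groups in this rate (here, 0.0 versus 0.5) would signal that one group of students bears the harm of mislabeling disproportionately, even if the model's overall accuracy looks acceptable.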

7. Conclusions and recommendations for future PK-12 learning analytics research

A growing body of literature on the use of learning analytics to improve higher education systems, institutions, teaching, and learning offers promising opportunities and important considerations for PK-12 education. Given the limited research on learning analytics in PK-12 contexts [48], a primary aim for this metasynthesis was to identify findings from research in higher education with potential relevance to PK-12 education. While we found notable support for the potential value of learning analytics for PK-12 education, there is an important need to build a stronger body of targeted research on the integration and effectiveness of learning analytics in PK-12 contexts.

Many developers, policymakers, education leaders, and practitioners have a vision of future classrooms as digital systems [59]. While there is typically less independent learning at the PK-12 level than in higher education, learning analytics can still provide important formative feedback in a more immediate timeframe than current PK-12 assessment systems and methods [44,4]. However, what these digital systems will or should look like in PK-12 contexts, where learning analytics will more likely supplement or enhance rather than replace teacher instruction, is still undetermined. This includes questions around training, support, and other measures needed to ensure all users (e.g., teachers, students, parents, tutors) have the basic competencies needed to use learning analytics tools, transform resulting data into actionable strategies, and monitor their effectiveness. PK-12 learners in particular have varying levels of proficiency with data and their visualizations. Thus, additional training, differentiated versions of dashboards, scaffolding within tools, and other strategies to mitigate this challenge will need to be an important focus for PK-12 learning analytics design and research [58].

Another important focus for PK-12 learning analytics is addressing disconnects between the pedagogical perspectives of learning analytics designers and users. Learning analytics tools typically reflect the perceptions of learning and pedagogy held by the technical experts creating them, which may not be consistent with the expertise of educators [42]. More effort and strategies are needed to facilitate opportunities for teachers and other users to share what they consider to be most useful in data collection and reporting tools. While users may not know what is possible, developers can use prototypes and examples of dashboards or other data displays to facilitate conversations about design, content, and usability. These should include criteria for actionable data and scaffolded approaches for transforming data into systemic- and classroom-level policies and practices focused on support structures that reflect a growth mindset rather than potentially harmful labeling, tracking, and accountability [50].

Deficit-oriented labeling can be potentially damaging for both students and teachers. Particularly with regard to teachers, learning analytics designed and used for evaluative purposes must include a comprehensive set of factors relevant to the school, classroom, and community context. For example, in subject areas such as mathematics and English language arts (ELA), consideration should be given to student data from previous years and whether data can identify substantial growth rather than just grade-level proficiency.

Current literature also presents a strong case for more complete approaches to collecting, reporting, and interpreting learning analytics that consider qualitative and quantitative data. This includes opportunities to identify other factors that may be relevant to understanding data, its implications, and appropriate actions for individual users, contexts, and circumstances.

Additionally, our findings demonstrate a critical need to distinguish between predictive and explanatory modeling and examine each to ensure PK-12 learning analytics do not reflect or perpetuate deficit orientations toward student learning and achievement [8]. Predictors, along with the algorithms and assumptions that determine them, must be completely transparent and continually investigated for bias and fairness by experts who include stakeholders from the communities commonly identified in predictors [3]. Evidence of persistent bias further highlights the urgency of finding safe and practical solutions to concerns regarding privacy and consent that protect students and teachers and improve the quality and representation of learning analytics.

Overall, researchers have yet to reach consensus on the value of learning analytics, particularly with regard to opportunities and challenges relevant to PK-12 education. The opportunities and challenges synthesized here show that findings remain conflicting and that much remains to be examined. Our findings are also consistent with suggestions that existing evidence focuses more on positive outcomes and that the potential challenges, and even harmful impacts, associated with learning analytics at any level warrant more consideration [21]. Additionally, discussions of challenges in the current literature tend to raise issues without proposing solutions or strategies to address them. While we see a clear need for further research into the challenges associated with learning analytics, particularly at the PK-12 level, we recommend that these investigations focus on understanding the challenges and what can be done to mitigate them.

There is no shortage of optimism and excitement about the potential for learning analytics to offer transformative opportunities for PK-12 education. Given this extensively documented belief in the potential opportunities of learning analytics, the research community has a responsibility both to understand the risks and challenges and to address them in ways that ensure they do not outweigh the potential benefits. This includes potentially harmful issues related to equity in education as well as logistical or usability issues that negatively impact adoption and sustained usage. Researchers, educational stakeholders, and learning analytics designers must put more effort into working together to understand and address the challenges that have left many unconvinced about the value of learning analytics.

Considerations for PK-12 mathematics education research and practice

While our analysis focused on findings with broad relevance to PK-12 education, several of these findings raise important questions for PK-12 mathematics education. Only 2 of the 47 articles focused specifically on mathematics education, so we cannot offer widely shared perspectives found in current literature. However, we can draw on the general themes within our findings from all reviewed literature, along with the experience and expertise of the mathematics education researchers, leaders, and practitioners on our team, to pose questions that will likely require future attention. These questions can provide some direction for research agendas that aim to support future developments in PK-12 mathematics curriculum, policy, and practice. This includes investigating what increased prevalence of learning analytics will mean for the PK-12 mathematics curriculum, the role of mathematics educators in supporting the use of learning analytics, and an increased focus on statistical literacy across disciplines.

Possibly the most relevant finding with implications for mathematics education is that many users (e.g., students, teachers, parents) do not have the statistical competencies to understand and interpret learning analytics data or the expanding range of visualizations created by developers. Additionally, many are unlikely to have the experience or expertise to develop and monitor data-informed strategies. This poses a challenge because current learning analytics tools are designed to provide information and prompt reflection without guidance on using the information to identify appropriate actions or adjust current practice. Given that the value of learning analytics depends on students', teachers', and parents' or guardians' ability to analyze, interpret, and use data to inform and monitor practice, this raises important questions about future adaptations of current PK-12 mathematics curricula.

In higher education, departments that support mathematical and computer sciences have worked to shift degree requirements and create data science degree programs to prepare students for the growing prevalence of data in common career pathways. While many PK-12 standards and guidelines have increased their emphasis on data, they typically only extend as far as making inferences and do not close the loop of an iterative cycle that also includes implementing and monitoring inference-based strategies. For example, the National Council of Teachers of Mathematics [24] calls for students to learn appropriate methods to analyze data and make data-informed inferences and predictions, stopping short of critical examination of assumptions behind the data, consideration of other contextual factors, transforming these inferences and predictions into appropriate actions, and determining the effectiveness of these actions.

Given that interpretation is a critical factor in learning analytics accuracy [45], PK-12 education will likely bear responsibility for ensuring that users and those who support them can analyze learning analytics data through a critical lens. This raises the question of whether these competencies, skills, and critical perspectives are sufficiently addressed in current mathematics curricula and standards and, if not, whether or how they should be addressed through future developments in those curricula and standards. It also creates an urgent need for mathematics and statistics educators to emphasize and assess more than just basic data skills and competencies. Questions about data should extend beyond what the data tell us to consider what they may not tell us, what actions or decisions they support, and what future changes in the data would indicate that an action or decision was effective.

In addition to curricular questions, these findings raise questions about the role of mathematics educators in supporting safe, ethical, and effective integration of learning analytics in PK-12 contexts. As learning analytics becomes more prevalent, professional and adult learning programs are needed for teachers, administrators, parents, and other partners in student learning to ensure they can appropriately analyze and interpret data and develop and monitor data-informed systems and strategies to support student learning. In contexts where no designated personnel or resources are added, researchers should investigate whether increased integration of learning analytics impacts the role of mathematics educators in their broader school and district communities. Researchers, practitioners, and policymakers must attend to whether states, districts, and schools invest in effective models and strategies for supporting teachers, students, and parents across disciplines, or whether this critical need ends up falling on the shoulders of mathematics and statistics educators without added resources or support. This includes examining whether the role of mathematics educators disproportionately changes in schools and districts in communities where designated resources to support learning analytics integration are less likely to be available.

Finally, amidst concerns about learning analytics dehumanizing education [10], we encourage educators and researchers to consider the opportunities and challenges of intentional use of students’ own data to meet standards, benchmarks, or indicators related to data and statistical literacy. Engaging students in interpreting representations of data, drawing conclusions, making inferences or predictions, creating data-informed strategies for their own learning, and testing their strategies can potentially create more meaningful learning experiences than external or fixed data sets. Additionally, examining equity issues in algorithms can build connections between mathematics and social justice and help students become critical consumers of data and possibly even future critical designers of algorithms.

Funding

This research was generously supported by funding from the Bill and Melinda Gates Foundation.

Declarations

  • Review and/or approval by an ethics committee was not needed for this study because it did not include human subjects.

  • Informed consent was not required for this study because it did not include human subjects. All work referenced in this article is published and widely accessible.

Data availability statement

Data associated with the study have not been deposited into a publicly available repository; however, they will be made available on request.

CRediT authorship contribution statement

Catherine Paolucci: Conceptualization, Data curation, Formal analysis, Methodology, Writing – original draft, Writing – review & editing. Sam Vancini: Conceptualization, Formal analysis, Writing – original draft, Writing – review & editing. Richard T. Bex II: Formal analysis, Methodology, Writing – original draft, Writing – review & editing. Catherine Cavanaugh: Conceptualization, Data curation, Formal analysis, Funding acquisition, Methodology, Writing – original draft, Project administration, Writing – review & editing. Christine Salama: Conceptualization, Data curation, Formal analysis, Funding acquisition, Methodology, Writing – original draft. Zandra de Araujo: Conceptualization, Writing – original draft.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Contributor Information

Catherine Paolucci, Email: cpaolucci@wested.org.

Sam Vancini, Email: svancini@ufl.edu.

Richard T. Bex II, Email: rtbex@illinoisstate.edu.

Catherine Cavanaugh, Email: cathycavanaugh@coe.ufl.edu.

Christine Salama, Email: csalama@coe.ufl.edu.

Zandra de Araujo, Email: zdearaujo@coe.ufl.edu.

References

  • 1. Society for Learning Analytics Research (SoLAR). What is learning analytics? 2022, February. https://www.solaresearch.org/about/what-is-learning-analytics/
  • 2. Aguilar S.J. Learning analytics: at the nexus of big data, digital innovation, and social justice in education. TechTrends. 2018;62(1):37–45.
  • 3. Bowers A.J. Early warning systems and indicators of dropping out of upper secondary school: the emerging role of digital technologies. In: OECD Digital Education Outlook 2021: Pushing the Frontiers with Artificial Intelligence, Blockchain and Robots. OECD Publishing; 2021. p. 173.
  • 4. Reinholz D.L., Stone-Johnstone A., Shah N. Walking the walk: using classroom analytics to support instructors to address implicit bias in teaching. Int. J. Acad. Dev. 2020;25(3):259–272. doi: 10.1080/1360144X.2019.1692211.
  • 5. Garner B., Thorne J.K., Horn I.S. Teachers interpreting data for instructional decisions: where does equity come in? J. Educ. Adm. 2017;55(4):407–426. doi: 10.1108/JEA-09-2016-0106.
  • 6. Holstein K., Doroudi S. Fairness and equity in learning analytics systems (FairLAK). In: Companion Proceedings of the 9th International Conference on Learning Analytics & Knowledge (LAK19). ACM; 2019. pp. 500–503.
  • 7. Ferguson R., Brasher A., Clow D., Cooper A., Hillaire G., Mittelmeier J., Rienties B., Ullmann T., Vuorikari R. Research evidence on the use of learning analytics: implications for education policy. In: Vuorikari R., Castaño Muñoz J., editors. Joint Research Centre Science for Policy Report, EUR 28294 EN. Publications Office of the European Union; 2016. pp. 1–152.
  • 8. Archer E., Prinsloo P. Speaking the unspoken in learning analytics: troubling the defaults. Assess Eval. High Educ. 2019;45(6):888–900. doi: 10.1080/02602938.2019.1694863.
  • 9. Klašnja‐Milićević A., Ivanović M., Budimac Z. Data science in education: big data and learning analytics. Comput. Appl. Eng. Educ. 2017;25(6):1066–1078. doi: 10.1002/cae.21844.
  • 10. Tempelaar D. Supporting the less-adaptive student: the role of learning analytics, formative assessment and blended learning. Assess Eval. High Educ. 2020;45(4):579–593. doi: 10.1080/02602938.2019.1677855.
  • 11. Na K.S., Tasir Z. A systematic review of learning analytics intervention contributing to student success in online learning. In: 2017 International Conference on Learning and Teaching in Computing and Engineering (LaTICE). IEEE; 2017. pp. 62–68.
  • 12. Rankin J.G. Standards for Reporting Data to Educators: What Educational Leaders Should Know and Demand. Routledge; 2016.
  • 13. Schwendimann B.A., Rodriguez-Triana M.J., Vozniuk A., Prieto L.P., Boroujeni M.S., Holzer A., Gillet D., Dillenbourg P. Perceiving learning at a glance: a systematic literature review of learning dashboard research. IEEE Transactions on Learning Technologies. 2016;10(1):30–41.
  • 14. Valle N., Antonenko P., Dawson K., Huggins‐Manley A.C. Staying on target: a systematic literature review on learner‐facing learning analytics dashboards. Br. J. Educ. Technol. 2021;52(4):1724–1748.
  • 15. Shin D., Shim J. A systematic review on data mining for mathematics and science education. Int. J. Sci. Math. Educ. 2021;19(4):639–659. doi: 10.1007/s10763-020-10085-7.
  • 16. Faridhan Y.E., Loch B., Walker L. Improving retention in first-year mathematics using learning analytics. In: Electric Dreams: Proceedings of ASCILITE 2013, Sydney. Australasian Society for Computers in Learning in Tertiary Education; 2013. pp. 278–282. https://www.learntechlib.org/p/171138/
  • 17. Martin T., Petrick Smith C., Forsgren N., Aghababyan A., Janisiewicz P., Baker S. Learning fractions by splitting: using learning analytics to illuminate the development of mathematical understanding. J. Learn. Sci. 2015;24(4):593–637.
  • 18. Ahn J., Beck A., Rice J., Foster M. Exploring issues of implementation, equity, and student achievement with educational software in the DC public schools. AERA Open. 2016;2(4). doi: 10.1177/2332858416667726.
  • 19. Vought R.T. Charting a Course for Success: America's Strategy for STEM Education. United States Office of Science and Technology Policy; 2018. Retrieved from: https://files.eric.ed.gov/fulltext/ED590474.pdf
  • 20. Francis P., Broughan C., Foster C., Wilson C. Thinking critically about learning analytics, student outcomes, and equity of attainment. Assess Eval. High Educ. 2020;45(6):811–821.
  • 21. Ferguson R., Clow D. Where is the evidence? A call to action for learning analytics. In: Proceedings of the Seventh International Learning Analytics & Knowledge Conference (LAK17). ACM; New York, NY: 2017. pp. 56–65.
  • 22. Hughes B. The math classroom all students deserve [Blog post]. 2022, October 18. https://usprogram.gatesfoundation.org/news-and-insights/articles/the-math-classroom-all-students-deserve
  • 23. National Council of Supervisors of Mathematics [NCSM] & National Council of Teachers of Mathematics [NCTM]. Building STEM Education on a Sound Mathematical Foundation. NCTM; 2018. https://www.nctm.org/Standards-and-Positions/Position-Statements/Building-STEM-Education-on-a-Sound-Mathematical-Foundation/
  • 24. National Council of Teachers of Mathematics [NCTM]. Principles and Standards for School Mathematics. NCTM; Reston, VA: 2000.
  • 25. National Science and Technology Council [NSTC]. Charting a Course for Success: America's Strategy for STEM Education. Office of Science and Technology Policy; Washington, DC: 2018. Retrieved from: https://www.whitehouse.gov/wp-content/uploads/2018/12/STEM-Education-Strategic-Plan-2018.pdf
  • 26. Thorne S., Jensen L., Kearney M.H., Noblit G., Sandelowski M. Qualitative metasynthesis: reflections on methodological orientation and ideological agenda. Qual. Health Res. 2004;14(10):1342–1365. doi: 10.1177/1049732304269888.
  • 27. Thunder K., Berry R.Q., III. Research commentary: the promise of qualitative metasynthesis for mathematics education. J. Res. Math. Educ. 2016;47(4):318–337.
  • 28. Hoon C. Meta-synthesis of qualitative case studies: an approach to theory building. Organ. Res. Methods. 2013;16(4):522–556.
  • 29. Lachal J., Revah-Levy A., Orri M., Moro M.R. Metasynthesis: an original method to synthesize qualitative literature in psychiatry. Front. Psychiatr. 2017;8:269. doi: 10.3389/fpsyt.2017.00269.
  • 30. Krippendorff K. Content Analysis: An Introduction to Its Methodology. SAGE; 2012.
  • 31. Merriam S.B., Tisdell E.J. Qualitative Research: A Guide to Design and Implementation. Fourth ed. Jossey-Bass; 2016.
  • 32. Damian H., Hosh K., Kavuma K., Previs D., Cavanaugh C. Data and Education Transformation: A Maturity Model. Microsoft Education; 2019.
  • 33. Tempelaar D., Rienties B., Nguyen Q. Towards actionable learning analytics using dispositions. IEEE Transactions on Learning Technologies. 2017;10(1):6–16.
  • 34. Jivet I., Scheffel M., Drachsler H., Specht M. Awareness is not enough: pitfalls of learning analytics dashboards in the educational practice. In: Proceedings of the European Conference on Technology Enhanced Learning. Springer; 2017. pp. 82–96.
  • 35. Raza A., Penuel W.R., Jacobs J., Sumner T. Supporting equity in schools: using visual learning analytics to understand learners' classroom experiences. In: Schmidt M., Tawfik A.A., Jahnke I., Earnshaw Y., editors. Learner and User Experience Research: An Introduction for the Field of Learning Design & Technology. EdTech Books; 2020. https://edtechbooks.org/ux/supporting_school_equity
  • 36. Peña-Ayala A. Learning Analytics: Fundaments, Applications, and Trends. A View of the Current State of the Art to Enhance e-Learning. 2017.
  • 37. Rankin J.G. Designing Data Reports That Work: A Guide for Creating Data Systems in Schools and Districts. Taylor and Francis; 2016.
  • 38. Admiraal W., Vermeulen J., Bulterman-Bos J. Teaching with learning analytics: how to connect computer-based assessment data with classroom instruction? Technol. Pedagog. Educ. 2020;29(5):577–591.
  • 39. Charleer S., Klerkx J., Duval E., De Laet T., Verbert K. Creating effective learning analytics dashboards: lessons learnt. In: Verbert K., Sharples M., Klobucar T., editors. Adaptive and Adaptable Learning: 11th European Conference on Technology Enhanced Learning, EC-TEL. Springer; Lyon, France: 2016. pp. 42–56.
  • 40. Park Y., Jo I.H. Factors that affect the success of learning analytics dashboards. Educ. Technol. Res. Dev. 2019;67(6):1547–1571.
  • 41. Xhakaj F., Aleven V., McLaren B.M. Effects of a teacher dashboard for an intelligent tutoring system on teacher knowledge, lesson planning, lessons and student learning. In: Proceedings of the European Conference on Technology Enhanced Learning. Springer; 2017. pp. 315–329.
  • 42. Williamson B. Digital education governance: data visualization, predictive analytics, and 'real-time' policy instruments. J. Educ. Pol. 2016;31(2):123–141.
  • 43. Molenaar I., Knoop-van Campen C. Teacher dashboards in practice: usage and impact. In: Proceedings of the European Conference on Technology Enhanced Learning. Springer; 2017. pp. 125–138.
  • 44. Heritage M., Wylie C. Reaping the benefits of assessment for learning: achievement, identity, and equity. ZDM: Mathematics Education. 2018;50(4):729–741. doi: 10.1007/s11858-018-0943-3.
  • 45. Rankin J.G. How to Make Data Work: A Guide for Educational Leaders. Taylor and Francis; 2016.
  • 46. Buckingham Shum S., Ferguson R., Martinez-Maldonado R. Human-centred learning analytics. Journal of Learning Analytics. 2019;6(2):1–9.
  • 47. Dollinger M., Liu D., Arthars N., Lodge J.M. Working together in learning analytics towards the co-creation of value. Journal of Learning Analytics. 2019;6(2):10–26. doi: 10.18608/jla.2019.62.2.
  • 48. Ifenthaler D. Learning analytics for school and system management. In: OECD Digital Education Outlook 2021: Pushing the Frontiers with Artificial Intelligence, Blockchain and Robots. OECD Publishing; 2021. p. 161.
  • 49. Niemi D., Pea R., Saxberg B., Clark R. Learning Analytics in Education. IAP; Charlotte, NC: 2018.
  • 50. Slade S., Tait A. Global Guidelines: Ethics in Learning Analytics. 2019. Retrieved from: https://www.learntechlib.org/p/208251/
  • 51. Sarmiento J.P., Campos F., Wise A. Engaging students as co-designers of learning analytics. In: Companion Proceedings of the 10th International Learning Analytics & Knowledge Conference (LAK 2020); 2020. pp. 29–32.
  • 52. Ifenthaler D., Mah D.K., Yau J.Y.K., editors. Utilizing Learning Analytics to Support Study Success. Springer; 2019.
  • 53. Scheffel M., Drachsler H., Toisoul C., Ternier S., Specht M. The proof of the pudding: examining validity and reliability of the evaluation framework for learning analytics. In: Proceedings of the European Conference on Technology Enhanced Learning. Springer; 2017. pp. 194–208.
  • 54. Krumm A., Means B., Bienkowski M. Learning Analytics Goes to School: A Collaborative Approach to Improving Education. Routledge; 2018.
  • 55. Horn I.S. Accountability as a design for teacher learning: sensemaking about mathematics and equity in the NCLB era. Urban Educ. 2018;53(3):382–408. doi: 10.1177/0042085916646625.
  • 56. Jones K., McCoy C. Ethics in praxis: socio-technical integration research in learning analytics. In: Companion Proceedings of the 9th International Conference on Learning Analytics & Knowledge (LAK19); 2019.
  • 57. Kim J., Jo I.H., Park Y. Effects of learning analytics dashboard: analyzing the relations among dashboard utilization, satisfaction, and learning achievement. Asia Pac. Educ. Rev. 2016;17(1):13–24.
  • 58. Park Y., Jo I.H. Development of the learning analytics dashboard to support students' learning performance. J. Univers. Comput. Sci. 2015;21(1):110.
  • 59. Dillenbourg P. Classroom analytics: zooming out from a pupil to a classroom. In: OECD Digital Education Outlook 2021: Pushing the Frontiers with AI, Blockchain, and Robots. OECD Publishing; 2021. pp. 105–118.
