BMC Psychology. 2025 Dec 18;14:42. doi: 10.1186/s40359-025-03836-0

Mathematics teachers’ AI literacy, anxiety, and perceptions of AI integration in mathematics education: a mixed-methods study

Çiğdem İnci Kuzu 1,
PMCID: PMC12797407  PMID: 41408576

Abstract

Background

The rapid advancements in artificial intelligence (AI) technologies are fundamentally transforming mathematics teaching processes and offering new pedagogical opportunities within instructional environments. However, the effective use of these technologies is closely related to mathematics teachers’ levels of knowledge, awareness, attitudes, and skills regarding AI. The purpose of this study is to examine the relationship between mathematics teachers’ AI literacy and AI anxiety, to conduct an in-depth analysis of their perceptions regarding the integration of AI into mathematics education, and to evaluate the effects of variables such as watching AI-related films, technology use, and age on this process.

Methods

This study employed a mixed-methods design. In the quantitative phase, a predictive correlational model was employed, while in the qualitative phase, a case study approach was utilized. Data were collected from 251 mathematics teachers working in various regions of Türkiye. The quantitative data were analyzed using a range of statistical analysis techniques, whereas the qualitative data were evaluated through content analysis.

Results

The findings indicate that mathematics teachers’ levels of AI literacy are above average, whereas their levels of AI anxiety are below average. A significant and negative relationship was found between AI literacy and AI anxiety. Furthermore, the level of technology use in mathematics instruction was identified as the strongest predictor of both AI literacy and AI anxiety. The results also revealed that mathematics teachers’ most prominent anxiety is that the excessive use of AI tools may weaken students’ independent thinking and problem-solving skills. In addition, anxiety regarding the potential weakening of the teaching role and the possibility that AI could replace teachers were also noteworthy.

Conclusions

Professional development programs should encompass not only the fundamental technological features of AI but also its pedagogical contributions to mathematics instruction. Mathematics teachers should be provided with opportunities to observe how AI supports key instructional processes such as differentiated instruction, formative assessment, and conceptual visualization. Furthermore, training modules should aim to develop teachers’ abilities to critically evaluate AI-generated mathematical content in terms of accuracy and pedagogical appropriateness. Through such targeted training, teachers can enhance their AI literacy and create safe and pedagogically meaningful digital learning environments for their students.

Supplementary Information

The online version contains supplementary material available at 10.1186/s40359-025-03836-0.

Keywords: Artificial intelligence, Mathematics teacher, Literacy, Anxiety

Introduction

In recent years, the rapid development of artificial intelligence (AI) technologies has begun to fundamentally transform instructional processes at all levels of education systems [1]. Digitalization enables the development of new methods to make education more efficient [2]. In this context, AI is driving an important transformation process in education [3]. The integration of AI fosters a highly interactive and adaptive educational ecosystem, enabling educators and learners alike to engage in innovative pedagogical and learning practices [4, 5]. Empirical evidence suggests that incorporating generative AI technologies into instructional processes can significantly enhance teachers’ pedagogical proficiency by providing instantaneous feedback and data-informed insights, thereby facilitating more strategic and evidence-based instructional decision-making [6].

Mathematics education, in particular, stands out as one of the fields that can benefit the most from the opportunities offered by AI, such as data analytics, personalized learning, error detection, and automated feedback [7]. AI-supported instructional systems can analyze students’ problem-solving strategies, adapt learning processes in real time, and provide teachers with the opportunity to identify students’ conceptual errors at earlier stages [8, 9]. Therefore, the effective use of AI in mathematics teaching is crucial for both enhancing the quality of instruction and fostering students’ higher-order thinking skills [10]. In addition, AI-supported mathematics education platforms (Mathletics, Mathspace, Mathway, Wolfram Alpha) provide more effective educational opportunities by offering personalized learning experiences to students [11]. AI tools like ChatGPT personalize instruction by creating rich mathematical tasks, while applications like Photomath use AI to solve mathematical problems using visual data [12]. AI can be considered a personalized teacher, providing students with instant feedback and opportunities to correct mistakes [13, 14].

Realizing AI’s potential in educational environments does not depend solely on the presence of technological tools; it is also closely related to teachers’ knowledge, skills, and attitudes toward these technologies [15]. Teachers’ levels of AI literacy—that is, their ability to understand, critically evaluate, and effectively utilize these technologies within pedagogical contexts—constitute a crucial factor in the integration of AI into teaching processes [16]. Conversely, teachers’ AI anxiety, arising from concerns about technological complexity, ethical uncertainties, or the transformation of professional roles, may pose a significant affective barrier to this integration [17]. Davis’s Technology Acceptance Model (TAM) posits that individuals’ intentions to use a technology are determined by perceived ease of use and perceived usefulness, and it has been widely employed to explain the adoption of educational technologies [18, 19]. Mishra and Koehler’s TPACK framework elucidates the interplay among teachers’ technological, pedagogical, and content knowledge, enabling an analysis of how technology is integrated with pedagogical and disciplinary knowledge [20]. The Teacher AI Readiness approach evaluates teachers’ preparedness to use AI-based technologies across cognitive, affective, and practical dimensions [21]. Taken together, these three models demonstrate that the relationship between teachers’ AI literacy and AI anxiety is multidimensional: while TAM explains usage intention through cognitive and affective factors, TPACK provides a lens for understanding pedagogical integration, and Teacher AI Readiness encompasses the knowledge- and emotion-based factors that shape teachers’ readiness for AI.
This integrated theoretical framework thus enables a systematic investigation of the relationships among the study variables—AI literacy, AI anxiety, and individual differences—offering a comprehensive basis for understanding the effective use of AI in education.

Understanding the factors that influence the use of AI in education has emerged as a critical research need within the educational technology literature [22]. However, studies in this area remain limited, and comprehensive research examining the cognitive and affective factors that influence AI integration, particularly within the context of mathematics education, is still scarce [23, 24]. In this regard, the primary aim of the present study is to investigate the relationship between mathematics teachers’ AI literacy and AI anxiety. Additionally, the study considers that the level of AI technology use may vary depending on factors such as age, technology use habits, and exposure to AI-related films [25]. Ultimately, this study aims to address gaps in the literature by examining the relationship between AI literacy and AI anxiety among mathematics teachers, as well as the influence of various demographic and experiential variables. Another gap addressed by the study is that most existing AI-based research has been conducted in Western contexts, offering limited data from countries with different demographic and cultural characteristics [26]. Therefore, the present research focuses on mathematics teachers in Türkiye. The rapidly digitalizing educational landscape in Türkiye, its young population, and the growing emphasis on AI-based learning further enhance the relevance and significance of this study [27].

AI literacy

AI literacy is considered a necessary skill for individuals to participate effectively in the digital world in the 21st century [28, 29]. Kandlhofer and colleagues first defined this concept as an approach to understanding the fundamental knowledge and concepts behind AI-based technologies [30]. It has been emphasized that AI literacy has become a necessary digital competency not only for computer scientists but for all segments of society. AI literacy includes identifying AI products, effectively using these technologies, and critically evaluating them while adhering to ethical standards [29, 31].

AI literacy enables individuals to comprehend and effectively utilize AI technologies, while also equipping them with knowledge about future career prospects and job opportunities, thereby enhancing their overall competence and expertise. Consequently, in today’s world, possessing AI literacy has become increasingly necessary for every individual [9]. AI literacy plays a crucial role in integrating AI into the field of education [32, 33]. It is defined as the ability to understand the fundamental concepts and technologies of AI [34]. This definition extends beyond basic knowledge to encompass ethical evaluation and critical thinking skills, which are particularly important for mathematics teachers and teacher candidates. Such a comprehensive form of literacy directly influences the quality of education by shaping not only how frequently but also how appropriately AI is applied. In mathematics education, AI literacy enables educators to utilize advanced AI-supported tools, such as mathematical modeling and data analysis, to improve instructional efficiency and foster critical thinking and ethical awareness among students [35–37]. Effective teacher education in AI literacy should involve not only understanding the functions of generative AI but also exploring its ethical dimensions and preparing teachers to use AI responsibly in educational environments. There is an increasing call for educational programs to adopt multidisciplinary approaches that integrate AI with disciplines such as mathematics and computer science. Through such integration, teachers can be equipped to leverage the opportunities presented by AI while effectively addressing the ethical challenges it brings [38, 39].

AI anxiety

Anxiety is defined as an emotional and physiological response that arises in reaction to perceived threats or unresolved fears [40]. In the 21st century, the rapid development of technology can lead individuals to experience fear and anxiety if they are unable to adapt to technological changes [41]. In this context, individuals may experience anxiety toward different technologies for various reasons. Computer anxiety refers to the fear and apprehension individuals feel when working with or interacting with computers [42]. Similarly, AI anxiety is defined as an individual’s avoidance of interacting with AI due to fear or concern [43]. It can generally be described as the stress, fear, and discomfort experienced when using AI technologies.

AI anxiety has garnered increasing attention in recent years, particularly within the scientific community, and various frameworks have been proposed to explain this concept. Johnson and Verdicchio define this anxiety as an emotional state that causes individuals to avoid interacting with AI. According to them, this anxiety also encompasses concerns about AI spiraling out of control and negatively impacting human life [44].

AI anxiety is an important area of research, particularly in the education sector. Kaya and colleagues found that AI anxiety is a significant factor shaping individuals’ attitudes toward AI, but not all components of this anxiety are strong predictors of AI attitudes. For example, learning anxiety and AI anxiety predicted more negative attitudes toward AI, while job-change anxiety and sociotechnical blindness (the inability to recognize that AI systems interact with humans) did not significantly influence positive or negative attitudes toward AI [45].

In education, teachers’ AI anxiety is often related to the application of technology in the classroom. Eyüp and Kayhan’s research revealed that preservice teachers’ AI anxiety is particularly focused on the possibility of losing their jobs in the future and the potential for AI to spiral out of control. However, it has also been determined that teachers are less concerned about staying up-to-date on AI’s technological development and learning processes [46]. In addition to these concerns, teachers’ distrust of technology hinders the effective use of educational technologies [47]. Ertmer and Ottenbreit-Leftwich noted that teachers’ lack of knowledge about technology hinders the effective use of technology in the classroom [21]. The psychological impact of technological advancements on individuals is also an important topic. Dinello argues that technology can have various negative effects on human psychology, raising issues such as addiction, mind control, and loss of empathy. These psychological effects lead individuals to experience fear and anxiety regarding technology [48].

As a result, anxiety about AI is increasing in parallel with the integration of this technology into social life. The uncertainty surrounding technological change and the future impact of AI fuels individuals’ anxieties about these technologies [44, 49]. Understanding and managing anxiety about AI is critical to ensuring the more acceptable and effective use of these technologies in society.

The relationship between AI anxiety and literacy

In recent years, research on the influence of AI-based technologies on individual attitudes and their potential applications in education has increased significantly. These studies delve deeply into the impact of AI applications in education and teachers’ anxiety about these technologies [5052]. In particular, research on how teachers can use these technologies more effectively in the classroom environment is solidifying its place as a new field in the literature [53].

Reviews on the impact of AI in education reveal its role in individual education, innovative teaching methods, and technology-supported assessment processes, as well as its potential to improve student-teacher communication [54]. However, the challenges faced in applying AI in education are also a significant research topic. For example, Alam discussed the obstacles to integrating these technologies in education and pointed out the strategies needed to overcome these challenges [55].

The support provided by smart tutoring systems and the facilitation of evaluation processes constitute another dimension of research on how AI can be effectively used in education [56]. Studies in this area explore the role of AI-based tools in classroom interactions, the integration of these tools by teachers, and their use by students. Student attitudes toward AI technology and preservice teachers’ intentions to use AI are also frequently addressed in research in this area [53, 57].

In recent years, as the integration of artificial intelligence technologies in education has accelerated, studies focusing on developing teachers’ competencies in these technologies have gained increasing importance. UNESCO, through its AI Competency Framework, has established an international standard guiding the use of AI in education. This framework emphasizes the need for teachers to enhance their AI literacy and provides a strategic approach to the role of AI tools in educational settings [58]. Additionally, the OECD Guidelines on Education and Teacher Competencies outline teacher competency standards aimed at increasing the use of digital tools in education. These guidelines offer guidance on how teachers can effectively integrate digital tools within pedagogical contexts. In this regard, understanding the relationship between AI anxiety and AI literacy emerges as a critical requirement for improving the effectiveness of technology use in education [59].

Research indicates that teachers with high levels of anxiety toward AI tend to exhibit greater resistance to innovative instructional approaches and show more reluctance to adopt technology-based practices [60]. Studies conducted with mathematics teachers have revealed significant findings regarding the relationship between AI literacy and levels of trust and commitment toward these technologies [61]. Furthermore, it has been determined that mathematics teachers’ levels of AI literacy and their beliefs about technology have a significant influence on their tendency to adopt AI-based tools [62]. These studies suggest that as AI literacy among mathematics teachers increases, their tendency to use technology more effectively also rises; however, in the absence of adequate pedagogical support, emotional variables such as anxiety, trust, or dependence may be affected in different ways.

The relationship between AI anxiety and literacy in education has not yet been sufficiently explored. AI anxiety is often associated with a fear of uncertainty and a loss of control, while AI literacy reflects an individual’s knowledge and experience with these technologies. The acceptance and integration of AI technologies is often directly related to teachers’ knowledge and skills with these technologies [47]. Therefore, increasing teachers’ AI literacy levels may lead to a decrease in their anxiety levels.

The literature suggests that individual variables significantly influence teachers’ AI literacy and their anxiety related to AI. Factors such as gender, age, professional seniority, and attitudes toward AI have been identified as significant determinants influencing levels of AI-related attitudes and anxiety [45, 63–66]. Moreover, watching AI-related films has also been identified as an important variable affecting AI literacy and AI anxiety [9, 67]. These findings suggest that individual variables have a significant impact on understanding the relationship between AI literacy and AI anxiety. Accordingly, the hypotheses proposed within the scope of this study are as follows:

  • H1: Mathematics teachers’ levels of AI literacy and AI anxiety differ according to their age.

  • H2: Mathematics teachers’ levels of AI literacy and AI anxiety differ according to their status of watching AI-related films.

  • H3: There is a negative (inverse) relationship between mathematics teachers’ levels of AI literacy and anxiety and their levels of technology use.

  • H4: There is a negative (inverse) relationship between mathematics teachers’ levels of AI literacy and their levels of AI-related anxiety.

Purpose

In this study, the dimensions of AI literacy and AI anxiety were examined comprehensively within a sample of mathematics teachers. Although the literature includes numerous studies on teachers’ attitudes toward technology, their digital pedagogical competencies, and TPACK levels [47, 68, 69], mixed-method studies that simultaneously address the AI-specific cognitive (literacy) and affective (anxiety) components are considerably limited. The scarcity of empirical studies examining the interaction between these two variables, specifically in the context of mathematics teachers, indicates that the field is still in its developmental stage. In this regard, the present study aims to investigate the relationship between mathematics teachers’ AI literacy and AI anxiety, as well as how these variables are associated with individual factors such as age, technology use, and exposure to AI-related films. Accordingly, the hypotheses (H1–H4) developed in this study aim to test the associations between teachers’ levels of AI literacy and anxiety, as well as their individual characteristics and technology use. Such an examination not only provides insights into teachers’ individual competency levels but also contributes to understanding the psychological and pedagogical barriers encountered during the integration of AI into mathematics education. By revealing how mathematics teachers perceive AI technologies, what types of concerns they experience when interacting with these tools, and how these factors are reflected in their professional practice, this study aims to offer concrete recommendations for teacher education programs and in-service training initiatives.

Method

Research model

This study, which aimed to examine the relationship between mathematics teachers’ anxiety levels toward AI and their literacy levels and to compare anxiety and literacy levels according to various socio-demographic variables, employed a mixed-methods approach. For this purpose, a predictive correlational research model was used in the quantitative part [70]. Predictive correlational studies are research models that aim to determine the existence of co-variation between two or more variables [71]. In the qualitative research process, a case study design was employed to thoroughly examine teachers’ views and experiences regarding the use of AI in education. The collected data were analyzed using the content analysis method [72].

Participants

The study population consisted of mathematics teachers, and the sample comprised 251 mathematics teachers selected through the convenience sampling method, a non-probability sampling technique. The research was conducted during the spring semester of the 2024–2025 academic year with mathematics teachers working in public schools in Türkiye. Convenience sampling was chosen in light of participant accessibility and of time and cost constraints. Reaching mathematics teachers online enabled the researcher to access a large sample in a short period. Moreover, the fact that the study’s target group consisted of a specific professional cohort enhanced the applicability of this sampling method. Convenience sampling offers advantages in terms of practicality and efficiency during the data collection process; however, it also carries the risk of the sample not fully representing the population. Therefore, the findings obtained bear certain limitations in terms of generalizability [73]. Despite this limitation, the inclusion of teachers with varying ages, seniority levels, and levels of technology use increased diversity and supported the reliability of the data. In this context, the convenience sampling method was a rational and functional choice, aligned with the research’s purpose, scope, and accessible participant profile.

In the demographic section, information was collected on teachers’ gender, age level, professional seniority (in years), socioeconomic level, technology usage level, possession of a personal computer, and whether they had watched movies related to AI. This comprehensive demographic profiling aimed to enhance the contextual validity of the findings within specific educational settings [74]. The participating mathematics teachers were coded according to their gender and age as follows: FT28 (female teacher, 28 years old), MT30 (male teacher, 30 years old), and so forth. Descriptive statistics on the socio-demographic variables of the participating teachers are given in Table 1. As Table 1 shows, the vast majority of the mathematics teachers are women. The age groups are fairly evenly distributed; the largest group is aged 36–40, and the smallest is aged 25 and under. In terms of professional seniority, teachers with 16 or more years of experience form the largest group, and those with 1–5 years of experience the smallest. The vast majority perceive themselves as middle income. Most teachers report medium or good levels of technology use, with the fewest at the excellent level, and the vast majority own a personal computer. Finally, most of the teachers have watched movies about AI.

Table 1.

Frequencies and percentages of teachers’ socio-demographic information

Variables Variable subgroups Frequency (f) Percentage (%)
Gender Female 177 70.5
Male 74 29.5
Age Level Aged 25 and under 25 10.0
26–30 39 15.5
31–35 48 19.1
36–40 54 21.5
41–45 44 17.5
Ages 46 and above 41 16.4
Professional seniority (years) 1–5 46 18.3
6–10 53 21.1
11–15 49 19.5
16 and above 103 41.1
Socioeconomic level Low income 14 5.6
Middle income 208 82.9
High income 29 11.5
Technology usage level Middle 79 31.5
Good 121 48.2
Excellent 51 20.3
Having a personal computer Yes 233 92.8
No 18 7.2
Watching movies about AI Yes 177 70.5
No 74 29.5
Total 251 100.0

Data collection process and tools

An online survey was used in the study’s data collection process. In this study, the online survey offered several advantages, including the ability to receive rapid feedback from a large audience with a small budget and to reduce errors in data entry and processing [75]. The researcher sent the survey link to 300 mathematics teachers online and contacted them to request completion of the survey. A total of 251 responses were received. The survey remained open to participants for approximately one week. After data collection was completed, the data were reviewed. Because the data were collected online, there were no missing values. For extreme value analysis, scale scores were converted to Z scores, and no values fell outside the ± 3.00 range; in other words, no extreme values were found. As a result of the missing value and extreme value analyses, no data were removed, and the study was conducted with 251 mathematics teachers.
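As an illustration (not the authors’ actual analysis code), the extreme value screening described above, converting scale scores to Z scores and flagging cases outside ± 3.00, can be sketched as follows; the scores below are hypothetical:

```python
# Illustrative sketch with made-up scale scores: convert to Z scores
# and flag any case whose |z| exceeds 3.00, as done in the study.

def z_scores(values):
    """Z score of each value: (x - mean) / sd (population sd)."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((x - mean) ** 2 for x in values) / n) ** 0.5
    return [(x - mean) / sd for x in values]

def extreme_values(values, cutoff=3.0):
    """Indices of cases whose Z score falls outside +/- cutoff."""
    return [i for i, z in enumerate(z_scores(values)) if abs(z) > cutoff]

scores = [3.2, 3.6, 2.9, 4.1, 3.8, 3.0, 3.5, 2.7, 3.9, 3.3]
print(extreme_values(scores))  # -> [] (no extreme values in this sample)
```

With no indices returned, no cases would be removed, mirroring the screening outcome reported for the 251 participants.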

The data collection tool for the research consists of an AI literacy scale, an anxiety scale regarding AI, and an open-ended question: “What are your anxieties about the integration and impact of AI technology on educational processes?” The primary reason for using only a single open-ended question to collect qualitative data in this study is that the qualitative component serves a complementary (supportive) function. The primary objective of this research is to quantitatively investigate the relationship between mathematics teachers’ anxiety levels regarding AI and their levels of AI literacy. Accordingly, qualitative data were utilized solely to obtain in-depth explanatory insights into teachers’ perspectives regarding their anxieties.

AI literacy scale

Polatgil and Güler adapted the Artificial Intelligence Literacy Scale (AILS) to measure the AI literacy of adults [76]. The scale consists of 12 items, and the adaptation study was conducted with data collected from 536 people aged 18–60. The research confirmed that the scale has a 4-dimensional structure explaining 92.24% of the total variance. Within the scope of the reliability study, the Cronbach Alpha (α) reliability coefficient for the scale as a whole was 0.939, and the confirmatory factor analysis fit indices exceeded the recommended thresholds. The scale consists of four dimensions, each with three items, presented in order; for example, the first three items belong to the awareness dimension, and the subsequent three items pertain to usage. The dimensions are awareness, usage, evaluation, and ethics. The items are rated on a 5-point Likert scale. The results of the confirmatory factor analysis for the AILS indicate that the model fits the data well, χ²(54) = 302.66, p < .001. The fit indices were as follows: CFI = 0.968, TLI/NNFI = 0.955, NFI = 0.952, and GFI = 0.976. The root mean square error of approximation (RMSEA) was 0.086 [90% CI = 0.060–0.112], and the standardized root mean square residual (SRMR) was 0.090; these values fall within the acceptable range for model fit. The standardized factor loadings ranged from 0.35 to 0.87, and the error variances ranged from 0.24 to 0.88. The findings confirm the factor structure of the scale and demonstrate that the model is consistent with the theoretical framework.

AI anxiety scale

The Artificial Intelligence Anxiety Scale (AIAS), developed by Wang and Wang [43], was adapted into Turkish by Akkaya and colleagues [77]. To determine the construct validity of the scale, Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA) were conducted. The Kaiser-Meyer-Olkin (KMO) value was found to be 0.892, and Bartlett’s test yielded a chi-square value of χ² = 2847.749 (p = .000). The factor structure obtained from the EFA was tested using CFA, and the four-factor structure of the AIAS was confirmed. The goodness-of-fit indices (χ² = 260.120, df = 99, χ²/df = 2.627, NFI = 0.923, CFI = 0.950, RFI = 0.906, IFI = 0.951, TLI = 0.940, RMSEA = 0.078, p = .000) indicated acceptable model fit. To assess the reliability of the scale, internal consistency coefficients were calculated. The Cronbach’s alpha values were α = 0.937 for the overall scale, α = 0.948 for the Learning subscale, α = 0.895 for the Job Replacement subscale, α = 0.875 for the Sociotechnical Blindness subscale, and α = 0.950 for the AI Configuration subscale, indicating high internal consistency. The Turkish version of the AIAS consists of 16 items and four subscales. All findings support the reliability and validity of the Turkish form of the scale as a measurement instrument. The scale has been utilized by numerous researchers abroad [78–80].

Reliability and normality analysis regarding data collection tools

To provide evidence for the reliability and normality of the measurements obtained from the AILS and AIAS within the scope of the research, the Cronbach alpha coefficient was calculated and the skewness and kurtosis coefficients were examined. The findings are given in Table 2. When Table 2 is examined, it is observed that the skewness values of the measurements obtained from the measurement tools and sub-dimensions fall within the range of ± 3.00, and the kurtosis values fall within the range of ± 10.00. This finding indicates that the values exhibit a normal distribution [81]. Accordingly, it was assumed that the measurement tools and sub-dimensions were normally distributed, and parametric statistics were used in the difference analyses. The reliability coefficients obtained for the measurement tools and sub-factors ranged from 0.624 to 0.950. In the literature, a reliability value of 0.70 and above is considered a high level of reliability [71]. However, values between 0.60 and 0.70 are also stated to represent an acceptable level of reliability [82]. In this context, it was concluded that the reliability values obtained from the measurement tools generally meet acceptable and high reliability levels, and the total scores demonstrate high reliability.

Table 2.

Descriptive statistics, skewness and kurtosis values, and reliability analysis of measurement tools

AILS
Min. Max. Mean SD Skewness Kurtosis Cronbach α
Awareness 1.00 5.00 3.77 0.81 -0.45 -0.06 0.627
Usage 1.00 5.00 3.61 0.85 -0.43 -0.04 0.624
Evaluation 1.00 5.00 3.91 0.99 -0.86 0.42 0.908
Ethics 1.00 5.00 3.09 0.74 0.09 0.23 0.717
Total score 1.00 5.00 3.60 0.67 -0.79 0.46 0.811
AIAS
Min. Max. Mean SD Skewness Kurtosis Cronbach α
Learning 1.00 5.00 2.20 1.07 1.07 0.52 0.946
Job change 1.00 4.00 2.23 0.90 0.17 -0.98 0.875
Sociotechnical blindness 1.00 5.00 3.19 1.07 -0.43 -0.41 0.860
AI configuration 1.00 5.00 2.77 1.29 0.34 -1.07 0.944
Total score 1.00 5.00 2.70 0.96 0.24 -0.47 0.950

Data analysis

During the data analysis, the findings obtained by applying various statistical methods were thoroughly examined. The mean values of the AILS measurements show that both the sub-dimensions and the total score exceed the scale midpoint of 3.00, indicating that the teachers' AI literacy levels are above average. The mean values of the AIAS measurements show that both the sub-dimensions and the total score fall below the midpoint of 3.00, indicating that the teachers' anxiety levels are below average; only the sociotechnical blindness sub-dimension was above the midpoint. The socio-demographic variables in the dataset were then summarized using descriptive statistics, and frequency and percentage distributions were calculated.

Skewness and kurtosis statistics were considered when evaluating the normality assumption of the data set. Reliability analysis was performed to evaluate the measurement tools, with Cronbach's alpha (α) used as the internal consistency coefficient. An independent samples t-test was applied to compare two independent groups, and Cohen's d was calculated to assess the practical significance of significant differences. The following cut-off scores were used in interpreting Cohen's d: 0.00–0.29 negligible, 0.30–0.49 small, 0.50–0.79 medium, and 0.80 and above large. Additionally, a one-way analysis of variance (ANOVA) [73] was conducted to examine differences between multiple groups. Correlation analysis was used to determine the general structure and direction of the relationships between variables; the Pearson correlation coefficient was calculated, and the statistical significance of the relationships was evaluated. JASP (version 0.18.1.0) and SPSS (version 25) statistical packages were used for all analyses. The level of statistical significance was set at 0.05, and p-values below this threshold were considered significant. In addition, a regression analysis was conducted to examine the effects of the AI literacy sub-dimensions on teachers' AI anxiety. Finally, the teachers' responses to the open-ended questions were analyzed and summarized using content analysis. Two faculty members with expertise in mathematics education conducted the content analysis and reached consensus. Direct quotes from teachers' opinions were also included.
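The effect-size step described above can be made concrete with a short sketch. This is a minimal, hypothetical example (not the authors' SPSS/JASP code): it computes a pooled-SD Cohen's d for two independent groups and applies the cut-offs used in the study.

```python
from statistics import mean, variance

def cohens_d(g1, g2):
    """Cohen's d for two independent groups, using the pooled sample SD."""
    n1, n2 = len(g1), len(g2)
    pooled_sd = (((n1 - 1) * variance(g1) + (n2 - 1) * variance(g2))
                 / (n1 + n2 - 2)) ** 0.5
    return (mean(g1) - mean(g2)) / pooled_sd

def interpret_d(d):
    """Cut-offs used in the study: <0.30 negligible, <0.50 small, <0.80 medium, else large."""
    d = abs(d)
    if d < 0.30:
        return "negligible"
    if d < 0.50:
        return "small"
    if d < 0.80:
        return "medium"
    return "large"

# Hypothetical AI literacy scores for two groups of teachers:
group_a = [3.9, 4.1, 3.6, 4.3, 3.8, 4.0]
group_b = [3.2, 3.5, 3.1, 3.7, 3.4, 3.3]
d = cohens_d(group_a, group_b)
print(round(d, 2), interpret_d(d))
```

Applied to the study's own reported values, this interpretation maps, for example, d = 0.77 to a medium effect and d = 0.46 to a small one.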

Results

The findings are reported in sequence, following the research questions aligned with the study's purpose. After confirming that the measurement tools and sub-dimensions showed normal distributions and acceptable reliability, difference analyses were carried out. Accordingly, differences in teachers' AI literacy and AI anxiety levels were examined in relation to age, watching movies about AI, and technology usage level.

Firstly, a one-way analysis of variance was performed to determine differences in mathematics teachers' AI literacy and anxiety levels according to age level, and the findings are presented in Tables 3 and 4. Table 3 shows that AI literacy levels differ significantly by age (F(5, 245) = 2.65; p < 0.05). To determine the source of the difference, the Bonferroni test (chosen because the homogeneity of variances was ensured) was performed; it showed that the AI literacy levels of teachers aged 46 and above were lower than those of teachers aged 31–35. When the sub-dimensions of the AI literacy scale were examined by age, the usage sub-dimension showed a statistically significant difference (p < .05); the Bonferroni test again indicated that the AI usage levels of teachers aged 46 and above were lower than those of teachers aged 31–35.
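The ANOVA-plus-Bonferroni procedure used here can be sketched as follows. This is a simplified illustration with invented group data, not the study's analysis; it computes the F statistic and the Bonferroni-adjusted alpha applied to pairwise follow-up tests.

```python
from statistics import mean
from itertools import combinations

def one_way_anova(groups):
    """Return (F, df_between, df_within) for a list of independent groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - mean(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k)), k - 1, n - k

# Hypothetical AI literacy scores for three age groups:
groups = [[3.9, 4.1, 3.7, 4.0], [3.6, 3.8, 3.5, 3.7], [3.1, 3.3, 3.0, 3.2]]
F, df_b, df_w = one_way_anova(groups)
print(round(F, 2), df_b, df_w)

# Bonferroni correction: each of the k*(k-1)/2 pairwise comparisons is
# evaluated against alpha / m instead of alpha.
m = len(list(combinations(range(len(groups)), 2)))
print("Bonferroni-adjusted alpha:", 0.05 / m)
```

With six age groups, as in the study, there are 15 pairwise comparisons, so the adjusted alpha becomes 0.05 / 15 ≈ 0.0033 for each comparison.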

Table 3.

Comparison of teachers’ AI literacy level by age

Variable Age N Mean SD df F p Difference
Awareness Aged 25 and under (1) 25 4.00 0.70 5-245 2.67 0.023 --
26–30 (2) 39 3.73 0.86
31–35 (3) 48 3.98 0.66
36–40 (4) 54 3.77 0.87
41–45 (5) 44 3.79 0.75
Ages 46 and above (6) 41 3.42 0.87
Usage Aged 25 and under (1) 25 4.12 0.79 5-245 3.66 0.003 3 ˃ 6
26–30 (2) 39 3.67 0.91
31–35 (3) 48 3.77 0.83
36–40 (4) 54 3.46 0.73
41–45 (5) 44 3.55 0.80
Ages 46 and above (6) 41 3.33 0.93
Evaluation Aged 25 and under (1) 25 4.17 0.93 5-245 1.96 0.085 --
26–30 (2) 39 3.86 1.00
31–35 (3) 48 4.15 0.80
36–40 (4) 54 3.83 1.08
41–45 (5) 44 3.95 1.03
Ages 46 and above (6) 41 3.58 1.00
Ethics Aged 25 and under (1) 25 2.99 0.75 5-245 1.39 0.229 --
26–30 (2) 39 3.20 0.50
31–35 (3) 48 3.26 0.75
36–40 (4) 54 2.92 0.68
41–45 (5) 44 3.05 0.82
Ages 46 and above (6) 41 3.15 0.88
Total score Aged 25 and under (1) 25 3.82 0.66 5-245 2.65 0.024 3 ˃ 6
26–30 (2) 39 3.61 0.68
31–35 (3) 48 3.79 0.59
36–40 (4) 54 3.50 0.63
41–45 (5) 44 3.59 0.63
Ages 46 and above (6) 41 3.37 0.77

Table 4.

Comparison of teachers’ AI anxiety levels by age

Variable Age N Mean SD df F p Difference
Learning Aged 25 and under (1) 25 1.75 0.72 5-245 5.52 0.000 5 ˃ 1
26–30 (2) 39 2.00 0.84
31–35 (3) 48 1.76 0.92
36–40 (4) 54 2.35 1.06
41–45 (5) 44 2.60 1.27
Ages 46 and above (6) 41 2.55 1.12
Job change Aged 25 and under (1) 25 1.81 0.90 5-245 2.67 0.023 6 ˃ 1
26–30 (2) 39 2.29 0.86
31–35 (3) 48 2.07 0.84
36–40 (4) 54 2.17 0.87
41–45 (5) 44 2.41 0.95
Ages 46 and above (6) 41 2.50 0.90
Sociotechnical blindness Aged 25 and under (1) 25 2.76 1.06 5-245 1.40 0.226 --
26–30 (2) 39 3.15 1.13
31–35 (3) 48 3.09 1.10
36–40 (4) 54 3.24 0.94
41–45 (5) 44 3.38 1.05
Ages 46 and above (6) 41 3.35 1.14
AI configuration Aged 25 and under (1) 25 2.15 1.17 5-245 2.23 0.054 --
26–30 (2) 39 2.78 1.27
31–35 (3) 48 2.71 1.22
36–40 (4) 54 2.64 1.22
41–45 (5) 44 3.08 1.45
Ages 46 and above (6) 41 3.05 1.27
Total score Aged 25 and under (1) 25 2.21 0.86 5-245 3.56 0.004 5;6 ˃ 1
26–30 (2) 39 2.65 0.88
31–35 (3) 48 2.48 0.86
36–40 (4) 54 2.72 0.90
41–45 (5) 44 2.99 1.09
Ages 46 and above (6) 41 2.98 1.01

When Table 4 is examined, it is seen that mathematics teachers' anxiety levels towards AI differ significantly according to age level (F(5, 245) = 3.56, p < 0.05). A Bonferroni test performed to determine the source of the difference showed that the anxiety levels of teachers aged 41–45 and 46 and above were higher than those of teachers aged 25 and below. Among the sub-dimensions of the AI anxiety scale, the learning and job change sub-dimensions differed significantly by age (p < .05). The Bonferroni test showed that, for learning anxiety, teachers aged 41–45 had higher levels than teachers aged 25 and below, while for job change anxiety, teachers aged 46 and above had higher levels than teachers aged 25 and below.

After examining the differences in AI literacy and anxiety levels among mathematics teachers by age level, the study examined differences according to their status of watching AI-related movies; the findings are presented in Tables 5 and 6.

Table 5.

Comparison of teachers’ AI literacy level according to their AI-related movie-watching status

Variable Watching movies N Mean SD df t p d
Awareness Yes 177 3.92 0.80 249 4.67 0.000 0.65
No 74 3.42 0.71
Usage Yes 177 3.79 0.78 249 5.50 0.000 0.76
No 74 3.18 0.87
Evaluation Yes 177 4.11 0.95 249 5.31 0.000 0.74
No 74 3.42 0.91
Ethics Yes 177 3.13 0.79 249 1.24 0.216 --
No 74 3.00 0.61
Total score Yes 177 3.74 0.63 249 5.54 0.000 0.77
No 74 3.26 0.63

Table 6.

Comparison of teachers’ anxiety level towards AI according to watching AI-related movies

Variable Watching movies N Mean SD df t p d (95% CI)
Learning Yes 177 2.08 1.05 249 2.63 0.009 0.36 (0.09–0.64)
No 74 2.47 1.09
Job change Yes 177 2.11 0.85 249 3.37 0.001 0.47 (0.19–0.74)
No 74 2.52 0.94
Sociotechnical blindness Yes 177 3.11 1.04 249 1.91 0.057 --
No 74 3.39 1.13
AI configuration Yes 177 2.58 1.22 249 3.62 0.000 0.50 (0.23–0.77)
No 74 3.22 1.36
Total score Yes 177 2.57 0.91 249 3.35 0.001 0.46 (0.19–0.74)
No 74 3.01 1.02

The data in Table 5 show that mathematics teachers' AI literacy levels differ significantly according to whether they have watched a movie about AI (t(249) = 5.54, p < 0.05). The means show that teachers who have watched a movie about AI have higher AI literacy than those who have not, with a medium effect size (d = 0.77). All sub-dimensions of the AI literacy scale except ethics also differ significantly (p < 0.05), again with higher means for teachers who have watched an AI-related movie and medium effect sizes (d = 0.65–0.76).

When Table 6 is examined, it is seen that mathematics teachers' anxiety levels towards AI differ significantly according to whether they have watched movies about AI (t(249) = 3.35; p < 0.05): teachers who have not watched such movies report higher anxiety, with a small effect size (d = 0.46). Among the sub-dimensions, all except sociotechnical blindness differ significantly (p < 0.05), with the means of teachers who have not watched AI-related movies again higher; the effect sizes range from small to medium (d = 0.36–0.50).
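The 95% confidence intervals reported alongside d in Table 6 can be approximated from the group sizes using the standard large-sample formula for the standard error of d. The sketch below (not the authors' code) reproduces the total-score interval from the reported d = 0.46 and the group sizes n1 = 177 and n2 = 74.

```python
def d_ci(d, n1, n2, z=1.96):
    """Approximate 95% CI for Cohen's d via its large-sample standard error."""
    se = ((n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2))) ** 0.5
    return d - z * se, d + z * se

# Total-score comparison from Table 6: d = 0.46, watched n = 177, not watched n = 74.
low, high = d_ci(0.46, 177, 74)
print(round(low, 2), round(high, 2))  # close to the reported interval 0.19-0.74
```

Small discrepancies against the table are expected, since the table's values are themselves rounded.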

The differences in mathematics teachers' AI literacy and anxiety levels according to their level of technology use were examined, and the findings are presented in Tables 7 and 8. According to Table 7, AI literacy levels differ significantly by technology usage level (F(2, 248) = 26.11; p < 0.05). The Bonferroni test showed that teachers with an excellent level of technology use had higher AI literacy than those with good or medium levels, and teachers with a good level scored higher than those with a medium level; that is, literacy towards AI rises as technology use increases. The same pattern was found for all sub-dimensions except ethics (p < 0.05).

Table 7.

Comparison of teachers’ AI literacy level according to technology usage level

Variable Technology usage level N Mean SD df F p Difference
Awareness Medium (1) 79 3.38 0.66 2-248 23.72 0.000 3 ˃ 1;2, 2 ˃ 1
Good (2) 121 3.80 0.83
Excellent (3) 51 4.30 0.64
Usage Medium (1) 79 3.19 0.77 2-248 30.40 0.000 3 ˃ 1;2, 2 ˃ 1
Good (2) 121 3.61 0.81
Excellent (3) 51 4.26 0.66
Evaluation Medium (1) 79 3.49 0.92 2-248 19.35 0.000 3 ˃ 1;2, 2 ˃ 1
Good (2) 121 3.93 0.99
Excellent (3) 51 4.52 0.77
Ethics Medium (1) 79 3.08 0.71 2-248 0.99 0.373 --
Good (2) 121 3.05 0.73
Excellent (3) 51 3.22 0.83
Total score Medium (1) 79 3.28 0.60 2-248 26.11 0.000 3 ˃ 1;2, 2 ˃ 1
Good (2) 121 3.60 0.65
Excellent (3) 51 4.08 0.50

Table 8.

Comparison of teachers’ AI anxiety level according to technology usage level

Variable Technology usage level N Mean SD df F p Difference
Learning Medium (1) 79 2.42 0.99 2-248 2.54 0.081 --
Good (2) 121 2.08 0.91
Excellent (3) 51 2.13 1.45
Job change Medium (1) 79 2.47 0.89 2-248 4.39 0.013 1 ˃ 2;3
Good (2) 121 2.15 0.86
Excellent (3) 51 2.05 0.93
Sociotechnical blindness Medium (1) 79 3.36 1.03 2-248 1.44 0.238 --
Good (2) 121 3.10 1.05
Excellent (3) 51 3.14 1.17
AI configuration Medium (1) 79 2.95 1.19 2-248 1.29 0.278 --
Good (2) 121 2.65 1.30
Excellent (3) 51 2.78 1.41
Total score Medium (1) 79 2.92 0.94 2-248 3.05 0.049 1 ˃ 2
Good (2) 121 2.59 0.87
Excellent (3) 51 2.62 1.16

When Table 8 is examined, it is seen that mathematics teachers' anxiety levels towards AI differ significantly according to technology usage level (F(2, 248) = 3.05, p < 0.05). A Bonferroni test performed to determine the source of the difference showed that teachers with a medium level of technology use had higher AI anxiety than those with a good level. Accordingly, anxiety towards AI decreases as technology use increases.

When the sub-dimensions of the AI anxiety scale were examined by technology usage level, no statistically significant differences were found except for the job change sub-dimension (p > 0.05). The Bonferroni test showed that teachers with a medium level of technology use had higher job change anxiety than those with good or excellent levels, again indicating that anxiety towards AI decreases as technology use increases.

Next, the relationship between these two variables was examined, and the results are presented in Table 9. Table 9 shows that the relationship between teachers' AI literacy levels and their anxiety levels towards AI is statistically significant, small, and negative (p < 0.05): mathematics teachers with higher AI literacy report lower AI anxiety. As shown in Table 9, the sub-dimensions of AI literacy (awareness, usage, evaluation, and ethics) correlate strongly and positively with the total literacy score (r = .50–.91, p < .001), supporting the computation of a total "Artificial Intelligence Literacy" score. Similarly, the sub-dimensions of AI anxiety (learning, job change, sociotechnical blindness, and AI configuration) correlate very strongly with the total anxiety score (r = .79–.92, p < .001), so a single total "Artificial Intelligence Anxiety" score is likewise meaningful. A significant negative correlation was found between the two total scores (r = −.20, p < .001), providing a suitable basis for regression analysis. Accordingly, in the regression analysis, AI anxiety was treated as the dependent variable, and the sub-dimensions of AI literacy were included as predictors.
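The Pearson coefficient underlying Table 9 can be illustrated with a short sketch; the teacher scores below are hypothetical, not the study data, and are chosen so that literacy and anxiety move in opposite directions.

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    ss_x = sum((a - mx) ** 2 for a in x)
    ss_y = sum((b - my) ** 2 for b in y)
    return cov / (ss_x * ss_y) ** 0.5

# Hypothetical AI literacy and AI anxiety scores for six teachers:
literacy = [3.2, 3.8, 4.1, 2.9, 3.6, 4.3]
anxiety = [3.0, 2.6, 2.5, 3.4, 2.4, 2.1]
print(round(pearson_r(literacy, anxiety), 2))  # negative: higher literacy, lower anxiety
```

In the study itself this relationship is much weaker (r = −.20) but points in the same direction.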

Table 9.

Relationship between literacy and anxiety

Variables 1. 2. 3. 4. 5. 6. 7. 8. 9.
1. Awareness --
2. Usage 0.70** --
3. Evaluation 0.75** 0.70** --
4. Ethics 0.09 0.24** 0.31** --
5. AI literacy 0.83** 0.86** 0.91** 0.50** --
6. Learning −0.23** −0.43** −0.18* −0.14** −0.31** --
7. Job change −0.18* −0.29** −0.13* 0.05 −0.18** 0.61** --
8. Sociotechnical blindness −0.04 −0.16* 0.09 0.04 −0.02 0.42** 0.78** --
9. AI configuration −0.14* −0.23** −0.07 0.05 −0.13 0.57** 0.79** 0.77** --
10. AI anxiety −0.18* −0.34** −0.09 −0.01 −0.20 0.79** 0.92** 0.84** 0.89**

According to the results of the multiple linear regression analysis presented in Table 10, among the sub-dimensions of AI literacy, only usage (β = −0.53, t = −5.91, p < .001) and evaluation (β = 0.30, t = 3.01, p = .003) had a significant effect on AI anxiety; awareness (p = .722) and ethics (p = .667) were not significant predictors. The overall model was statistically significant (F(4, 246) = 11.49, p < .001). The variance inflation factor (VIF) values ranged from 1.19 to 2.90, and tolerance values were above 0.34, indicating no multicollinearity problem. These findings suggest that teachers' level of AI usage significantly reduces their anxiety, whereas their evaluation skills may somewhat increase it.
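As a quick consistency check on the collinearity diagnostics, note that VIF and tolerance are reciprocals: VIF_j = 1 / (1 − R²_j) and tolerance_j = 1 − R²_j. The sketch below verifies this against the values reported in Table 10; small discrepancies reflect rounding in the table.

```python
# (VIF, tolerance) pairs as reported in Table 10:
reported = {"Awareness": (2.81, 0.36), "Usage": (2.35, 0.43),
            "Evaluation": (2.90, 0.34), "Ethics": (1.19, 0.84)}

for predictor, (vif, tol) in reported.items():
    # VIF = 1 / (1 - R^2 of this predictor on the others) = 1 / tolerance.
    print(f"{predictor}: reported VIF {vif}, 1/tolerance = {1 / tol:.2f}")
```

All four pairs agree to within rounding error, and all VIFs sit well below the common cut-off of 10, consistent with the authors' conclusion that multicollinearity is not a concern.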

Table 10.

Variables influencing AI anxiety

Predictors B 95% CI β t p F (p) VIF Tolerance
Lower Upper
Constant 3.77 3.08 4.45 -- 10.86 0.000 11.49 (0.000) -- --
Awareness −0.04 −0.27 0.19 −0.03 −0.36 0.722 2.81 0.36
Usage −0.60 −0.80 −0.40 −0.53 −5.91 0.000 2.35 0.43
Evaluation 0.29 0.10 0.48 0.30 3.01 0.003 2.90 0.34
Ethics 0.04 −0.13 0.20 0.03 0.43 0.667 1.19 0.84

Mathematics teachers’ views on their anxiety about the integration and impact of AI technology in education

Mathematics teachers' opinions regarding their anxieties about the integration and impact of AI technology in education are presented in Table 11.

Table 11.

Views of mathematics teachers on their anxieties about the integration and impact of AI technology in education

Dimensions Codes f %
Regarding the teacher Unemployment anxiety 21 8
Change anxiety 13 5
Anxiety about the weakening of the teacher’s role 67 27
Anxiety about the inability to use technology 49 20
Regarding the student Anxiety about excessive dependency 47 19
Anxiety about not being able to use technology 34 14
Anxiety about educational inequality 24 10
Anxiety about data privacy 12 5
Anxiety about the weakening of the student-teacher relationship 34 14
Decreased independent thinking and problem-solving skills 77 31
Technical deficiencies Anxiety about the inability to properly integrate AI into education 17 7
Access constraints 57 23

An examination of Table 11 reveals that teachers are most concerned that excessive use of AI will diminish students' independent thinking and problem-solving skills. Other prominent anxieties concern the weakening of the teacher's role and potential access constraints. These are followed by anxieties about teachers' and students' inability to use technology, students becoming overly dependent on AI, the failure to integrate AI into education properly, the weakening of teacher-student relationships, data privacy, and teachers' ability to adapt to change.

Some teacher statements related to this category are as follows:

“The increased use of AI in classrooms could weaken students’ independent thinking and problem-solving skills. I am anxious that they will become overly dependent on technology.” (FT41).

“Will AI replace teachers? I am worried that our job security as teachers will be jeopardized. I am still very cautious about these technologies.” (MT35).

“I have serious concerns about data security and student privacy. There is no clear information about how the data collected by AI will be used and to whom it will be made available.” (MT49).

“Implementing AI in classrooms seems quite complex. I am not sure we will receive adequate training to adapt to new technologies.” (FT43).

“How much will AI change the teacher’s role in teaching? My relationships with students may become even more superficial in the digital environment.” (FT38).

“Some students do not have access to these technologies. This could increase inequality in education. Whether AI will be accessible to all students is a major question.” (FT40).

“If everything is technology-based, how can students solve problems they face in the real world? I feel like technology is taking the teaching process away from a human perspective.” (FT42).

“As AI’s role in the classroom increases, students’ creativity and critical thinking skills may be hindered. I feel these technologies could undermine the teacher’s role as a guide.” (FT29).

“Technology-based education systems could cause teachers to move away from traditional methodologies. I wonder if this could harm their professional identities.” (MT45).

“The use of AI raises many privacy concerns regarding data collection and processing. We want to know who uses student data and how it is secured.” (FT43).

“The implementation of AI in classrooms has called into question the central role of teachers in education. However, it is clear that teachers need not only technology but also a humanistic approach.” (FT54).

“Integrating AI into the teaching process can complicate the education system. It is unclear how teachers will adapt to new technologies and whether they will receive adequate support in this process.” (MT33).

“Not all students have equal access to such technologies. Technology-based education can lead to greater inequality for some students.” (FT49).

“Including technology in the teaching process can make human interactions more superficial. Forming deeper connections with students can be difficult in the digital environment.” (FT32).

Discussion

This study was conducted to identify the structural relationships between mathematics teachers' AI literacy and their AI anxiety. The findings regarding the theoretical hypotheses tested within the scope of the study are summarized in Table 12.

Table 12.

Findings related to the hypotheses on mathematics teachers’ AI literacy and anxiety

Hypothesis Description/ Propositions Result/ Supported?
H1 Mathematics teachers’ levels of AI literacy and AI anxiety differ according to their age. ✅ Supported (Significant differences were observed according to age).
H2 Mathematics teachers’ levels of AI literacy and AI anxiety differ according to their level of exposure to AI-related films. ✅ Supported (Significant differences were observed based on film-watching habits)
H3 Mathematics teachers' level of technology use is positively related to their AI literacy and negatively related to their AI anxiety. ✅ Supported (High technology use → higher AI literacy and lower anxiety)
H4 There is a negative (inverse) relationship between mathematics teachers’ levels of AI literacy and their levels of anxiety related to AI. ✅ Supported (AI literacy ↑ → anxiety ↓)

Examining our first hypothesis regarding the age variable revealed that older mathematics teachers tend to have lower levels of AI literacy and higher levels of anxiety related to AI (p < .05). In particular, older teachers may experience difficulties adapting to the pedagogical and technical innovations introduced by AI technologies [83, 84]. Given the abstract and analytical nature of mathematics education, technology integration becomes more complex, and older teachers may therefore face greater challenges in adopting new digital tools in classroom settings. The increase in anxiety levels with age, especially regarding job displacement concerns, highlights the psychological barriers encountered during the digital transformation process [85]. These findings underscore the need for developing targeted support mechanisms and professional development programs tailored to the needs of experienced mathematics teachers, taking into account age-related differences.

Within the scope of the second hypothesis of the study, a significant relationship was found between mathematics teachers’ artificial intelligence literacy, anxiety levels, and their habits of watching artificial intelligence-themed movies (p < .05). Specifically, teachers who watched these films demonstrated higher levels of AI literacy. Such films present complex and abstract AI concepts through concrete examples and narratives, facilitating teachers’ understanding of the technology [86]. On the other hand, some films exaggerate the potential threats of AI, which can increase anxiety [43]. Therefore, the selective and conscious use of AI-themed media content is crucial in shaping mathematics teachers’ awareness and attitudes positively. Given the abstract nature of mathematics, AI-themed films that incorporate visual and narrative elements can serve not only as entertainment but also as alternative learning environments that deepen teachers’ technological awareness, stimulate critical thinking processes, and contribute to the development of an interdisciplinary learning culture.

The findings obtained within the scope of the third hypothesis indicate that mathematics teachers with higher levels of technology use exhibit greater AI literacy and lower levels of AI-related anxiety (p < .05). This suggests that exposure to technology is a decisive factor in understanding AI technologies and shaping attitudes toward them [87, 88]. The literature suggests that individuals who are intensively exposed to technology tend to exhibit higher digital literacy and lower anxiety [89, 90]. Experiential learning can transform mathematics teachers’ perceptions of AI, thereby reducing anxiety stemming from uncertainty [91]. In particular, repeated interactions with technology in mathematics education facilitate the modeling of complex concepts and the concretization of abstract thinking, thereby enhancing pedagogical effectiveness [92]. The integration of AI technologies into education not only requires mathematics teachers to employ technology as an effective pedagogical tool but also necessitates that they act with heightened social and ethical responsibility [93]. Given its capacity to support problem-solving and analytical thinking, the strategic use of AI can play a crucial role in fostering students’ independent cognitive skills [94]. Moreover, the digital transformation of education entails more than mere technological adoption; it demands a fundamental reconfiguration of teachers’ pedagogical approaches to fully leverage the potential of AI in learning environments [95].

Within the scope of the fourth hypothesis, it was observed that as mathematics teachers’ AI literacy levels increase, their anxiety toward AI significantly decreases. This finding suggests that teachers’ mastery of and experience with technology play a significant role in mitigating sources of uncertainty and anxiety [96, 97]. On the other hand, the evaluative aspect of AI can contribute to increased anxiety, reflecting teachers’ responsibility to use technology correctly from both pedagogical and ethical perspectives. This underscores the need for digital literacy to encompass not only technical knowledge but also pedagogical and ethical competencies [98]. While higher levels of AI usage among teachers significantly reduce anxiety, the assessment dimension may elevate anxiety to some extent due to the responsibilities and attentiveness it requires. Furthermore, the duration of exposure to technology and the level of experience are key factors in strengthening AI literacy [99]. Direct and repeated interactions with technology facilitate the concretization of abstract and analytical mathematical concepts [100], support effective pedagogical implementation, and help alleviate anxiety [101]. In this context, the negative relationship between AI literacy and anxiety highlights the importance of targeted and comprehensive professional development programs for teachers to ensure the successful digital transformation of mathematics education.

Mathematics teachers’ concerns and pedagogical implications regarding AI integration

The qualitative findings of the study indicate that mathematics teachers hold various concerns regarding the integration of AI. Teachers are particularly worried that students’ independent thinking and problem-solving skills may weaken as a result of excessive reliance on AI. These skills hold central importance in mathematics education. Problem-solving and critical thinking are indispensable for the internalization and transfer of mathematical concepts [102]. When students begin to passively consume the ready-made information provided by technology instead of taking an active role in the learning process, their abilities to generate ideas, analyze, and question may gradually deteriorate. This situation may prevent learning from progressing beyond a mere knowledge acquisition level, thereby hindering the development of students’ creative and analytical skills.

Furthermore, teachers express concerns that AI may undermine the role of teachers and reduce teacher–student interaction. In mathematics instruction, guidance and feedback processes play a critical role in helping students comprehend abstract concepts [103]. Teachers’ concerns suggest that these processes may be overshadowed by technology [104]. Moreover, as technology assumes a central position in the classroom, the teacher’s guiding and mentoring role may become diminished. This, in turn, could weaken teacher–student interaction and relegate the humanistic and social dimensions of education to a secondary position.

However, some teachers emphasize that the use of AI may exacerbate inequalities and that certain students may face a lack of access to technology. This highlights once again the importance of ensuring equity and accessibility in mathematics education [105]. In particular, the lack of technological access in developing regions may further deepen educational disparities [106]. Inequality in educational opportunities can hinder students’ access to AI-based resources, thereby contributing to the widening of the digital divide. This issue is especially evident in schools located in low-income and rural areas [107]. Teachers express concerns about the lack of necessary infrastructure to enable the effective use of AI, which in turn undermines their confidence in digital technologies.

Mathematics teachers also highlighted technical concerns. They expressed concerns about issues such as the inability to integrate AI effectively into the classroom, the ineffective use of technology, or insufficient training. In the context of mathematics, the pedagogically appropriate use of AI tools—such as conceptual visualizations, dynamic geometry software, and problem-solving simulations—is crucial. Therefore, professional development programs should support teachers not merely by introducing technological tools, but by providing examples that connect mathematical content with pedagogical applications [108].

Concerns regarding data security and ethical use are also frequently expressed by teachers. The use of AI in education raises significant ethical and legal concerns regarding the protection of student data. As AI technologies collect students’ personal information, uncertainties persist regarding how these data will be used and to whom they will be disclosed [109]. Teachers emphasize that the data collected by AI systems should be managed transparently. Moreover, it has been observed that teachers often lack sufficient training in digital ethics, which undermines their trust in technology [110]. This situation highlights the need for developing transparent data management policies and robust ethical standards to ensure the safe and responsible use of AI in education.

These findings confirm the need to enhance mathematics teachers’ AI literacy, strengthen pedagogical support, and provide ethical training. To reduce teachers’ concerns and enable the effective use of AI, it is recommended to clearly demonstrate the pedagogical benefits of AI in mathematics lessons (e.g., problem-solving simulations, conceptual visualizations), develop applications that support students’ independent and critical thinking skills, provide solutions that minimize issues of access and equity, and offer guidance on data security and ethical use.

Conclusion

The study’s findings indicate that teachers’ AI literacy is a key factor in significantly reducing AI-related anxiety. Therefore, professional development programs should focus not only on the technological features of AI but also on its tangible pedagogical benefits in the teaching process. This approach can help mathematics teachers perceive AI not as a threat but as a tool that supports learning, thereby promoting pedagogically sustainable technology integration. Mathematics teachers should systematically observe how AI addresses instructional challenges specific to mathematics education—such as differentiated instruction, formative assessment, and conceptual visualization. By directly linking technology to its instructional benefits, teachers’ intentions to adopt AI can be strengthened.

In addition, perceived risks associated with AI (e.g., algorithmic reliability, fairness and equity, students’ overreliance) should not be overlooked in teacher education. Training modules should aim to develop teachers’ abilities to critically evaluate mathematics content generated by AI. In this way, perceived risks can shift from being anxiety-inducing barriers to opportunities for developing professional judgment and critical awareness. This approach enables teachers not only to enhance their AI literacy and reduce their concerns but also to create a safe and pedagogically meaningful digital learning environment for their students.

Limitations and rationale

While this study offers both theoretical and practical insights into mathematics teachers’ concerns regarding the integration of AI and its impact on perceived mathematics literacy, several limitations should be acknowledged. The findings align with recent research suggesting that AI-related anxiety is a contextual emotion that may vary across geographical, cultural, and technological settings. Since all participants in this study were from Türkiye, the results may differ in other cultural and geographical contexts. In particular, the experiences and concerns of teachers working in rural areas or schools with limited technological access may not fully align with the findings of this study. This suggests that AI anxiety is not merely an individual emotion but is shaped by educational and social factors. Additionally, the predominance of female teachers in the study sample may limit the generalizability of the findings with respect to gender. Because the data primarily reflect the experiences of female mathematics teachers, the results may not fully capture the perspectives of male teachers, and the influence of gender on technology use and pedagogical approaches therefore warrants more comprehensive examination in future research.

Furthermore, the study employed a cross-sectional design. Therefore, causal inferences cannot be made regarding the impact of AI anxiety on pedagogical decisions or student outcomes in mathematics teaching. Teachers’ perceptions may not fully correspond to students’ actual mathematical performance. This limitation arises from the reliance on self-reported measures. Although self-reported data are valuable in reflecting teachers’ emotional and pedagogical experiences, potential common method bias should be taken into account.

Additionally, AI literacy and teaching beliefs were assessed using specific measurement instruments. Although these instruments are psychometrically reliable, they may not fully capture the complex structure of teachers’ pedagogical beliefs or the critical, ethical, and creative dimensions associated with AI. In mathematics education, not only procedural fluency but also conceptual understanding, problem solving, and mathematical reasoning are essential competencies. AI integration may influence these domains to varying degrees.

Considering these limitations, several directions for future research are recommended. Longitudinal studies could be conducted to examine the changes and effects of AI-related anxiety and literacy over time. Replicating studies with teacher samples from diverse geographical and cultural contexts would help enhance the generalizability of the findings. Field-based and observation-oriented methodologies could provide deeper insights into the relationship between teachers’ actual classroom practices and students’ mathematical outcomes. Additionally, AI literacy assessments should be expanded to include not only technical and pedagogical dimensions but also critical thinking competencies. Finally, the influence of ethical considerations and data security concerns on the adoption and integration of AI in mathematics education warrants more detailed investigation.

Acknowledgements

I would like to thank the participants of the study.

Authors’ contributions

All stages of the study were carried out by Ç.İ.K.

Funding

No grant was received at any stage of this research.

Data availability

The data that support the findings of this study are available from the author upon reasonable request.

Declarations

Ethics approval and consent to participate

All procedures involving human participants were conducted in accordance with the ethical standards of the institutional research committee and with the 1964 Declaration of Helsinki and its later amendments or comparable ethical standards. The ethical approval was granted by the Karabük University Social and Human Sciences Research Ethics Committee with the decision dated 27.09.2024 and numbered E.363459. Participants were explicitly informed of the voluntary nature of the study, their right to withdraw at any time, and the confidentiality of their responses. Before completing the questionnaire, all mathematics teachers voluntarily agreed to participate in the study by providing informed consent electronically. Data were anonymized before analysis to protect participant identities and ensure adherence to ethical research standards.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Footnotes

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  • 1.Sat M. The impact of AI integration in Project Preparation in Education course on pre-service teachers’ innovativeness, AI anxiety, attitudes, and acceptance. BMC Psychol. 2025;13(1):1297. 10.1186/s40359-025-03647-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2.Noster N, Gerber S, Siller H-S. Pre-Service teachers’ approaches in solving mathematics tasks with ChatGPT. Digit Experiences Math Educ. 2024;10(3):543–67. 10.1007/s40751-024-00155-8. [Google Scholar]
  • 3.Karabıyık MA, Yüksel AS, Tan FG. Sentiment analysis in the age of artificial intelligence: the rise of large language models and comparison with classical approaches. Afyon Kocatepe Univ – J Sci Eng. 2024;24(6):1355–63. 10.35414/akufemubid.1484569. [Google Scholar]
  • 4.Adeleye OO, Eden CA, Adeniyi IS. Innovative teaching methodologies in the era of artificial intelligence: A review of inclusive educational practices. World J Adv Eng Technol Sci. 2024;11(2):069–79. 10.30574/wjaets.2024.11.2.0091. [Google Scholar]
  • 5.Tan S. Harnessing artificial intelligence for innovation in education. In: Rajaram K, editor. Learning intelligence: innovative and digital transformative learning strategies: cultural and social engineering perspectives. Singapore: Springer Nature Singapore; 2023. pp. 335–63. [Google Scholar]
  • 6.Sajja R, Sermet Y, Cwiertny D, Demir I. Integrating AI and learning analytics for Data-Driven pedagogical decisions and personalized interventions in education. Technol Knowl Learn. 2025. 10.1007/s10758-025-09897-9. [Google Scholar]
  • 7.Tang WK-W. Artificial intelligence in mathematics education: trends, challenges, and opportunities. Int J Res Math Educ. 2025;3(1):75–90. 10.24090/ijrme.v3i1.13496. [Google Scholar]
  • 8.Doğan A. Using artificial intelligence in primary school mathematics teaching. In: Mazı A, editor. The convergence of mathematics and AI: a new paradigm in education. Hershey: IGI Global Scientific Publishing; 2025. pp. 319–48. [Google Scholar]
  • 9.Karaoğlan Yılmaz FG, Yılmaz R. Adaptation of artificial intelligence literacy scale into Turkish. Bilgi ve İletişim Teknolojileri Dergisi. 2023;5(2):172–90. 10.53694/bited.1376831. [Google Scholar]
  • 10.Barana A, Marchisio M, Roman F, editors. Fostering problem solving and critical thinking in mathematics through generative artificial intelligence. 20th international conference on cognition and exploratory learning in digital age. Madeira Island. 2023.
  • 11.Park M. Applications and possibilities of artificial intelligence in mathematics education. Commun Math Educ. 2020;34(4):545–61. 10.7468/JKSMEE.2020.34.4.545. [Google Scholar]
  • 12.Pepin B, Buchholtz N, Salinas-Hernández U. A scoping survey of ChatGPT in mathematics education. Digit Experiences Math Educ. 2025;11(1):9–41. 10.1007/s40751-025-00172-1. [Google Scholar]
  • 13.Jančařík A, Novotná J, Michal J, editors. Artificial intelligence assistant for mathematics education. 21st European Conference on e-Learning-ECEL. Brighton: Reading. 2022.
  • 14.Yilmaz R, Karaoglan Yilmaz FG. The effect of generative artificial intelligence (AI)-based tool use on students’ computational thinking skills, programming self-efficacy and motivation. Computers Education: Artif Intell. 2023;4:100147. 10.1016/j.caeai.2023.100147. [Google Scholar]
  • 15.İnci Kuzu Ç, Kayabasi KE. Determining the digital literacy levels of mathematics pre-service teachers. Probl Educ 21st Century. 2025;83(4):563–78. 10.33225/pec/25.83.563. [Google Scholar]
  • 16.Pei B, Lu J, Jing X. Empowering preservice teachers’ AI literacy: current understanding, influential factors, and strategies for improvement. Computers Education: Artif Intell. 2025;8:100406. 10.1016/j.caeai.2025.100406. [Google Scholar]
  • 17.Karaca Karalinç C. Yapay zekâ kaygısının iş güvencesizliği algısı üzerindeki etkisinde algılanan öz yeterliliğin rolü: sağlık çalışanları üzerine bir araştırma. İstanbul: İstanbul Gelişim University; 2025. [Google Scholar]
  • 18.Davis FD. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 1989;13(3):319–40. 10.2307/249008. [Google Scholar]
  • 19.Duyku E. Investigating secondary school teachers’ use of WEB 2.0 technologies with the technology acceptance model [Master’s]. Turkey: Bursa Uludag University; 2021. [Google Scholar]
  • 20.Mishra P, Koehler MJ. Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers Coll Record. 2006;108(6):1017–54. 10.1111/j.1467-9620.2006.00684.x. [Google Scholar]
  • 21.Ertmer PA, Ottenbreit-Leftwich AT. Teacher technology change. J Res Technol Educ. 2010;42(3):255–84. 10.1080/15391523.2010.10782551.
  • 22.Awang LA, Yusop FD, Danaee M. Current practices and future direction of artificial intelligence in mathematics education: A systematic review. Int Electron J Math Educ. 2025;20(2):em0823. 10.29333/iejme/16006. [Google Scholar]
  • 23.Li M. Integrating artificial intelligence in primary mathematics education: investigating internal and external influences on teacher adoption. Int J Sci Math Educ. 2025;23(5):1283–308. 10.1007/s10763-024-10515-w. [Google Scholar]
  • 24.Zhang C, Schießl J, Plößl L, Hofmann F, Gläser-Zikuda M. Acceptance of artificial intelligence among pre-service teachers: a multigroup analysis. Int J Educational Technol High Educ. 2023;20(1):49. 10.1186/s41239-023-00420-7. [Google Scholar]
  • 25.Cantas Ç, Soyer C, Batur Ö. Examination of undergraduate students’ artificial intelligence anxiety, multidimensional 21st century skills, and lifelong learning levels in terms of various variables. Turkish Online J Educational Technology-TOJET. 2024;23(3):29–53. [Google Scholar]
  • 26.Crompton H, Burke D. Artificial intelligence in higher education: the state of the field. Int J Educational Technol High Educ. 2023;20(1):22. 10.1186/s41239-023-00392-8. [Google Scholar]
  • 27.Doğan M, Celik A, Arslan H. AI in higher education: risks and opportunities from the academician perspective. Eur J Educ. 2025;60(1):e12863. 10.1111/ejed.12863. [Google Scholar]
  • 28.Druga S, Vu ST, Likhith E, Qiu T. Inclusive AI literacy for kids around the world. In: Proceedings of FabLearn 2019. 2019. pp. 104–11.
  • 29.Lee I, Ali S, Zhang H, DiPaola D, Breazeal C. Developing middle school students’ AI literacy. In: Proceedings of the 52nd ACM Technical Symposium on Computer Science Education; virtual event, USA. Association for Computing Machinery; 2021. pp. 191–7.
  • 30.Kandlhofer M, Steinbauer G, Hirschmugl-Gaisch S, Huber P, editors. Artificial intelligence and computer science in education: from kindergarten to university. 2016 IEEE Frontiers in Education Conference (FIE); 2016 Oct 12–15.
  • 31.Ng DTK, Leung JKL, Chu SKW, Qiao MS. Conceptualizing AI literacy: an exploratory review. Computers Education: Artif Intell. 2021;2:100041. 10.1016/j.caeai.2021.100041. [Google Scholar]
  • 32.Du H, Sun Y, Jiang H, Islam AYMA, Gu X. Exploring the effects of AI literacy in teacher learning: an empirical study. Humanit Social Sci Commun. 2024;11(1):559. 10.1057/s41599-024-03101-6. [Google Scholar]
  • 33.Venkatesh V, Davis FD. A theoretical extension of the technology acceptance model: four longitudinal field studies. Manage Sci. 2000;46(2):186–204. 10.1287/mnsc.46.2.186.11926. [Google Scholar]
  • 34.Acar IH, Altundal MN, Kırmızıtaş M, Kırbaşoğlu K. Can you see me at my worst? A latent profile analysis of students and teachers’ perceptions of student behavior problems. Curr Psychol. 2023;42(32):28107–18. 10.1007/s12144-022-03888-0. [Google Scholar]
  • 35.Azriani N, Islami N, Hermita N, Nor M, Syaodih E, Handayani H, et al. Implementing inquiry learning model to improve primary school students’ critical thinking on Earth and universe concept. J Phys: Conf Ser. 2019;1227(1):012033. 10.1088/1742-6596/1227/1/012033. [Google Scholar]
  • 36.Serholt S, Pareto L, Ekström S, Ljungblad S. Trouble and repair in Child–Robot interaction: A study of complex interactions with a robot tutee in a primary school classroom. Front Rob AI. 2020;7:1–13. 10.3389/frobt.2020.00046. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37.Matsuzaki T, Iwane H, Kobayashi M, Zhan Y, Fukasaku R, Kudo J, et al. Can an A.I. win a medal in the mathematical olympiad? – Benchmarking mechanized mathematics on pre-university problems. AI Commun. 2018;31(3):251–66. 10.3233/AIC-180762. [Google Scholar]
  • 38.Ning Y, Zhang C, Xu B, Zhou Y, Wijaya TT. Teachers’ AI-TPACK: exploring the relationship between knowledge elements. Sustainability. 2024;16(3):978. 10.3390/su16030978.
  • 39.Sun F, Tian P, Sun D, Fan Y, Yang Y. Pre-service teachers’ inclination to integrate AI into STEM education: analysis of influencing factors. Br J Edu Technol. 2024;55(6):2574–96. 10.1111/bjet.13469. [Google Scholar]
  • 40.Epstein S. The nature of anxiety with emphasis upon its relationship to expectancy. In: Spielberger CD, editor. Anxiety: current trends in theory and research. Volume 2. New York: Academic; 1972. pp. 291–337. [Google Scholar]
  • 41.Atabek O. Teknoloji Korkusu. Eğitim bilimleri Alanında Uluslararası Araştırmalar. 7. Konya: Eğitim Yayınevi; 2021. pp. 115–38. [Google Scholar]
  • 42.Öztürk E. An investigation on prospective teacher’s computer anxiety and computer self efficacy based on several variables. Hacettepe Üniversitesi Eğitim Fakültesi Dergisi. 2013;44(44):275–86. [Google Scholar]
  • 43.Wang Y-Y, Wang Y-S. Development and validation of an artificial intelligence anxiety scale: an initial application in predicting motivated learning behavior. Interact Learn Environ. 2022;30(4):619–34. 10.1080/10494820.2019.1674887. [Google Scholar]
  • 44.Johnson DG, Verdicchio M. AI anxiety. J Assoc Inf Sci Technol. 2017;68(9):2267–70. 10.1002/asi.23867. [Google Scholar]
  • 45.Kaya F, Aydin F, Schepman A, Rodway P, Yetişensoy O, Demir Kaya M. The roles of personality traits, AI anxiety, and demographic factors in attitudes toward artificial intelligence. Int J Human–Computer Interact. 2024;40(2):497–514. 10.1080/10447318.2022.2151730. [Google Scholar]
  • 46.Eyüp B, Kayhan S. Pre-Service Turkish Language teachers’ anxiety and attitudes toward artificial intelligence. Int J Educ Lit Stud. 2023;11(4):43–56. 10.7575/aiac.ijels.v.11n.4p.43. [Google Scholar]
  • 47.Koehler M, Mishra P. What is technological pedagogical content knowledge (TPACK)? Contemp Issues Technol Teacher Educ. 2009;9(1):60–70. [Google Scholar]
  • 48.Dinello D. Engineered flesh: biotechnology. In: Technophobia! Science fiction visions of posthuman technology. Austin: University of Texas Press; 2006. pp. 180–222. [Google Scholar]
  • 49.Rachman S. A cognitive theory of obsessions. In: Sanavio E, editor. Behavior and cognitive therapy today. Oxford: Pergamon; 1998. pp. 209–22. [Google Scholar]
  • 50.Khare K, Stewart B, Khare A. Artificial intelligence and the student experience: an institutional perspective. IAFOR J Educ. 2018;6(3):63–78. 10.22492/ije.6.3.04. [Google Scholar]
  • 51.Popenici SAD, Kerr S. Exploring the impact of artificial intelligence on teaching and learning in higher education. Res Pract Technol Enhanced Learn. 2017;12(1):22. 10.1186/s41039-017-0062-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 52.Taşçı G, Çelebi M. A new paradigm in education: artificial intelligence in higher education. OPUS Int J Soc Researches. 2020;16(29):2346–70. 10.26466/opus.747634. [Google Scholar]
  • 53.Zawacki-Richter O, Marín VI, Bond M, Gouverneur F. Systematic review of research on artificial intelligence applications in higher education – where are the educators? Int J Educational Technol High Educ. 2019;16(1):39. 10.1186/s41239-019-0171-0. [Google Scholar]
  • 54.Chassignol M, Khoroshavin A, Klimova A, Bilyatdinova A. Artificial intelligence trends in education: a narrative overview. Procedia Comput Sci. 2018;136:16–24. 10.1016/j.procs.2018.08.233. [Google Scholar]
  • 55.Alam A. Possibilities and apprehensions in the landscape of artificial intelligence in education. In: 2021 International Conference on Computational Intelligence and Computing Applications (ICCICA); Nagpur; 2021. pp. 1–8. 10.1109/iccica52458.2021.9697272
  • 56.Gamage KA, Dehideniya SC, Xu Z, Tang X. ChatGPT and higher education assessments: more opportunities than concerns? J Appl Learn Teach. 2023;6(2):358–69. 10.37074/jalt.2023.6.2.32. [Google Scholar]
  • 57.Ouyang F, Zheng L, Jiao P. Artificial intelligence in online higher education: A systematic review of empirical research from 2011 to 2020. Educ Inform Technol. 2022;27(6):7893–925. 10.1007/s10639-022-10925-9. [Google Scholar]
  • 58.Holmes W, Miao F. Guidance for generative AI in education and research. Paris: Unesco Publishing; 2023. [Google Scholar]
  • 59.OECD. OECD future of education and skills 2030: OECD learning compass 2030. A series of concept notes; 2019.
  • 60.Ayduğ D, Altınpulluk H. Are Turkish pre-service teachers worried about AI? A study on AI anxiety and digital literacy. AI & Society. 2025. 10.1007/s00146-025-02348-0.
  • 61.Wijaya TT, Yu Q, Cao Y, He Y, Leung FKS. Latent profile analysis of AI literacy and trust in mathematics teachers and their relations with AI dependency and 21st-century skills. Behav Sci. 2024;14(11):1008. 10.3390/bs14111008. [DOI] [PMC free article] [PubMed]
  • 62.Lin T, Zhang J, Xiong B. Effects of technology perceptions, teacher beliefs, and AI literacy on AI technology adoption in sustainable mathematics education. Sustainability. 2025;17(8):3698. 10.3390/su17083698.
  • 63.Kovačević A, Demić E. The impact of gender, seniority, knowledge, and interest on attitudes to artificial intelligence. IEEE Access. 2024;12:129765–75. 10.1109/ACCESS.2024.3454801. [Google Scholar]
  • 64.Dumagay AH. Preservice teachers and AI in education 5.0: examining literacy, anxiety, and attitudes across gender, socioeconomic status, and training. EthAIca: J Ethics AI Crit Anal. 2025;4(432). 10.56294/ai2025432.
  • 65.Lund BD, Mannuru NR, Agbaji D. AI anxiety and fear: a look at perspectives of information science students and professionals towards artificial intelligence. J Inform Sci. 2024;01655515241282001. 10.1177/01655515241282001.
  • 66.Mart M, Kaya G. The examination of preschool teacher candidates’ attitudes towards artificial intelligence and their artificial intelligence literacy relationship. Edutech Res. 2024;2(1):91–109. [Google Scholar]
  • 67.Türten B. Artificial intelligence and cinema: opportunities and possibilities in filmmaking. J Anatolia Balkan Stud. 2024;7(14):399–425. 10.32953/abad.1539736. [Google Scholar]
  • 68.Şad S, Nalçacı Ö. Prospective teachers’ perceived competencies about integrating information and communication technologies into education. Mersin Üniversitesi Eğitim Fakültesi Dergisi. 2015;11(1):176–97. 10.17860/efd.16986. [Google Scholar]
  • 69.Türk N, Batuk B, Kaya A, Yıldırım O. What makes university students accept generative artificial intelligence? A moderated mediation model. BMC Psychol. 2025;13(1):1257. 10.1186/s40359-025-03559-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 70.Şata M. Quantitative research approaches. In: Oğuz E, editor. Research methods in education. Ankara: Eğiten Kitap Yayıncılık; 2020. pp. 77–97. [Google Scholar]
  • 71.Karasar N. Scientific research method: concepts principles techniques. Ankara: Nobel Akademik Yayıncılık; 2024. p. 368. [Google Scholar]
  • 72.Yıldırım A, Şimşek H. Sosyal bilimlerde Nitel Araştırma Yöntemleri. 12 ed. Ankara: Seçkin Yayıncılık; 2021. [Google Scholar]
  • 73.Stratton SJ. Population research: convenience sampling strategies. Prehosp Disaster Med. 2021;36(4):373–4. 10.1017/S1049023X21000649. [DOI] [PubMed] [Google Scholar]
  • 74.Linardatos G, Apostolou D. Investigating high school students’ perception about digital comics creation in the classroom. Educ Inform Technol. 2023;28(8):10079–101. 10.1007/s10639-023-11581-3. [Google Scholar]
  • 75.Fraenkel J, Wallen N, Hyun H. How to design and evaluate research in education. 8 ed. New York: McGraw-Hill Education; 2012. [Google Scholar]
  • 76.Polatgil M, Güler A. Adaptation of artificial intelligence literacy scale into Turkish. J Quant Res Social Sci. 2023;3(2):99–114. [Google Scholar]
  • 77.Akkaya B, Özkan A, Özkan H. Artificial intelligence anxiety (AIA) scale: adaptation to Turkish, validity and reliability study. Alanya Acad Rev. 2021;5(2):1125–46. 10.29023/alanyaakademik.833668. [Google Scholar]
  • 78.Chiu TKF, Chai C-s. Sustainable curriculum planning for artificial intelligence education: a self-determination theory perspective. Sustainability. 2020;12(14). 10.3390/su12145568.
  • 79.Moreno-Guerrero A-J, López-Belmonte J, Marín-Marín J-A, Soler-Costa R. Scientific development of educational artificial intelligence in web of science. Future Internet. 2020;12(8). 10.3390/fi12080124.
  • 80.Dai Y, Chai C-S, Lin P-Y, Jong MS, Guo Y, Qin J. Promoting students’ well-being by developing their readiness for the artificial intelligence age. Sustainability. 2020;12(16). 10.3390/su12166597.
  • 81.Kline RB. Principles and practice of structural equation modeling. New York: Guilford; 2023. [Google Scholar]
  • 82.van Griethuijsen RALF, van Eijck MW, Haste H, den Brok PJ, Skinner NC, Mansour N, et al. Global patterns in students’ views of science and interest in science. Res Sci Educ. 2015;45(4):581–603. 10.1007/s11165-014-9438-6. [Google Scholar]
  • 83.Ateş V. Investigation of artificial intelligence literacy levels of university students in terms of some demographic variables. Türk Eğitim Bilimleri Dergisi. 2025;23(2):1931–54. 10.37217/tebd.1688486. [Google Scholar]
  • 84.Jeng M-Y, Pai F-Y, Yeh T-M. Antecedents for older adults’ intention to use smart health wearable devices-technology anxiety as a moderator. Behav Sci. 2022;12(4). 10.3390/bs12040114. [DOI] [PMC free article] [PubMed]
  • 85.Kong S-C, Cheung WM-Y, Zhang G. Evaluating an artificial intelligence literacy programme for developing university students’ conceptual Understanding, literacy, empowerment and ethical awareness. Educational Technol Soc. 2023;26(1):16–30. [Google Scholar]
  • 86.Sucu İ. The effect of artificial intelligence on society and the view of artificial intelligence in the context of film (I.A). IJOTEM. 2019;2(2):203–15. [Google Scholar]
  • 87.Ismatullaev UVU, Kim S-H. Review of the factors affecting acceptance of AI-Infused systems. Hum Factors. 2022;66(1):126–44. 10.1177/00187208211064707. [DOI] [PubMed] [Google Scholar]
  • 88.Karaoglan Yilmaz FG, Yilmaz R. Exploring the role of self-regulated learning skills, cognitive flexibility, and metacognitive awareness on generative artificial intelligence attitude. Innovations Educ Teach Int. 2025;62(5):1682–95. 10.1080/14703297.2025.2484613. [Google Scholar]
  • 89.Emon MMH, Khan T. The mediating role of attitude towards the technology in shaping artificial intelligence usage among professionals. Telematics Inf Rep. 2025;17:100188. 10.1016/j.teler.2025.100188. [Google Scholar]
  • 90.Korkmaz E. STEM awareness of pre-service teachers, attitudes towards technology and examining the relationship between 21st century skills. Karaman: Karamanoğlu Mehmetbey University; 2024. [Google Scholar]
  • 91.Bahroun Z, Anane C, Ahmed V, Zacca A. Transforming education: a comprehensive review of generative artificial intelligence in educational settings through bibliometric and content analysis. Sustainability. 2023;15(17):12983. 10.3390/su151712983.
  • 92.Engelbrecht J, Borba MC. Recent developments in using digital technology in mathematics education. ZDM – Math Educ. 2024;56(2):281–92. 10.1007/s11858-023-01530-2. [Google Scholar]
  • 93.Drijvers P, Sinclair N. The role of digital technologies in mathematics education: purposes and perspectives. ZDM – Math Educ. 2024;56(2):239–48. 10.1007/s11858-023-01535-x. [Google Scholar]
  • 94.Darmanova Z, Abylkassymova A, Nurmukhamedova Z. A systematic review of technology use in middle and high school mathematics education: insights from contextual, methodological, and evaluation characteristics. Front Educ. 2025;10. 10.3389/feduc.2025.1644284.
  • 95.Voronin D, Saienko V, Tolchieva H, editors. Digital transformation of pedagogical education at the university. Digitalization of education: history, trends and prospects. Atlantis; 2020.
  • 96.Deng Y, Liu H. To overcome test anxiety in on-line assessment: unpacking the mediator roles of techno competencies, teacher support, self-efficacy, and autonomy. BMC Psychol. 2025;13(1):192. 10.1186/s40359-025-02545-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 97.Henderson J, Corry M. Teacher anxiety and technology change: a review of the literature. Technol Pedagogy Educ. 2021;30(4):573–87. 10.1080/1475939X.2021.1931426. [Google Scholar]
  • 98.Karaoglan Yilmaz FG, Yilmaz R, Ustun AB, Uzun H. Exploring the role of cognitive flexibility, digital competencies, and self-regulation skills on students’ generative artificial intelligence anxiety. Computers Hum Behavior: Artif Hum. 2025;5:100187. 10.1016/j.chbah.2025.100187. [Google Scholar]
  • 99.Lee Y-J, Oh J, Hong C. Exploratory research on Understanding university students’ artificial intelligence literacy in a Korean university. Online J Communication Media Technol. 2024;14(3):e202440. 10.30935/ojcmt/14711. [Google Scholar]
  • 100.İnci Kuzu C. Views of mathematics teacher candidates on the use of geogebra in probability teaching. Asian J Contemp Educ. 2021;5(1):45–56. 10.18488/journal.137.2021.51.45.56. [Google Scholar]
  • 101.Gürbüz C. The digitalization of vocational and technical education skill systems. Journal of Education and Humanities: Theory and Practice. 2024;15(29):199–222. 10.58689/eibd.1445906. [Google Scholar]
  • 102.Nakakoji Y, Wilson R. Interdisciplinary learning in mathematics and science: transfer of learning for 21st century problem solving at university. J Intell. 2020;8(3):32. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 103.Özdemir U. Investigation of 8th grade students’ covariational reasoning in dynamic learning environment: A teaching experiment. Konya: Necmettin Erbakan University; 2025. [Google Scholar]
  • 104.Demir MK. Matematik öğretmenlerinin öğrencilerin bilgiyi yapılandırma sürecindeki rolünün incelenmesi. Balıkesir: Balikesir University; 2017. [Google Scholar]
  • 105.Tesfamicael SA, Ayalew Y. Mathematics education in Ethiopia in the era of COVID-19: boosting equitable access for all learners via opportunity to learning. Contemp Math Sci Educ. 2021;2(1):ep21005. 10.30935/conmaths/9680. [Google Scholar]
  • 106.Zhai C, Wibowo S, Li LD. The effects of over-reliance on AI dialogue systems on students’ cognitive abilities: a systematic review. Smart Learn Environ. 2024;11(1):28. 10.1186/s40561-024-00316-7. [Google Scholar]
  • 107.Li M. Exploring the digital divide in primary education: A comparative study of urban and rural mathematics teachers’ TPACK and attitudes towards technology integration in post-pandemic China. Educ Inform Technol. 2025;30(2):1913–45. 10.1007/s10639-024-12890-x. [Google Scholar]
  • 108.Nti-Asante E. Engaging students in making artificial intelligence tools for mathematics education: an iterative design approach. J Math Educ. 2024;17(1):16–37. 10.26711/007577152790172. [Google Scholar]
  • 109.Huang L. Ethics of artificial intelligence in education: student privacy and data protection. Sci Insights Educ Front. 2023;16:2577–87. 10.15354/sief.23.re202. [Google Scholar]
  • 110.Temur S. Using artificial intelligence in education: ethical issues and solutions. Mehmet Akif Ersoy Univ J Educ Fac. 2025(74):568–95. 10.21764/maeuefd.1516576.
