Abstract
This study examines how teacher digital technology anxiety influences digital literacy development through cognitive appraisal mechanisms. Employing a three-wave longitudinal design spanning one academic year, we surveyed 1,247 in-service teachers across urban, suburban, and rural schools in China at six-month intervals. Cross-lagged panel modeling revealed that digital technology anxiety significantly predicted subsequent decreases in digital literacy development, with this relationship demonstrating temporal precedence over reverse pathways. Mediation analyses grounded in cognitive appraisal theory indicated that approximately 65% of anxiety’s total effect operated through interpretive mechanisms, specifically threat appraisals and resource appraisals, which subsequently shaped coping strategies. Teachers experiencing heightened anxiety more readily interpreted technological demands as threatening to professional competence while perceiving fewer available coping resources, ultimately constraining skill acquisition. These findings extend cognitive appraisal theory into technology acceptance contexts and illuminate affective-cognitive mechanisms often overlooked in digital literacy development models. Practical implications suggest multilevel interventions targeting cognitive reappraisal training, resource provision, and psychologically safe organizational environments to support teachers’ digital professional development.
Supplementary Information
The online version contains supplementary material available at 10.1186/s40359-026-04317-8.
Keywords: Digital technology anxiety, Digital literacy development, Cognitive appraisal theory, Teacher professional development, Longitudinal study, Mediation analysis
Introduction
The accelerating integration of digital technologies into educational contexts has fundamentally reshaped pedagogical practices and professional expectations for teachers worldwide. While this transformation promises enhanced learning outcomes and increased instructional flexibility, it simultaneously presents formidable challenges that extend beyond mere technical proficiency [1]. Teachers now confront an evolving landscape where mastery of diverse digital tools, platforms, and pedagogical approaches becomes essential rather than optional. This shift has generated considerable psychological strain, particularly manifesting as digital technology anxiety—a phenomenon that warrants closer examination given its potential to impede rather than facilitate professional adaptation.
Digital technology anxiety among teachers extends beyond momentary discomfort with unfamiliar tools, reflecting deeper concerns about professional identity and pedagogical effectiveness in technology-mediated environments [2]. Such persistent apprehension may trigger behavioral patterns—including avoidance and constrained exploration—that ultimately impede the competency development contemporary education systems demand. This paradox merits careful consideration: anxiety about technology can generate a self-reinforcing cycle that hinders teachers from acquiring the competencies necessary to alleviate that anxiety.
Digital literacy has emerged as a cornerstone of teacher professional development in the 21st century, encompassing not only operational proficiency but also critical evaluation, creative application, and ethical consideration of digital resources [3]. For teachers, robust digital literacy translates into enhanced instructional design capabilities, improved student engagement strategies, and greater adaptability to evolving educational technologies. Yet despite growing recognition of digital literacy’s importance, we observe significant variation in how teachers develop these competencies, suggesting that psychological factors may mediate this developmental trajectory in ways that current research inadequately addresses.
Existing scholarship has documented correlations between technology anxiety and digital competence levels, but the mechanisms through which anxiety influences literacy development remain underexplored [4]. Particularly absent from the literature are rigorous examinations of the cognitive processes teachers employ when encountering anxiety-provoking digital scenarios. How do teachers appraise technological challenges? What interpretive frameworks shape their emotional and behavioral responses? How do these appraisals evolve over time, and with what consequences for skill acquisition? These questions point toward a theoretical gap that cognitive appraisal theory—originally developed to explain stress responses but increasingly applied to educational contexts—seems uniquely positioned to address.
Cognitive appraisal theory posits that individuals’ emotional and behavioral reactions stem not from situations themselves but from how those situations are evaluated [5]. Primary appraisal involves assessing whether a situation poses threat or opportunity, while secondary appraisal concerns perceived coping resources and response options. Applied to teacher digital technology anxiety, this framework suggests that how teachers interpret technological demands and evaluate their capacity to meet those demands may critically shape both anxiety levels and subsequent literacy development. Longitudinal investigation becomes essential here, as cross-sectional designs cannot capture the dynamic, reciprocal relationships between appraisal processes, anxiety fluctuations, and competency growth over time.
This study addresses these gaps by examining how teacher digital technology anxiety influences digital literacy development through cognitive appraisal mechanisms, utilizing a three-wave longitudinal design spanning one academic year. We pose three primary research questions: (1) How does digital technology anxiety predict changes in teachers’ digital literacy over time? (2) What roles do primary and secondary appraisals play in mediating this relationship? (3) Do these relationships vary across different dimensions of digital literacy?
Our investigation offers three distinctive contributions to the field. First, this study employs a longitudinal panel design that establishes clear temporal sequencing among variables, enabling stronger causal inference than cross-sectional approaches prevalent in existing literature. Second, we extend cognitive appraisal theory into the domain of teacher digital literacy development, providing a theoretical lens that has received limited attention in educational technology research. Third, by specifying the psychological mechanisms through which anxiety shapes competency growth, we move beyond documenting correlations to explaining underlying processes. These contributions collectively advance both theoretical understanding and practical intervention design for supporting teachers’ digital professional development.
The remainder of this paper proceeds as follows. Section II reviews relevant literature on digital technology anxiety, digital literacy development, and cognitive appraisal theory. Section III details our longitudinal research methodology. Section IV presents analytical results. Section V discusses findings, theoretical implications, and limitations. Section VI presents conclusions, with practical recommendations organized across individual, institutional, and policy levels.
Literature review and theoretical foundation
Current research status on teacher digital technology anxiety
Digital technology anxiety, as a psychological construct, has evolved considerably since its initial conceptualization in the context of computer anxiety during the 1980s. Contemporary definitions frame it as a multidimensional affective state characterized by tension, apprehension, and worry when individuals anticipate or engage with digital technologies [6]. Unlike transient nervousness, this form of anxiety exhibits relative stability across situations and manifests through cognitive, emotional, and behavioral components that collectively impair technology engagement. Researchers have refined measurement approaches over time, moving from unidimensional scales focused primarily on computer avoidance to more nuanced instruments capturing distinct facets such as performance anxiety, learning anxiety, and social anxiety related to technology use [7].
Among teacher populations, digital technology anxiety assumes particular significance given its direct implications for instructional quality and professional adaptation. Teachers experiencing heightened anxiety often report cognitive interference during technology-mediated instruction, including intrusive thoughts about potential technical failures, concerns about appearing incompetent before students, and rumination over lost instructional time [8]. Behaviorally, anxious teachers tend to minimize technology integration, defaulting to familiar pedagogical approaches even when digital tools might better serve learning objectives. Some educators develop compensatory strategies—overpreparing lesson materials, avoiding spontaneous technology use, or delegating technical tasks to more confident colleagues—that temporarily reduce anxiety but prevent skill development.
The sources of teacher digital technology anxiety appear multifaceted and context-dependent. Individual-level factors include prior negative experiences with technology, low perceived self-efficacy, and personality traits such as perfectionism or fear of negative evaluation [9]. Institutional conditions contribute substantially as well: inadequate professional development, insufficient technical support, and organizational cultures that emphasize technology adoption without acknowledging implementation challenges can exacerbate anxiety levels [10]. Research on school context variables has further demonstrated that such organizational stressors negatively affect teacher job satisfaction and sustained motivation, compounding the psychological burden associated with technology integration demands [56]. Generational differences emerge in some studies, though findings remain inconsistent, with younger teachers not uniformly exhibiting lower anxiety than their senior counterparts. This variability suggests that age serves as a proxy for more complex factors like exposure intensity, training quality, and personal learning orientations.
How this anxiety shapes actual teaching practice has garnered increasing empirical attention. Teachers reporting elevated anxiety demonstrate reduced likelihood of integrating innovative digital tools, less frequent use of technology for formative assessment, and diminished willingness to experiment with student-centered digital pedagogies [11]. The relationship appears reciprocal rather than unidirectional: poor implementation experiences reinforce anxiety, creating cycles difficult to interrupt through conventional professional development alone. Some evidence suggests that anxiety impacts not just technology adoption rates but also the pedagogical sophistication of technology use, with anxious teachers more likely to employ digital tools for presentation rather than interactive or collaborative purposes [12].
Beyond classroom integration, digital technology anxiety connects to broader aspects of teacher professional functioning. The relationship between anxiety and Technological Pedagogical Content Knowledge (TPACK) development appears particularly consequential; anxious teachers show slower growth in integrating technological, pedagogical, and content knowledge dimensions [2]. Professional learning engagement also suffers, as teachers experiencing technology-related apprehension participate less actively in digital skills training and peer collaboration focused on technology integration [10]. These patterns suggest that digital technology anxiety functions as a barrier not merely to technology use but to broader professional development trajectories.
Throughout this paper, we use “digital technology anxiety” as our primary construct, distinguishing it from related concepts. While “technostress” encompasses broader work-related strain from technology use including overload and invasion dimensions [6], our focus centers specifically on the anxiety component characterized by apprehension and worry when anticipating or engaging with digital technologies. “Technology anxiety” serves as a general umbrella term, whereas “digital technology anxiety” captures the contemporary educational technology context more precisely.
Despite accumulating evidence, several critical gaps constrain current understanding of digital technology anxiety and its developmental consequences. Most research employs cross-sectional designs that preclude causal inference or examination of anxiety trajectories over time. Whether anxiety naturally diminishes with exposure, remains stable, or fluctuates in response to specific triggers remains largely unknown. The mechanisms through which anxiety influences professional learning and competency development receive limited theoretical attention; correlational findings dominate while process-oriented investigations remain scarce. Furthermore, existing studies often treat anxiety as a predictor variable without adequately considering how cognitive interpretations of anxiety-provoking situations might moderate or mediate its effects [13]. The field lacks longitudinal evidence establishing temporal precedence between anxiety and competency outcomes, leaving directionality questions unresolved.
These limitations point toward the necessity of longitudinal research designs capable of tracking anxiety dynamics and competency changes simultaneously. Theoretical frameworks that specify psychological mechanisms—particularly those explaining how individuals interpret and respond to stressful encounters—offer promising directions for advancing beyond descriptive accounts toward explanatory models. The cognitive appraisal perspective, with its emphasis on subjective evaluation processes, may provide conceptual scaffolding for understanding why teachers with similar anxiety levels exhibit divergent developmental trajectories in digital literacy.
Review of research on teacher digital literacy development
The conceptualization of digital literacy has undergone substantial transformation since its emergence as a descriptor of basic computer skills in the 1990s. Early frameworks emphasized technical proficiency—the ability to operate hardware and software—but subsequent iterations have broadened considerably to encompass critical thinking, creative production, and ethical engagement with digital environments [14]. Contemporary models recognize digital literacy as a multidimensional construct integrating cognitive, technical, and social competencies that enable individuals to navigate increasingly complex information ecosystems. For educators specifically, this evolution reflects growing awareness that teaching in digital contexts demands more than tool mastery; it requires pedagogical reimagination and ongoing adaptive capacity.
Teacher digital literacy frameworks typically distinguish several interconnected dimensions, though consensus on precise boundaries remains elusive. Technical-operational competencies form a foundational layer, encompassing proficiency with educational technologies, learning management systems, and digital content creation tools [15]. Moving beyond mechanics, information literacy involves locating, evaluating, and synthesizing digital resources—skills particularly crucial given the proliferation of dubious online materials that teachers and students encounter daily. Pedagogical dimensions address how teachers integrate technology to enhance learning experiences, design digitally-mediated assessments, and facilitate collaborative knowledge construction [16]. Some frameworks incorporate ethical-critical dimensions, emphasizing data privacy awareness, digital citizenship modeling, and critical evaluation of technology’s societal implications.
Assessment approaches for teacher digital literacy mirror this dimensional complexity. Self-report instruments dominate empirical research, offering efficiency but raising concerns about response bias and the gap between perceived and actual competence. Performance-based assessments provide more direct evidence of capability but face practical constraints around time, cost, and standardization across diverse technological contexts [17]. Portfolio methods capture authentic practice but introduce interpretation challenges. The field continues to grapple with whether digital literacy should be assessed as decontextualized skills or as situated practices inseparable from specific pedagogical contexts—a tension that influences both measurement choices and professional development design.
Multiple factors shape teachers’ digital literacy development trajectories, operating across individual, organizational, and broader environmental levels. At the individual level, prior technology experiences, personal learning orientations, and self-efficacy beliefs exert considerable influence on engagement with professional learning opportunities and willingness to experiment with unfamiliar tools [18]. Age and career stage show complex relationships with literacy development; while some studies suggest younger teachers possess advantages, others find that teaching experience and pedagogical expertise enable more sophisticated technology integration once basic skills are acquired. Motivation emerges as a critical mediator—teachers who perceive technology as genuinely beneficial for their students invest more effort in developing relevant competencies than those viewing digital tools as administrative impositions.
Organizational factors frequently determine whether individual motivation translates into actual skill development. Schools providing sustained professional development, accessible technical support, and collaborative learning cultures facilitate literacy growth more effectively than those offering sporadic training sessions disconnected from classroom realities [19]. Leadership matters considerably: principals who model technology use, allocate resources strategically, and buffer teachers from implementation pressures create conditions where experimentation feels safe rather than risky. Peer networks within schools serve as crucial learning resources, enabling teachers to observe colleagues’ practices, troubleshoot challenges collectively, and develop shared understandings of effective technology integration.
Environmental influences operating at policy and societal levels shape both the urgency and nature of digital literacy development. Curricular mandates, accountability systems, and infrastructure investments signal institutional priorities that teachers must navigate [20]. The COVID-19 pandemic starkly illustrated how external shocks can accelerate development demands; teachers faced with sudden remote instruction requirements demonstrated remarkable adaptability, though at considerable psychological cost. Broader cultural attitudes toward technology in education—whether characterized by techno-optimism, skepticism, or pragmatic evaluation—permeate teachers’ professional identities and development trajectories.
Research examining temporal patterns in digital literacy development reveals considerable heterogeneity rather than uniform progression. Some teachers exhibit rapid initial growth followed by plateaus; others show gradual steady improvement; still others demonstrate non-linear patterns with periods of advancement and regression [21]. These varied trajectories suggest that literacy development is neither automatic nor inevitable—it depends on sustained engagement, supportive conditions, and perhaps most importantly, psychological readiness to persist through challenges.
Yet despite growing sophistication in mapping digital literacy’s contours and influences, a conspicuous gap persists. Existing research overwhelmingly emphasizes cognitive and contextual factors while inadequately addressing emotional dimensions of the development process. How do feelings of anxiety, frustration, or inadequacy shape teachers’ engagement with learning opportunities? When do emotional responses facilitate growth versus triggering avoidance? These questions remain underexplored, leaving us with incomplete understanding of why teachers facing similar circumstances and resources develop such divergent competency profiles.
Cognitive appraisal theory and its application
Cognitive appraisal theory, originally formulated to explain individual differences in stress responses, offers a nuanced framework for understanding why people react divergently to objectively similar situations. The theory’s central premise challenges stimulus-response models by positing that emotional and behavioral reactions stem not from environmental demands themselves but from how individuals interpret those demands [22]. This interpretive process occurs through two sequential yet interactive evaluation stages. Primary appraisal involves assessing whether a situation is irrelevant, benign-positive, or stressful—and if stressful, whether it constitutes harm/loss, threat, or challenge. Secondary appraisal concerns evaluating available coping resources, options for action, and likelihood of successfully managing the situation. Together, these appraisals determine emotional intensity, physiological arousal, and behavioral responses.
What makes this framework particularly valuable is its recognition that appraisals are neither static nor purely cognitive. They fluctuate as situations unfold and individuals gain new information or reinterpret existing circumstances. Moreover, appraisals incorporate motivational elements—personal goals, commitments, and beliefs about what matters—that infuse supposedly objective evaluations with subjective significance. A technological challenge that one teacher appraises as threatening (potentially exposing inadequacy) might be viewed by another as an energizing opportunity (chance to master new skills). These divergent appraisals then trigger different coping strategies: problem-focused efforts aimed at changing the situation, emotion-focused attempts to regulate affective responses, or avoidance behaviors that minimize engagement with the stressor.
Applications of cognitive appraisal theory have proliferated well beyond its original stress research domain. In technology acceptance contexts, researchers have employed the framework to explain why individuals with similar technical skills exhibit vastly different adoption patterns [23]. Those appraising new technologies as threats requiring capabilities they lack tend to resist adoption, while those viewing the same technologies as manageable challenges engage more readily. The theory has proven especially illuminating for understanding technology-induced anxiety, clarifying how threat appraisals amplify anxiety while resource appraisals modulate it. Studies examining workplace technology transitions find that appraisal patterns predict not just initial reactions but sustained engagement over time—precisely the longitudinal dynamics relevant to competency development.
Within educational settings, cognitive appraisal theory helps explain teachers’ varied responses to professional learning demands and pedagogical innovations [24]. Teachers facing identical professional development opportunities make different appraisals based on their prior experiences, self-efficacy beliefs, and perceived organizational support. Those appraising the learning challenge as exceeding their coping resources experience greater anxiety and engage less persistently, while those judging resources as adequate invest more effort and demonstrate greater skill acquisition. Importantly, these patterns appear recursive: initial appraisals shape engagement, which produces experiences that inform subsequent appraisals, creating either virtuous or vicious cycles.
Applications of cognitive appraisal theory to teacher professional development have yielded valuable insights. Research on curriculum reform implementation demonstrates that teachers who appraise new requirements as challenges rather than threats show greater willingness to modify instructional practices [24]. Studies examining teacher burnout reveal that appraisal patterns mediate the relationship between job demands and emotional exhaustion, with resource appraisals serving protective functions [25]. In technology integration contexts specifically, teachers’ appraisals of their capacity to master new tools predict both initial adoption decisions and sustained implementation efforts [23]. These applications establish theoretical continuity between general stress research and educational contexts, providing foundation for examining how appraisal processes operate in digital literacy development.
Drawing on this theoretical foundation, we propose a model wherein teacher digital technology anxiety influences digital literacy development through cognitive appraisal mechanisms. Specifically, we hypothesize that anxiety does not directly impede literacy growth but rather operates through interpretive pathways. When teachers experience digital technology anxiety, this affective state colors their primary appraisal of technological demands, increasing the likelihood of threat interpretations (perceiving digital tasks as potentially harmful to professional identity or efficacy) rather than challenge interpretations (viewing them as opportunities for growth) [26]. These threat appraisals then cascade into secondary appraisals, where anxious teachers underestimate their coping resources, doubting their capacity to master new tools or recover from implementation setbacks.
The appraisal cascade shapes subsequent coping strategies, which directly impact learning engagement and skill development. Teachers making threat appraisals and perceiving inadequate resources tend toward emotion-focused or avoidance coping—minimizing technology use, delegating technical tasks, or experiencing cognitive interference during digital instruction [25]. Such coping patterns limit practice opportunities and reduce the experiential learning necessary for literacy development. Conversely, teachers appraising situations as challenges with adequate resources employ problem-focused coping, actively seeking learning opportunities, experimenting with new approaches, and persisting through difficulties—behaviors that accelerate competency growth.
Based on this theoretical integration, we propose a conceptual model and advance formal hypotheses for empirical testing.
Research framework and hypotheses
Figure 1 presents the theoretical framework guiding this investigation. The model posits that digital technology anxiety initiates a cognitive appraisal cascade that ultimately shapes digital literacy development. Specifically, anxiety at Time 1 influences how teachers interpret technological demands through primary appraisal (threat versus challenge perceptions) and evaluate their capacity to respond through secondary appraisal (resource availability perceptions) at Time 2. These appraisals then determine coping strategy selection, which directly affects learning engagement and skill development observed at Time 3.
Fig. 1.
Theoretical framework of digital technology anxiety’s influence on digital literacy development through cognitive appraisal mechanisms
This framework contributes to existing scholarship in several ways. Theoretically, it extends cognitive appraisal theory beyond stress and health domains into educational technology contexts, specifying how interpretive processes shape professional competency development. Methodologically, the three-wave longitudinal design enables examination of temporal sequencing that cross-sectional studies cannot provide. Practically, by identifying cognitive mechanisms underlying anxiety’s effects, the framework points toward intervention targets beyond simple exposure or skills training.
We advance three core hypotheses:
H1: Teacher digital technology anxiety at baseline will negatively predict digital literacy development over time, demonstrating temporal precedence over reverse pathways.
H2: The relationship between digital technology anxiety and digital literacy development will be mediated by cognitive appraisal processes, specifically (a) threat appraisals and (b) resource appraisals, with anxiety increasing threat perceptions and decreasing perceived coping resources.
H3: Cognitive appraisals will predict subsequent coping strategies, which in turn will influence digital literacy development trajectories.
Testing these hypotheses through longitudinal design allows examination of not merely whether anxiety and literacy correlate but how cognitive interpretive processes link them dynamically across time.
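To make the mediation logic of H2 and H3 concrete, the sketch below simulates a hypothetical anxiety → threat appraisal → literacy chain and recovers the indirect effect with ordinary least squares. This is an illustration only; the path values, random seed, and estimator are assumptions for the sketch, not the study's actual cross-lagged SEM analysis.

```python
import numpy as np

# Toy simulation of the hypothesized chain:
#   anxiety (T1) -> threat appraisal (T2) -> digital literacy (T3).
# All path coefficients below are hypothetical illustration values.
rng = np.random.default_rng(42)
n = 1247  # matches the Wave-3 sample size reported in the paper

anxiety = rng.normal(size=n)                                      # T1 predictor
threat = 0.50 * anxiety + rng.normal(size=n)                      # T2 mediator
literacy = -0.40 * threat - 0.10 * anxiety + rng.normal(size=n)   # T3 outcome

def ols_slopes(y, *xs):
    """Coefficients of y regressed on xs (intercept included, then dropped)."""
    X = np.column_stack([np.ones(len(y))] + list(xs))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

a = ols_slopes(threat, anxiety)[0]                  # path a: anxiety -> threat
b, c_prime = ols_slopes(literacy, threat, anxiety)  # path b and direct effect c'
total = ols_slopes(literacy, anxiety)[0]            # total effect c

indirect = a * b
print(f"indirect = {indirect:.3f}, direct = {c_prime:.3f}, "
      f"proportion mediated = {indirect / total:.2f}")
```

With the simulated paths above, roughly two-thirds of the total effect runs through the mediator, mirroring the kind of "proportion mediated" quantity the paper reports (approximately 65% in the abstract).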
Research design and methods
Research design and sample
This study employed a three-wave longitudinal panel design to examine dynamic relationships between teacher digital technology anxiety, cognitive appraisals, and digital literacy development. We selected six-month intervals between measurement waves, a temporal spacing that balances several methodological considerations [27]. This interval proves sufficiently long to detect meaningful changes in digital literacy competencies while remaining short enough to minimize memory distortions in self-reported appraisal processes. Moreover, the six-month gap aligns with natural academic semester rhythms, reducing the likelihood that seasonal variations in teaching demands would confound our findings.
Sample recruitment followed a stratified cluster sampling approach designed to capture diversity across school contexts while maintaining practical feasibility [28]. We initially contacted educational bureaus in three provinces representing distinct economic development levels in China: Jiangsu Province (eastern region, high economic development), Henan Province (central region, moderate economic development), and Gansu Province (western region, lower economic development). This provincial selection strategy ensured representation across China’s regional economic spectrum, as development levels influence both technology infrastructure availability and professional development resources [29]. We secured cooperation from 68 schools across urban (n = 32), suburban (n = 21), and rural (n = 15) areas, encompassing 24 primary schools, 28 middle schools, and 16 high schools.
Within each participating school, we invited all in-service teachers who regularly incorporated digital technologies into their instruction to participate. Digital technologies were operationally defined as electronic tools and platforms used for educational purposes, including but not limited to: learning management systems (e.g., Moodle, Blackboard), multimedia presentation software (e.g., PowerPoint, Prezi), online collaboration platforms (e.g., Tencent Meeting, DingTalk), educational applications (e.g., Rain Classroom, Xuexitong), interactive whiteboards, and digital content creation tools. The inclusion criterion required using such technologies at least once weekly in instructional activities. This criterion ensured that participants had sufficient technology engagement to experience meaningful anxiety and to exhibit observable literacy development.
As shown in Fig. 2, the first wave (T1) occurred in September 2023, coinciding with the start of the academic year when teachers typically engage with new technological initiatives. The second wave (T2) was administered in March 2024, and the third wave (T3) in September 2024. At T1, we distributed surveys to 1,856 teachers, receiving 1,683 valid responses (response rate: 90.7%). Subsequent waves presented greater challenges: T2 yielded 1,429 valid responses from the original cohort, while T3 obtained 1,247 responses.
Fig. 2.
Three-wave longitudinal data collection process
These numbers reflect an overall attrition rate calculated using the standard formula:
$$\text{Attrition rate} = \frac{N_{T1} - N_{T3}}{N_{T1}} \times 100\% \tag{1}$$

where $N_{T1}$ and $N_{T3}$ denote the numbers of valid responses at the first and third waves. Applying this formula yields an attrition rate of (1,683 − 1,247)/1,683 = 25.9%, falling within acceptable ranges for educational longitudinal research [30].
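Equation 1 applied to the reported wave counts:

```python
# Attrition rate (Eq. 1) from the wave-level valid-response counts reported above
n_t1, n_t3 = 1683, 1247                    # valid responses at T1 and T3
attrition_pct = (n_t1 - n_t3) / n_t1 * 100
print(round(attrition_pct, 1))             # -> 25.9
```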
Table 1 presents detailed demographic characteristics of our sample across all three waves. As presented in Table 1, the sample comprised 68.3% female and 31.7% male teachers, broadly reflecting gender distributions in China’s teaching workforce. Age distribution showed concentration in the 30–45 years range (61.4%), with teaching experience averaging 12.6 years. Educational attainment skewed toward bachelor’s degrees (72.8%), while school types and teaching levels represented diverse contexts from primary through high school settings.
Table 1.
Demographic characteristics of research sample across three waves
| Variable | T1 (n = 1,683) | T2 (n = 1,429) | T3 (n = 1,247) | Overall (n = 1,247) |
|---|---|---|---|---|
| Gender | ||||
| Male | 534 (31.7%) | 449 (31.4%) | 391 (31.4%) | 391 (31.4%) |
| Female | 1,149 (68.3%) | 980 (68.6%) | 856 (68.6%) | 856 (68.6%) |
| Age | ||||
| Under 30 | 286 (17.0%) | 238 (16.7%) | 204 (16.4%) | 204 (16.4%) |
| 30–45 | 1,034 (61.4%) | 879 (61.5%) | 768 (61.6%) | 768 (61.6%) |
| Over 45 | 363 (21.6%) | 312 (21.8%) | 275 (22.0%) | 275 (22.0%) |
| Teaching Experience | ||||
| 0–5 years | 371 (22.1%) | 309 (21.6%) | 265 (21.2%) | 265 (21.2%) |
| 6–15 years | 758 (45.0%) | 646 (45.2%) | 567 (45.5%) | 567 (45.5%) |
| Over 15 years | 554 (32.9%) | 474 (33.2%) | 415 (33.3%) | 415 (33.3%) |
| Education Level | ||||
| Associate degree | 219 (13.0%) | 183 (12.8%) | 157 (12.6%) | 157 (12.6%) |
| Bachelor’s degree | 1,225 (72.8%) | 1,042 (72.9%) | 908 (72.8%) | 908 (72.8%) |
| Master’s or above | 239 (14.2%) | 204 (14.3%) | 182 (14.6%) | 182 (14.6%) |
| School Type | ||||
| Urban | 897 (53.3%) | 763 (53.4%) | 668 (53.6%) | 668 (53.6%) |
| Suburban | 463 (27.5%) | 392 (27.4%) | 339 (27.2%) | 339 (27.2%) |
| Rural | 323 (19.2%) | 274 (19.2%) | 240 (19.2%) | 240 (19.2%) |
| Teaching Level | ||||
| Primary | 618 (36.7%) | 524 (36.7%) | 457 (36.6%) | 457 (36.6%) |
| Middle | 687 (40.8%) | 585 (40.9%) | 512 (41.1%) | 512 (41.1%) |
| High | 378 (22.5%) | 320 (22.4%) | 278 (22.3%) | 278 (22.3%) |
To assess whether attrition introduced systematic bias, we conducted Little’s MCAR test and compared demographic characteristics between retained participants and those lost to follow-up [31]. Results indicated no significant differences across gender, age, teaching experience, or baseline anxiety levels (all p > .05), suggesting that attrition occurred randomly rather than systematically. This finding enhances confidence that our longitudinal analyses reflect genuine developmental processes rather than sample selection artifacts.
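The retained-versus-lost comparisons can be illustrated with a simple 2×2 chi-square on the gender counts recoverable from Table 1; this is a sketch of one such check, not the study's full MCAR analysis.

```python
# Gender by retention status: T3 completers vs. teachers lost after T1.
# Counts come from Table 1; the "lost" column is T1 minus T3.
retained = {"male": 391, "female": 856}
lost = {"male": 534 - 391, "female": 1149 - 856}

a, b = retained["male"], retained["female"]
c, d = lost["male"], lost["female"]
n = a + b + c + d
# Pearson chi-square for a 2x2 table (no continuity correction)
chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
print(round(chi2, 2), chi2 < 3.841)  # below the .05 critical value -> no significant difference
```

Consistent with the reported results, the gender distribution of completers does not differ detectably from that of dropouts.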
Research instruments and measurement
We measured digital technology anxiety using an adapted version of the Computer Anxiety Rating Scale, modified for contemporary educational technology contexts [32]. The instrument comprises 15 items distributed across three dimensions: cognitive anxiety (reflecting intrusive thoughts and worry about technology use), affective anxiety (capturing emotional distress during technology encounters), and behavioral anxiety (assessing avoidance tendencies and performance impairment). Sample items include “I worry about making mistakes when using educational technology in class” and “I feel tense when I need to learn new digital tools.” Respondents rated each item on a 7-point Likert scale ranging from 1 (strongly disagree) to 7 (strongly agree), with higher scores indicating greater anxiety levels.
Digital literacy assessment drew upon multidimensional frameworks recognizing both technical and pedagogical competencies. We employed a 20-item scale encompassing four domains: technical-operational literacy (6 items, e.g., “I can troubleshoot common technical problems independently”), information literacy (5 items, e.g., “I critically evaluate online resources before using them in teaching”), pedagogical-digital literacy (5 items, e.g., “I design learning activities that effectively integrate digital tools”), and creative-innovative literacy (4 items, e.g., “I create original digital content to support student learning”) [33]. Teachers rated their competence on each item using a 7-point scale from 1 (not competent at all) to 7 (highly competent).
To assess cognitive appraisal processes, we adapted established measures from stress and coping literature to the technology context. Primary appraisal was captured through 8 items distinguishing threat perceptions (4 items) from challenge perceptions (4 items). Secondary appraisal employed 6 items measuring perceived coping resources, including self-efficacy, available support, and time adequacy [34]. Coping strategies were measured using 12 items across three approaches: problem-focused coping (4 items), emotion-focused coping (4 items), and avoidance coping (4 items).
Control variables included demographic characteristics (gender, age, teaching experience) and contextual factors (school type, teaching level) that prior research identified as potential confounds. We also controlled for baseline digital literacy levels when examining longitudinal change, ensuring that observed effects reflected genuine development rather than initial differences.
Reliability assessment employed Cronbach’s alpha coefficient, calculated using the standard formula:
$$\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_i^2}{\sigma_t^2}\right) \tag{2}$$

where $k$ represents the number of items, $\sigma_i^2$ denotes the variance of item $i$, and $\sigma_t^2$ indicates the total scale variance.
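Equation 2 can be computed directly from an item-response matrix; the sketch below uses simulated data (the sample size and item count are hypothetical, not the study's).

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Eq. 2: alpha = k/(k-1) * (1 - sum of item variances / total scale variance).
    `items` is a respondents-by-items matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # sigma_i^2 for each item
    total_var = items.sum(axis=1).var(ddof=1)      # sigma_t^2 of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 5 items sharing one latent factor, 200 respondents
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
items = latent + rng.normal(scale=0.7, size=(200, 5))
print(round(cronbach_alpha(items), 2))  # high internal consistency for correlated items
```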
Table 2 summarizes measurement properties across all three waves. As presented in Table 2, Cronbach’s alpha coefficients consistently exceeded 0.85 for all constructs across time points, demonstrating excellent internal consistency.
Table 2.
Measurement instruments and reliability-validity results
| Variable | Source | Items | Cronbach’s α (T1) | Factor Loadings Range | AVE | CR | CFA Fit Indices |
|---|---|---|---|---|---|---|---|
| Digital Technology Anxiety | Adapted from [32] | 15 | 0.91 | 0.68-0.85 | 0.58 | 0.91 | χ²/df = 2.34, CFI=0.96, TLI=0.95, RMSEA=0.05, SRMR=0.04 |
| - Cognitive Anxiety | | 5 | 0.87 | 0.71-0.82 | 0.56 | 0.86 | |
| - Affective Anxiety | | 5 | 0.89 | 0.73-0.85 | 0.61 | 0.89 | |
| - Behavioral Anxiety | | 5 | 0.85 | 0.68-0.79 | 0.53 | 0.85 | |
| Digital Literacy | Adapted from [33] | 20 | 0.93 | 0.69-0.87 | 0.57 | 0.93 | χ²/df = 2.18, CFI=0.97, TLI=0.96, RMSEA=0.04, SRMR=0.03 |
| - Technical-Operational | | 6 | 0.88 | 0.72-0.84 | 0.55 | 0.88 | |
| - Information Literacy | | 5 | 0.86 | 0.69-0.81 | 0.54 | 0.86 | |
| - Pedagogical-Digital | | 5 | 0.90 | 0.75-0.87 | 0.62 | 0.90 | |
| - Creative-Innovative | | 4 | 0.87 | 0.74-0.83 | 0.58 | 0.87 | |
| Threat Appraisal | Adapted from [34] | 4 | 0.88 | 0.76-0.86 | 0.63 | 0.87 | χ²/df = 1.89, CFI=0.98, TLI=0.97, RMSEA=0.03, SRMR=0.02 |
| Resource Appraisal | Adapted from [34] | 6 | 0.90 | 0.72-0.88 | 0.60 | 0.90 | |
| Coping Strategies | Adapted from [34] | 12 | 0.86 | 0.65-0.82 | 0.52 | 0.87 |
AVE Average Variance Extracted, CR Composite Reliability
All factor loadings significant at p < .001
Discriminant validity was established as the square root of AVE for each construct exceeded its correlations with other constructs
Confirmatory factor analysis evaluated structural validity using multiple fit indices, led by the normed chi-square:

$$\chi^2/df \tag{3}$$

where $\chi^2/df$ should be below 3, CFI and TLI above 0.90, RMSEA below 0.08, and SRMR below 0.08 for acceptable fit [35].
Convergent validity was supported by Average Variance Extracted (AVE) values exceeding the 0.50 threshold for all constructs, ranging from 0.52 to 0.63 [35]. Composite Reliability (CR) values ranged from 0.85 to 0.93, surpassing the recommended 0.70 criterion. Factor loadings for individual items ranged from 0.65 to 0.88, all statistically significant at p < .001. Discriminant validity was established through comparison of AVE square roots with inter-construct correlations; for all construct pairs, the square root of AVE exceeded the corresponding correlation coefficient, indicating adequate discrimination between constructs.
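The AVE and CR criteria can be computed directly from standardized factor loadings; the four loadings below are hypothetical illustrations (e.g., for a 4-item factor), not the study's estimates.

```python
# Convergent validity from standardized loadings (hypothetical values):
# AVE = mean of squared loadings; CR = (sum of loadings)^2 / ((sum)^2 + sum of error variances)
loadings = [0.76, 0.80, 0.86, 0.78]

ave = sum(l ** 2 for l in loadings) / len(loadings)
cr = sum(loadings) ** 2 / (sum(loadings) ** 2 + sum(1 - l ** 2 for l in loadings))
print(round(ave, 2), round(cr, 2))  # -> 0.64 0.88, both above the 0.50 / 0.70 thresholds
```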
The results in Table 2 indicate excellent model fit across all constructs. Furthermore, measurement invariance testing across the three time points confirmed configural, metric, and scalar invariance (ΔCFI < 0.01), ensuring that construct meanings remained stable throughout the study period and enabling meaningful longitudinal comparisons.
Data analysis strategy
Data preprocessing began with systematic examination of response patterns to identify potential quality issues. We excluded cases demonstrating straight-lining (identical responses across all items), implausible response times (completing the survey in under three minutes), or excessive missing data (more than 20% of items unanswered). For remaining missing values—which constituted less than 2% of the total dataset—we employed multiple imputation procedures generating five imputed datasets, with final parameter estimates pooled across imputations following Rubin’s rules. Outliers were identified using the Mahalanobis distance criterion, though none exceeded the critical threshold requiring removal.
Common method bias posed a potential concern given our reliance on self-report measures. We implemented both procedural and statistical remedies to address this issue. Procedurally, we emphasized response confidentiality, used varied response formats across constructs, and temporally separated measurements of predictor and outcome variables. Statistically, Harman’s single-factor test revealed that no single factor accounted for more than 35% of variance, suggesting common method bias did not severely compromise our findings [36].
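Harman's single-factor test amounts to inspecting the variance share of the first unrotated factor when all items are pooled; a sketch on simulated multi-factor data (the data and item structure are illustrative, not the study's).

```python
import numpy as np

# Simulate 12 items driven by 3 distinct latent factors (4 items each), plus noise
rng = np.random.default_rng(1)
n, k = 500, 12
data = rng.normal(size=(n, 3)).repeat(4, axis=1) + rng.normal(scale=1.0, size=(n, k))

# Share of total variance captured by the first unrotated factor of the item
# correlation matrix; a dominant first factor would suggest common method bias
corr = np.corrcoef(data, rowvar=False)
eigenvalues = np.linalg.eigvalsh(corr)[::-1]      # sorted descending
first_factor_share = eigenvalues[0] / eigenvalues.sum()
print(first_factor_share < 0.50)
```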
Table 3 summarizes our analytical approach across multiple levels of sophistication. Table 3 shows the progression from basic descriptive analyses through complex modeling techniques, each serving distinct purposes in testing our theoretical framework.
Table 3.
Summary of data analysis methods and statistical techniques
| Analysis Level | Specific Methods | Statistical Techniques | Analysis Purpose |
|---|---|---|---|
| Descriptive Analysis | Means, standard deviations, frequency distributions | SPSS 26.0 descriptive statistics | Characterize sample and variable distributions |
| Correlation Analysis | Pearson correlations, partial correlations | Bivariate and partial correlation matrices | Examine associations among variables across time |
| Cross-Lagged Analysis | Cross-lagged panel model (CLPM) | Structural equation modeling in Mplus 8.3 | Test temporal precedence and bidirectional effects |
| Mediation Analysis | Bootstrap mediation with parallel mediators | Bias-corrected bootstrap (5,000 samples) in PROCESS macro | Examine indirect effects through appraisal mechanisms |
| Robustness Tests | Alternative model specifications, subgroup analyses | Multi-group SEM, sensitivity analyses | Verify findings across different specifications |
Cross-lagged panel modeling served as our primary analytical strategy for examining temporal relationships between digital technology anxiety and digital literacy. In our analyses, we used manifest scale means (computed as the average of item scores for each construct) rather than latent constructs, following recommendations for panel models with established measurement properties and adequate sample sizes [37]. This approach enhances model parsimony while maintaining interpretability, given that our confirmatory factor analyses confirmed strong measurement properties across all time points.
We selected the traditional Cross-Lagged Panel Model (CLPM) rather than the Random Intercept Cross-Lagged Panel Model (RI-CLPM) for several methodological and theoretical reasons. First, our primary research interest lies in between-person differences in how anxiety levels predict subsequent literacy development across the teacher population, rather than within-person fluctuations from individual baselines [38]. Second, our theoretical framework derived from cognitive appraisal theory emphasizes how teachers’ characteristic anxiety levels shape their interpretive processes and developmental trajectories—a between-person phenomenon. Third, preliminary analyses indicated that RI-CLPM models encountered convergence difficulties with our data structure, potentially due to the relatively short time intervals and high construct stability observed. We acknowledge that RI-CLPM offers advantages for separating trait-like stability from state-like fluctuations, and we address this as a limitation warranting future investigation with designs optimized for within-person analyses.
Given that participants were nested within 68 schools, we assessed potential clustering effects by calculating intraclass correlation coefficients (ICCs) for primary outcome variables. ICCs ranged from 0.04 to 0.08 for digital technology anxiety and 0.05 to 0.09 for digital literacy across measurement waves, indicating that approximately 4–9% of variance resided at the school level. While these values suggest modest clustering effects, we employed cluster-robust standard errors in all structural equation models to account for non-independence of observations within schools [39]. Sensitivity analyses using multilevel structural equation modeling produced substantively identical results, supporting the robustness of our findings to alternative analytic approaches.
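The one-way ICC(1) used to gauge clustering can be obtained from an ANOVA variance decomposition; the sketch below simulates a hypothetical 68-school structure with modest between-school variance (all values illustrative).

```python
import numpy as np

def icc1(values: np.ndarray, groups: np.ndarray) -> float:
    """One-way ANOVA ICC(1): (MSB - MSW) / (MSB + (k-1)*MSW), k = mean cluster size."""
    grand = values.mean()
    ids = np.unique(groups)
    k = values.size / ids.size
    ms_between = sum(np.sum(groups == g) * (values[groups == g].mean() - grand) ** 2
                     for g in ids) / (ids.size - 1)
    ms_within = sum(np.sum((values[groups == g] - values[groups == g].mean()) ** 2)
                    for g in ids) / (values.size - ids.size)
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical: 68 schools, 18 teachers each, small school-level effect
rng = np.random.default_rng(2)
schools = np.repeat(np.arange(68), 18)
school_effect = rng.normal(scale=0.25, size=68)[schools]
scores = 3.8 + school_effect + rng.normal(scale=1.0, size=schools.size)
print(0.0 < icc1(scores, schools) < 0.15)  # modest clustering, comparable to the 0.04-0.09 range
```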
The basic cross-lagged model can be expressed as:
$$DL_{t+1} = \alpha_1 + \beta_1 DL_t + \gamma_1 DA_t + \varepsilon_1 \tag{4}$$

$$DA_{t+1} = \alpha_2 + \beta_2 DA_t + \gamma_2 DL_t + \varepsilon_2 \tag{5}$$

where $DL_t$ represents digital literacy at time $t$, $DA_t$ denotes digital technology anxiety at time $t$, $\beta_1$ and $\beta_2$ capture autoregressive effects, $\gamma_1$ and $\gamma_2$ represent cross-lagged effects, and $\varepsilon_1$ and $\varepsilon_2$ are error terms. This specification allowed us to determine whether anxiety predicted subsequent literacy changes beyond literacy’s own temporal stability, and conversely, whether literacy predicted anxiety changes.
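As a simplified illustration of Eqs. 4 and 5, the two equations can be estimated by ordinary least squares on simulated data; the study itself fit the full panel model in Mplus 8.3, so the coefficients below are illustrative, not the reported estimates.

```python
import numpy as np

# Simulate two waves with known autoregressive (0.6) and cross-lagged (-0.2) effects
rng = np.random.default_rng(3)
n = 1000
anx_t1 = rng.normal(size=n)
lit_t1 = -0.5 * anx_t1 + rng.normal(scale=0.85, size=n)
lit_t2 = 0.6 * lit_t1 - 0.2 * anx_t1 + rng.normal(scale=0.6, size=n)

def ols(y, predictors):
    """Least-squares coefficients with an intercept column prepended."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Eq. 4: literacy at t+1 on prior literacy (autoregressive) and prior anxiety (cross-lagged)
_, b_auto, g_cross = ols(lit_t2, [lit_t1, anx_t1])
print(b_auto, g_cross)  # should recover roughly 0.6 and -0.2
```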
Mediation analysis tested our theoretical assertion that cognitive appraisal processes explain how anxiety influences literacy development. We constructed parallel mediation models incorporating threat appraisal and resource appraisal as simultaneous mediators, using bias-corrected bootstrap procedures with 5,000 resamples to generate confidence intervals for indirect effects [40]. The indirect effect through a given mediator can be expressed as:
$$\text{Indirect effect} = a \times b \tag{6}$$

where $a$ represents the path from anxiety to the mediator (appraisal), and $b$ indicates the path from the mediator to digital literacy. Confidence intervals excluding zero would indicate significant mediation.
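A minimal percentile-bootstrap sketch of Eq. 6 on simulated data follows; the study used bias-corrected bootstrapping with 5,000 resamples in PROCESS, and the variable names and effect sizes here are hypothetical.

```python
import numpy as np

# Simulate a single-mediator model: anxiety -> mediator (a) -> literacy (b), plus a direct path
rng = np.random.default_rng(4)
n = 500
anxiety = rng.normal(size=n)
mediator = -0.4 * anxiety + rng.normal(scale=0.9, size=n)
literacy = 0.3 * mediator - 0.1 * anxiety + rng.normal(scale=0.9, size=n)

def resid(y, x):
    """Residuals of y after removing its linear dependence on x."""
    return y - np.polyval(np.polyfit(x, y, 1), x)

def indirect(idx):
    x, m, y = anxiety[idx], mediator[idx], literacy[idx]
    a = np.polyfit(x, m, 1)[0]                       # a: anxiety -> mediator
    b = np.polyfit(resid(m, x), resid(y, x), 1)[0]   # b: mediator -> literacy, given anxiety
    return a * b

boot = [indirect(rng.integers(0, n, n)) for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(hi < 0)  # CI entirely below zero -> significant negative indirect effect
```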
Throughout all analyses, we controlled for demographic variables (gender, age, teaching experience) and contextual factors (school type, teaching level) by including them as covariates in structural models. Analyses were conducted using SPSS 26.0 for preliminary descriptive statistics, Mplus 8.3 for structural equation modeling and cross-lagged analyses, and the PROCESS macro (Model 4 for simple mediation, Model 6 for parallel mediation) in SPSS for mediation testing. Maximum likelihood estimation with robust standard errors accommodated minor deviations from multivariate normality. Model fit evaluation employed the same criteria described in Sect. 3.2, with multiple fit indices considered jointly rather than relying on any single metric.
Research results and analysis
Descriptive statistics and correlation analysis
Preliminary examination of variable distributions across the three measurement waves revealed patterns consistent with our theoretical expectations. Table 4 presents comprehensive descriptive statistics and intercorrelations for all focal variables. As presented in Table 4, digital technology anxiety demonstrated mean scores of 3.84 (SD = 1.12) at T1, 3.67 (SD = 1.15) at T2, and 3.52 (SD = 1.18) at T3, suggesting a gradual decline over the 12-month study period. Digital literacy showed an inverse temporal pattern, with means increasing from 4.23 (SD = 0.89) at T1 to 4.51 (SD = 0.92) at T2 and 4.76 (SD = 0.95) at T3. Skewness and kurtosis values for all variables remained within acceptable ranges (|skewness| < 2.0, |kurtosis| < 7.0), indicating that distributions approximated normality sufficiently for subsequent parametric analyses [41].
Table 4.
Descriptive statistics and correlations among study variables
| Variable | M | SD | 1 | 2 | 3 | 4 | 5 | 6 |
|---|---|---|---|---|---|---|---|---|
| 1. Anxiety_T1 | 3.84 | 1.12 | - | |||||
| 2. Anxiety_T2 | 3.67 | 1.15 | 0.68*** | - | ||||
| 3. Anxiety_T3 | 3.52 | 1.18 | 0.61*** | 0.71*** | - | |||
| 4. Literacy_T1 | 4.23 | 0.89 | − 0.52*** | − 0.41*** | − 0.37*** | - | ||
| 5. Literacy_T2 | 4.51 | 0.92 | − 0.44*** | − 0.56*** | − 0.45*** | 0.69*** | - | |
| 6. Literacy_T3 | 4.76 | 0.95 | − 0.39*** | − 0.47*** | − 0.58*** | 0.64*** | 0.73*** | - |
N = 1,247. ***p < .001, **p < .01, *p < .05
Paired-samples t-tests confirmed that both anxiety and literacy exhibited statistically significant changes across adjacent time points. Digital technology anxiety decreased significantly from T1 to T2 (t = 4.82, p < .001, Cohen’s d = 0.15) and from T2 to T3 (t = 4.19, p < .001, Cohen’s d = 0.13), though effect sizes indicated relatively modest magnitudes. Digital literacy increased significantly across both intervals (T1 to T2: t = -9.35, p < .001, d = 0.31; T2 to T3: t = -8.67, p < .001, d = 0.27), with somewhat larger effect sizes suggesting more pronounced developmental trajectories [42].
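The paired-samples statistics follow standard formulas: a t statistic on the wave-to-wave differences, and Cohen's d for paired data as the mean difference over the standard deviation of the differences. The sketch below uses simulated scores whose means and SDs mimic Table 4; the data are hypothetical.

```python
import numpy as np

# Hypothetical two-wave anxiety scores with a modest mean decline
rng = np.random.default_rng(5)
n = 1247
anx_t1 = rng.normal(3.84, 1.12, n)
anx_t2 = anx_t1 - 0.17 + rng.normal(scale=1.0, size=n)

diff = anx_t1 - anx_t2
t_stat = diff.mean() / (diff.std(ddof=1) / np.sqrt(n))   # paired t statistic
d_z = diff.mean() / diff.std(ddof=1)                     # paired-samples Cohen's d
print(t_stat > 2.0, 0 < d_z < 0.5)  # significant but small effect, as in the study
```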
Figure 3 illustrates the contrasting developmental trajectories of digital technology anxiety and digital literacy over the study period. The downward slope for anxiety juxtaposed against the upward trajectory for literacy provides initial visual evidence consistent with our hypothesis that these constructs develop in opposing directions. However, whether these patterns reflect causal relationships or merely coincidental trends requires more sophisticated modeling.
Fig. 3.
Temporal trends in digital technology anxiety and digital literacy across three measurement waves
Correlation analyses yielded several noteworthy findings. Within-construct stability coefficients—correlations between the same variable measured at adjacent time points—ranged from 0.61 to 0.73, indicating moderate-to-strong temporal consistency [43]. This pattern suggests that while teachers’ anxiety and literacy levels showed considerable stability, sufficient variance remained to detect developmental changes. More critically for our theoretical model, cross-construct correlations consistently demonstrated negative associations between anxiety and literacy. Contemporaneous correlations (measuring the same time point) ranged from − 0.52 to − 0.58, all statistically significant at p < .001.
Cross-lagged correlations proved particularly informative regarding potential directional relationships. Anxiety at earlier time points negatively predicted literacy at subsequent points (e.g., Anxiety_T1 with Literacy_T2: r = − .44; Anxiety_T2 with Literacy_T3: r = − .47), even when controlling for literacy’s autoregressive effects through partial correlations. Interestingly, reverse pathways also emerged, though with somewhat weaker magnitudes: earlier literacy levels negatively predicted subsequent anxiety (e.g., Literacy_T1 with Anxiety_T2: r = − .41; Literacy_T2 with Anxiety_T3: r = − .45). These bidirectional associations hint at potentially reciprocal influences that our cross-lagged panel models would need to disentangle.
The stability of correlation patterns across time—negative relationships maintained consistent strength at all measurement occasions—suggested that the anxiety-literacy association represented a robust phenomenon rather than a time-specific artifact. This temporal consistency strengthened confidence in pursuing causal modeling while highlighting the need to determine which variable exerted stronger prospective influence on the other.
Cross-lagged panel model testing
To rigorously examine directional relationships between digital technology anxiety and digital literacy development, we constructed a cross-lagged panel model incorporating data from all three measurement waves. This analytical approach offers distinct advantages over simpler regression techniques by simultaneously modeling temporal stability (autoregressive paths), bidirectional influences (cross-lagged paths), and within-time associations (contemporaneous correlations), thereby providing clearer evidence regarding causal precedence [38]. The model included demographic and contextual control variables as covariates predicting both anxiety and literacy at each time point.
Model fit assessment indicated excellent correspondence between our theoretical specification and the observed data. The chi-square test yielded χ² = 287.34 (df = 124, p < .001), with the ratio χ²/df = 2.32 falling below the recommended threshold of 3.0 [44].
Comparative fit indices exceeded conventional benchmarks: CFI = 0.97 and TLI = 0.96 (both > 0.95), while absolute fit indices demonstrated close approximation: RMSEA = 0.04 (90% CI [0.03, 0.05]) and SRMR = 0.03 (both < 0.05). These multiple indicators converged in supporting the model’s adequacy for interpreting parameter estimates.
As shown in Fig. 4, the cross-lagged panel model reveals asymmetric patterns of prospective influence between anxiety and digital literacy. Autoregressive paths confirmed substantial temporal stability for both constructs across six-month intervals, with coefficients ranging from 0.66 to 0.71 (all p < .001). This stability is unsurprising given that psychological characteristics and professional competencies tend to exhibit considerable continuity over moderate time spans. More critically, these strong autoregressive effects make the significant cross-lagged paths particularly meaningful—they represent predictive effects above and beyond each construct’s natural tendency toward stability.
Fig. 4.
Cross-lagged panel model showing standardized path coefficients
Table 5 summarizes all path coefficients with associated inferential statistics. Table 5 shows that digital technology anxiety consistently predicted subsequent decreases in digital literacy development. Specifically, T1 anxiety negatively predicted T2 literacy (β = − 0.23, SE = 0.04, Z = -5.75, p < .001, 95% CI [-0.31, − 0.15]), even after controlling for T1 literacy levels. This pattern replicated across the second temporal interval: T2 anxiety negatively predicted T3 literacy (β = − 0.19, SE = 0.04, Z = -4.75, p < .001, 95% CI [-0.27, − 0.11]). The standardized coefficients, while modest in absolute magnitude, represent meaningful effects when considering that they reflect incremental prediction beyond powerful autoregressive influences [45].
Table 5.
Cross-lagged panel model path coefficients and significance tests
| Path | β | SE | Z | p | 95% CI |
|---|---|---|---|---|---|
| Autoregressive Paths | |||||
| Anxiety_T1 → Anxiety_T2 | 0.66*** | 0.03 | 22.00 | < 0.001 | [0.60, 0.72] |
| Anxiety_T2 → Anxiety_T3 | 0.69*** | 0.03 | 23.00 | < 0.001 | [0.63, 0.75] |
| Literacy_T1 → Literacy_T2 | 0.67*** | 0.03 | 22.33 | < 0.001 | [0.61, 0.73] |
| Literacy_T2 → Literacy_T3 | 0.71*** | 0.03 | 23.67 | < 0.001 | [0.65, 0.77] |
| Cross-Lagged Paths | |||||
| Anxiety_T1 → Literacy_T2 | − 0.23*** | 0.04 | -5.75 | < 0.001 | [-0.31, − 0.15] |
| Anxiety_T2 → Literacy_T3 | − 0.19*** | 0.04 | -4.75 | < 0.001 | [-0.27, − 0.11] |
| Literacy_T1 → Anxiety_T2 | − 0.08* | 0.04 | -2.00 | 0.046 | [-0.16, − 0.01] |
| Literacy_T2 → Anxiety_T3 | − 0.06 | 0.04 | -1.50 | 0.134 | [-0.14, 0.02] |
| Contemporaneous Correlations | |||||
| Anxiety_T1 ↔ Literacy_T1 | − 0.52*** | 0.03 | -17.33 | < 0.001 | [-0.58, − 0.46] |
| Anxiety_T2 ↔ Literacy_T2 | − 0.45*** | 0.03 | -15.00 | < 0.001 | [-0.51, − 0.39] |
N = 1,247, β standardized path coefficient, SE standard error, CI confidence interval
***p < .001, **p < .01, *p < .05
Examining reverse pathways—from literacy to subsequent anxiety—revealed markedly different patterns. While T1 literacy showed a small negative prediction of T2 anxiety (β = − 0.08, SE = 0.04, Z = -2.00, p = .046), this effect barely reached statistical significance and fell substantially below conventional thresholds for practical significance [46]. More tellingly, T2 literacy failed to significantly predict T3 anxiety (β = − 0.06, SE = 0.04, Z = -1.50, p = .134), with confidence intervals spanning zero. This asymmetry in cross-lagged effects provides compelling evidence that anxiety primarily drives literacy development rather than the reverse—teachers’ anxiety levels shape subsequent competency growth more powerfully than competency shapes subsequent anxiety.
Contemporaneous correlations at T1 and T2 indicated negative within-time associations between constructs, consistent with our earlier correlation analyses. These correlations do not imply causation but rather reflect shared variance at each measurement occasion, potentially stemming from unmeasured third variables or reciprocal influences operating at shorter timescales than our six-month measurement intervals captured.
Mediation effect testing
Having established that digital technology anxiety prospectively predicts digital literacy development, we proceeded to examine psychological mechanisms underlying this relationship. Grounded in cognitive appraisal theory, we hypothesized that anxiety influences literacy growth through interpretive processes—specifically, through threat appraisals, resource appraisals, and consequent coping strategies. We constructed a multiple mediation model wherein T1 anxiety served as the predictor, T2 threat appraisal, resource appraisal, and coping strategies functioned as parallel and sequential mediators, and T3 digital literacy represented the outcome variable [47]. This temporal sequencing—predictor at wave 1, mediators at wave 2, outcome at wave 3—strengthens causal inference by ensuring that mediating processes occur after the predictor but before the outcome.
Mediation testing employed bias-corrected bootstrap procedures with 5,000 resamples to generate confidence intervals for indirect effects, a method that makes no distributional assumptions about sampling distributions and provides more accurate Type I error rates than traditional Sobel tests [48]. We estimated total effects (anxiety → literacy without mediators), direct effects (anxiety → literacy controlling for mediators), and specific indirect effects through each proposed mediating pathway. Significant mediation requires that confidence intervals for indirect effects exclude zero.
Figure 5 presents the complete mediation model with standardized path coefficients and significance indicators. The model demonstrates multiple pathways through which anxiety influences literacy development, consistent with our theoretical framework. Threat appraisal items were reverse-coded, so higher scores indicate lower threat perception (i.e., more benign interpretation of technological demands). Under this coding, T1 anxiety strongly predicted T2 threat appraisal (β = − 0.38, p < .001): teachers with higher anxiety scored lower, perceiving technological demands as more threatening to their professional competence, whereas less anxious teachers interpreted the same demands more benignly. Anxiety also negatively predicted resource appraisal (β = − 0.31, p < .001), suggesting that anxious teachers perceived fewer coping resources available. Additionally, anxiety predicted less adaptive coping strategies (β = − 0.26, p < .001), with anxious teachers relying more on avoidance and emotion-focused coping than on problem-focused approaches.
Fig. 5.
Mediation model showing path coefficients and effect decomposition
These appraisal processes, in turn, predicted T3 digital literacy. Given the reverse coding described above, threat appraisal positively predicted literacy development (β = 0.24, p < .001), such that more benign interpretations were associated with greater literacy growth, and resource appraisal likewise positively predicted literacy (β = 0.29, p < .001). Adaptive coping strategies also positively predicted literacy growth (β = 0.15, p < .01), though with smaller magnitude than appraisal variables. Notably, after accounting for these mediating pathways, the direct effect of anxiety on literacy diminished substantially but remained significant (β = − 0.12, p < .01), indicating partial rather than complete mediation.
Table 6 summarizes the decomposition of total effects into direct and specific indirect components. As presented in Table 6, the total effect of anxiety on literacy development was − 0.34 (SE = 0.04, 95% CI [-0.42, − 0.26]), representing the combined influence through all pathways. The direct effect accounted for approximately 35% of this total (-0.12), while indirect effects through mediating variables comprised the remaining 65% (-0.22). This pattern supports our theoretical assertion that anxiety operates largely through cognitive interpretive mechanisms rather than exerting purely direct influences on competency development [49].
Table 6.
Mediation effect decomposition and bootstrap test results
| Effect Type | Effect | SE | 95% CI Lower | 95% CI Upper | Relative Effect % | κ² |
|---|---|---|---|---|---|---|
| Total Effect | − 0.34*** | 0.04 | − 0.42 | − 0.26 | 100.0 | - |
| Direct Effect | − 0.12** | 0.04 | − 0.20 | − 0.04 | 35.3 | - |
| Total Indirect Effect | − 0.22*** | 0.03 | − 0.28 | − 0.16 | 64.7 | 0.18 |
| Threat Appraisal (single) | − 0.09*** | 0.02 | − 0.13 | − 0.05 | 26.5 | 0.08 |
| Resource Appraisal (single) | − 0.09*** | 0.02 | − 0.13 | − 0.05 | 26.5 | 0.08 |
| Coping Strategies (single) | − 0.04** | 0.01 | − 0.06 | − 0.02 | 11.8 | 0.03 |
| Threat → Resource (chain) | − 0.02* | 0.01 | − 0.04 | − 0.01 | 5.9 | 0.02 |
| Threat → Coping (chain) | − 0.01* | 0.01 | − 0.03 | − 0.004 | 2.9 | 0.01 |
| Resource → Coping (chain) | − 0.01* | 0.01 | − 0.03 | − 0.002 | 2.9 | 0.01 |
N = 1,247, Bootstrap samples 5,000, SE standard error, CI confidence interval, κ² kappa-squared effect size for indirect effects [45].
Confidence intervals not containing zero indicate significant indirect effects
***p < .001, **p < .01, *p < .05
The kappa-squared (κ²) effect size provides a standardized measure of mediation effect magnitude, with values of 0.01, 0.09, and 0.25 representing small, medium, and large effects respectively [45]. The total indirect effect (κ² = 0.18) approaches a medium effect size, while individual mediating pathways through threat appraisal and resource appraisal each demonstrate small-to-medium effects (κ² = 0.08).
Among specific indirect pathways, threat appraisal and resource appraisal demonstrated roughly equivalent mediating strength, each accounting for approximately 27% of the total effect. The findings in Table 6 indicate that anxiety’s influence operated substantially through both threat interpretations (seeing technology demands as dangerous to professional identity) and resource evaluations (perceiving inadequate capacity to meet demands). Coping strategies played a smaller but still significant mediating role (12% of total effect), suggesting that while behavioral responses matter, the interpretive processes preceding them carry greater weight in determining developmental outcomes.
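The decomposition percentages above follow directly from the point estimates in Table 6:

```python
# Effect decomposition check (values from Table 6): direct plus total indirect
# should reproduce the total effect, and the relative shares follow by division.
total, direct, indirect = -0.34, -0.12, -0.22
assert abs((direct + indirect) - total) < 1e-9
print(round(direct / total * 100, 1), round(indirect / total * 100, 1))  # -> 35.3 64.7
```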
Chain mediation paths—wherein anxiety influenced literacy through sequential mediators—emerged as statistically significant but relatively minor contributors. The threat-to-resource chain (anxiety → threat appraisal → resource appraisal → literacy) accounted for 6% of the total effect, indicating that threat interpretations partially shaped subsequent resource evaluations. Chains involving coping strategies contributed even smaller proportions (3% each), likely because coping represents a more distal consequence of appraisal processes. Collectively, these findings illuminate how anxiety triggers a cascade of cognitive interpretations and behavioral adaptations that cumulatively constrain professional learning and competency development.
Discussion
Summary of key findings
This longitudinal investigation employed a three-wave panel design spanning one academic year to examine how teacher digital technology anxiety influences digital literacy development through cognitive appraisal mechanisms. Our findings converge on several key conclusions that advance both theoretical understanding and practical intervention design.
Digital technology anxiety at earlier time points predicted decreased digital literacy development at subsequent assessments, with cross-lagged effects (β = −0.19 to −0.23) demonstrating clear temporal precedence over reverse pathways. The asymmetry between forward and reverse paths proved particularly striking: while anxiety consistently predicted literacy changes across both six-month intervals, literacy’s influence on subsequent anxiety was either marginally significant or non-significant. This pattern clarifies a directionality question that prior cross-sectional research could not resolve.
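The directional asymmetry described above can be illustrated with a minimal two-wave simulation: a deliberately simplified stand-in for the full cross-lagged panel model, with hypothetical coefficients chosen so that anxiety drives later literacy while the reverse path is absent.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1247  # matches the study's N; all data and coefficients are simulated

# Two waves, generated so that anxiety predicts later literacy (cross-lag ~ -0.2)
# while the reverse path is zero -- the asymmetry a CLPM is designed to detect.
anx1 = rng.normal(0, 1, n)
lit1 = rng.normal(0, 1, n)
anx2 = 0.6 * anx1 + rng.normal(0, 0.8, n)              # no lit1 -> anx2 path
lit2 = 0.6 * lit1 - 0.2 * anx1 + rng.normal(0, 0.8, n)

def std_betas(y, x1, x2):
    """Standardized slopes of y regressed on x1 and x2 jointly."""
    z = lambda v: (v - v.mean()) / v.std()
    X = np.column_stack([np.ones(len(y)), z(x1), z(x2)])
    return np.linalg.lstsq(X, z(y), rcond=None)[0][1:]

_, cross_anx = std_betas(lit2, lit1, anx1)  # beta: T1 anxiety -> T2 literacy
_, cross_lit = std_betas(anx2, anx1, lit1)  # beta: T1 literacy -> T2 anxiety
print(f"anxiety->literacy beta = {cross_anx:.2f}; literacy->anxiety beta = {cross_lit:.2f}")
```

In this toy setup the recovered anxiety-to-literacy coefficient is clearly negative while the literacy-to-anxiety coefficient hovers near zero, the same qualitative pattern the cross-lagged analyses report.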
Cognitive appraisal processes—specifically threat appraisals and resource appraisals—mediated approximately 65% of anxiety’s total effect on literacy growth. Teachers experiencing higher anxiety interpreted technological demands as more threatening to their professional competence and perceived fewer coping resources available, patterns that subsequently constrained their competency development. Threat appraisal and resource appraisal demonstrated roughly equivalent mediating strength, each accounting for approximately 27% of the total effect. Coping strategies contributed an additional 12% through indirect pathways. These results support all three hypotheses and illuminate the psychological mechanisms underlying a relationship previously documented only correlationally.
Theoretical implications
Our findings advance theoretical understanding in several important directions. First, we established temporal precedence in the anxiety-literacy relationship through rigorous longitudinal design. While prior research documented cross-sectional correlations, our cross-lagged analyses demonstrated that anxiety predicts subsequent literacy changes more strongly than literacy predicts anxiety changes [50]. This finding contradicts theoretical accounts positioning anxiety as merely reactive to competence deficits; instead, anxiety appears to function primarily as a driver of developmental trajectories.
Second, we extended cognitive appraisal theory into teacher digital literacy development contexts, demonstrating that this framework—originally developed for understanding stress responses—provides valuable explanatory power for professional competency growth processes. The finding that approximately 65% of anxiety’s effect operates through interpretive mechanisms represents a substantial theoretical advance, shifting focus from anxiety as a direct barrier to anxiety as initiating cognitive cascades that shape learning engagement [51]. This temporal extension matters because it suggests that momentary interpretations crystallize into relatively stable processing tendencies that channel subsequent learning engagement.
Third, by specifying the roles of threat appraisal and resource appraisal as parallel mediators with roughly equivalent strength, we provide nuanced understanding of the psychological processes connecting affective states to developmental outcomes. Our findings align meaningfully with Self-Determination Theory’s emphasis on basic psychological needs [52]. Threat appraisals may undermine teachers’ sense of competence—the need to feel effective in one’s actions—by framing technological challenges as evidence of inadequacy rather than opportunities for growth. Resource appraisals connect to both autonomy (perceiving choices and control over one’s learning) and relatedness (recognizing available social support). When anxious teachers perceive limited resources, they may experience simultaneous threats to multiple basic needs, compounding the negative impact on professional learning engagement [53]. This theoretical integration suggests that interventions should address not only cognitive reframing but also the fundamental psychological needs that appraisals reflect.
Moreover, our results contribute to digital literacy scholarship by identifying affective-cognitive mechanisms often overlooked in competency development models. Existing frameworks emphasize training access, organizational support, and technical infrastructure while inadequately addressing how teachers’ psychological interpretations of these factors shape their capacity to benefit from supportive conditions. The cognitive appraisal perspective fills this gap by specifying how subjective evaluations mediate the relationship between environmental demands and developmental outcomes.
Practical implications
Our findings point toward multilevel intervention strategies with specific implementation recommendations. At the individual level, professional development should incorporate explicit cognitive reappraisal training through structured workshops. We recommend designing “cognitive reappraisal workshops” where teachers practice identifying automatic threat interpretations of technological challenges and systematically reframing them as learning opportunities. For instance, teachers might learn to transform thoughts like “This new platform will expose my incompetence” into “Learning this platform is a normal professional growth process that takes time.” Such workshops could employ guided reflection exercises, peer discussion of common technology fears, and practice scenarios with feedback [54]. Building technology self-efficacy through graduated mastery experiences—beginning with simple tools and progressively advancing to complex applications—can further enhance resource appraisals.
Resource appraisal enhancement requires both perceptual shifts and concrete support provision. We propose establishing “digital mentoring systems” pairing less confident teachers with technology-proficient colleagues who provide ongoing technical support and emotional validation. Unlike one-time training sessions, mentoring relationships offer sustained assistance that anxious teachers can access when challenges arise, directly addressing perceived resource deficits. Schools might also create “technology help desks” staffed during planning periods to provide immediate troubleshooting support, reducing the perceived costs of seeking assistance.
Organizationally, school leaders should cultivate a “culture of technological trial-and-error” that normalizes implementation difficulties and celebrates learning from mistakes. This involves reframing technology integration from high-stakes performance to low-stakes experimentation. Concrete practices include sharing stories of initial technology failures that eventually succeeded, establishing “sandbox” periods where teachers can experiment without evaluation consequences, and publicly acknowledging that even technology-proficient colleagues encountered learning curves. Such cultural shifts directly target the threat appraisals our findings identified as mediating anxiety’s effects. School leaders should frame technology initiatives as growth opportunities rather than accountability mechanisms, creating psychologically safe environments where teachers can acknowledge struggles without judgment.
At the policy level, educational authorities should recognize that teacher digital literacy development involves psychological dimensions often overlooked in infrastructure-focused initiatives. Professional development funding should support not only technical training but also cognitive-emotional intervention programs. Teacher evaluation systems should avoid penalizing early-stage technology integration difficulties, which can amplify threat appraisals and create the very avoidance behaviors that impede skill development. Policies encouraging collaborative professional learning communities and protecting time for technology experimentation would create systemic conditions supporting the appraisal shifts our findings suggest are necessary for literacy growth.
Limitations and future directions
Several limitations temper our conclusions and suggest directions for future research. First, while our sample captured demographic diversity across school types and regions, all participants came from China, potentially limiting generalizability to educational systems with different technological infrastructures or cultural orientations toward professional learning [55]. Cross-cultural replication studies would clarify whether the appraisal mechanisms we identified operate similarly across diverse educational contexts.
Second, and critically, our digital literacy measure relied on self-reported competence ratings across four domains. This approach captures perceived competence or self-efficacy rather than objectively assessed literacy performance. Teachers may overestimate or underestimate their actual capabilities, and perceived competence does not always correspond to demonstrated skill [17]. Consequently, our findings regarding “literacy development” more precisely reflect changes in teachers’ confidence and self-assessed abilities rather than verified skill acquisition. Future research should incorporate performance-based assessments—such as technology integration tasks scored by trained observers or digital artifact evaluation—to examine whether anxiety similarly constrains objective competency growth.
Third, our six-month measurement intervals, though theoretically justified for detecting developmental changes, may miss shorter-term fluctuations in anxiety and appraisal patterns. Daily diary designs or experience-sampling methods could reveal how within-person variations in affective states and interpretations relate to learning engagement at more granular timescales. Such approaches would complement our between-person findings with within-person process insights.
Fourth, our use of the traditional cross-lagged panel model (CLPM) rather than the random-intercept cross-lagged panel model (RI-CLPM) means we cannot definitively separate between-person differences from within-person processes [38]. While our theoretical focus on between-person phenomena and the convergence difficulties encountered with RI-CLPM justified this analytic choice, future investigations using designs optimized for within-person analysis would extend our findings and clarify whether the mechanisms operate similarly at individual and population levels.
Future research should also examine boundary conditions and moderating factors. Individual differences in growth mindset, technology self-efficacy, or tolerance for ambiguity might determine which teachers develop anxiety in response to technological demands and how strongly anxiety constrains their learning. Similarly, organizational culture and leadership practices likely moderate whether anxious teachers can access supportive resources that buffer appraisal effects. Mixed-methods approaches combining quantitative modeling with qualitative investigation of teachers’ lived experiences would enrich understanding of appraisal processes and identify intervention leverage points not apparent in variable-centered analyses alone.
Conclusions
This study examined how teacher digital technology anxiety influences digital literacy development through cognitive appraisal mechanisms using a three-wave longitudinal design. Our findings demonstrate that anxiety prospectively predicts literacy development with clear temporal precedence, and that cognitive appraisal processes—threat appraisals and resource appraisals—mediate approximately 65% of this relationship.
Theoretical contributions
This investigation makes three primary theoretical contributions to educational technology and teacher professional development scholarship. First, we established temporal precedence in the anxiety-literacy relationship, clarifying that anxiety functions primarily as a driver rather than consequence of literacy development. This finding resolves directional ambiguity that persisted in cross-sectional research and suggests that intervention efforts should prioritize anxiety reduction as a pathway to competency growth.
Second, we extended cognitive appraisal theory into teacher digital literacy development contexts, demonstrating that interpretive processes originally studied in stress and health domains provide valuable explanatory power for understanding professional competency trajectories. The substantial mediation effect (65% of total effect) underscores the importance of subjective evaluation processes in shaping developmental outcomes.
Third, by specifying threat appraisal and resource appraisal as parallel mediators with equivalent strength, and integrating these findings with Self-Determination Theory’s basic psychological needs framework, we provide a more comprehensive theoretical account of the affective-cognitive mechanisms underlying technology-related professional development.
Practical recommendations
Based on our findings, we offer the following recommendations organized across three levels:
Individual Level:
1. Implement cognitive reappraisal workshops helping teachers identify and restructure threat-oriented interpretations of technological challenges
2. Design graduated mastery experiences building technology self-efficacy progressively
3. Provide training in adaptive coping strategies emphasizing problem-focused approaches
Institutional Level:
1. Establish digital mentoring systems pairing anxious teachers with technology-proficient colleagues
2. Create technology help desks offering immediate troubleshooting support
3. Cultivate organizational cultures normalizing technological trial-and-error
4. Frame technology initiatives as growth opportunities rather than accountability mechanisms
5. Establish experimentation periods where implementation difficulties are expected and valued
Policy Level:
1. Allocate professional development funding for cognitive-emotional interventions alongside technical training
2. Design teacher evaluation systems that avoid penalizing early-stage technology integration difficulties
3. Support collaborative professional learning communities focused on technology integration
4. Protect dedicated time for technology experimentation within teachers' schedules
Ultimately, helping teachers navigate the psychological challenges of digital transformation requires both rigorous empirical evidence regarding mechanisms and nuanced appreciation for the complex human experiences underlying statistical patterns. Our findings suggest that addressing cognitive appraisals—how teachers interpret technological demands and evaluate their coping resources—may prove as important as providing technical training or infrastructure support.
Acknowledgements
The authors thank all participating teachers and school administrators for their time and cooperation in this longitudinal study. We also acknowledge the educational bureaus in the three participating provinces for facilitating access to schools.
Abbreviations
- CFI
Comparative Fit Index
- TLI
Tucker-Lewis Index
- RMSEA
Root Mean Square Error of Approximation
- SRMR
Standardized Root Mean Square Residual
- CLPM
Cross-Lagged Panel Model
- SEM
Structural Equation Modeling
- T1
Time 1 (first measurement wave)
- T2
Time 2 (second measurement wave)
- T3
Time 3 (third measurement wave)
- CFA
Confirmatory Factor Analysis
- CI
Confidence Interval
Authors’ contributions
RL conceptualized the research framework, designed the study methodology, coordinated data collection across all three waves, conducted statistical analyses, interpreted the results, and drafted the original manuscript. XL contributed to the theoretical framework development, provided guidance on research design and analytical strategies, supervised the overall project, and critically revised the manuscript for important intellectual content. Both authors reviewed and approved the final manuscript for submission.
Funding
Not Applicable.
Data availability
The datasets generated and analyzed during this study are provided in Supplementary File 1, which contains anonymized survey responses, computed scale scores, and analysis syntax. Additional raw data supporting the findings are available from the corresponding author upon reasonable request, subject to appropriate data use agreements and ethical approval for secondary data analysis.
Declarations
Ethics approval and consent to participate
This study was approved by the Research Ethics Committee of Qufu Normal University (Reference Number: IRB-2023-EDU-078). All participants provided written informed consent prior to enrollment. The study was conducted in accordance with the Declaration of Helsinki and relevant national regulations governing educational research in the People’s Republic of China. Participants were informed of their right to withdraw from the study at any time without penalty, and all data were collected and stored in compliance with data protection regulations.
Consent for publication
All authors have reviewed the manuscript and consent to its publication. No identifiable information regarding participants has been included in this manuscript. All data have been reported in aggregate form to protect participant confidentiality. Since no identifying images or personal details of participants are presented that could compromise anonymity, individual consent for publication from participants is not applicable.
Competing interests
The authors declare no competing interests.
Footnotes
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- 1.Fernández-Batanero JM, Román-Graván P, Reyes-Rebollo MM, Montenegro-Rueda M. Impact of educational technology on teacher stress and anxiety: A literature review. Int J Environ Res Public Health. 2021;18(2):548. 10.3390/ijerph18020548. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 2.Dong Y, Xu C, Chai CS, Zhai X. Exploring the structural relationship among teachers’ technostress, technological pedagogical content knowledge (TPACK), computer self-efficacy and school support. Asia-Pacific Educ Researcher. 2020;29(2):147–57. 10.1007/s40299-019-00461-5. [Google Scholar]
- 3.Falloon G. From digital literacy to digital competence: The teacher digital competency (TDC) framework. Education Tech Research Dev. 2020;68(5):2449–72. 10.1007/s11423-020-09767-4. [Google Scholar]
- 4.Bayzan Ş. Are digital teachers anxious? An investigation of the relationship between teachers’ digital citizenship behaviors and online privacy concerns. Educ Inform Technol. 2025;30(5):6809–37. 10.1007/s10639-024-13133-9. [Google Scholar]
- 5.Lazarus RS, Folkman S. Stress, appraisal, and coping. Springer Publishing Company; 1984.
- 6.La Torre G, Esposito A, Sciarra I, Chiappetta M. Definition, symptoms and risk of techno-stress: A systematic review. Int Arch Occup Environ Health. 2019;92(1):13–35. 10.1007/s00420-018-1352-1. [DOI] [PubMed] [Google Scholar]
- 7.Barendsen E, Timmers M. Factors influencing teacher’s technostress experienced in using emerging technology: A qualitative study. Technol Knowl Learn. 2022;29(2):389–419. 10.1007/s10758-022-09607-9. [Google Scholar]
- 8.Kim LE, Oxley L, Asbury K. My brain feels like a browser with 100 tabs open: A longitudinal study of teachers’ mental health and well-being during the COVID-19 pandemic. Br J Educ Psychol. 2022;92(1):299–318. 10.1111/bjep.12450. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 9.Zhang W, Wang Y, Yang L, Wang C. AI anxiety among pre-service teachers: A cross-cultural study. Computers Education: Artif Intell. 2023;4:100135. 10.1016/j.caeai.2023.100135. [Google Scholar]
- 10.Herman KC, Hickmon-Rosa J, Reinke WM. Empirically derived profiles of teacher stress, burnout, self-efficacy, and coping and associated student outcomes. J Posit Behav Interventions. 2020;20(2):90–100. 10.1177/1098300717732066. [Google Scholar]
- 11.Sharma S, Gupta B. Investigating the role of technostress, cognitive appraisal and coping strategies on students’ learning performance in higher education: A multidimensional transactional theory of stress approach. Inform Technol People. 2023;36(3):1242–68. 10.1108/ITP-06-2021-0505. [Google Scholar]
- 12.Chandwani R, Kumar A, Grewal D. Coping with online teaching during COVID-19: The role of self-efficacy and teacher support. Educ Inform Technol. 2021;27(3):3491–516. 10.1007/s10639-021-10769-3. [Google Scholar]
- 13.Aldosemani T. Inservice teachers’ perceptions of a professional development plan based on SAMR model: A case study. Turkish Online J Educational Technol. 2020;19(1):46–53. [Google Scholar]
- 14.Ferrari A, Punie Y, Brečko BN. DIGCOMP: A framework for developing and understanding digital competence in Europe. Publications Office of the European Union; 2013. 10.2791/52816.
- 15.Koehler MJ, Mishra P. What is technological pedagogical content knowledge (TPACK)? Contemp Issues Technol Teacher Educ. 2009;9(1):60–70. [Google Scholar]
- 16.Mishra P, Koehler MJ. Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers Coll Record. 2006;108(6):1017–54. 10.1111/j.1467-9620.2006.00684.x. [Google Scholar]
- 17.Redecker C, Punie Y. European framework for the digital competence of educators: DigCompEdu (No. JRC107466). Joint Research Centre; 2017. 10.2760/159770. [Google Scholar]
- 18.Tondeur J, van Braak J, Sang G, Voogt J, Fisser P, Ottenbreit-Leftwich A. Preparing pre-service teachers to integrate technology in education: A synthesis of qualitative evidence. Comput Educ. 2017;59(1):134–44. 10.1016/j.compedu.2011.10.009. [Google Scholar]
- 19.Trust T, Krutka DG, Carpenter JP. Together we are better: Professional learning networks for teachers. Comput Educ. 2021;102:15–34. 10.1016/j.compedu.2016.06.007. [Google Scholar]
- 20.OECD. OECD skills outlook 2019: Thriving in a digital world. OECD Publishing. 2019. 10.1787/df80bc12-en. [Google Scholar]
- 21.Guillen-Gamez FD, Mayorga-Fernández MJ. Identification of variables that predict teachers’ attitudes toward ICT in higher education for teaching and research: A study with regression. Sustainability. 2020;12(4):1312. 10.3390/su12041312. [Google Scholar]
- 22.Spătaru AA, Tudorică RA, Maricuțoiu LP. A longitudinal examination of appraisal, coping, stress, and mental health in students: A cross-lagged panel network analysis. Stress Health. 2024;40(4):e3450. 10.1002/smi.3450. [DOI] [PubMed] [Google Scholar]
- 23.Venkatesh V, Morris MG, Davis GB, Davis FD. User acceptance of information technology: Toward a unified view. MIS Q. 2003;27(3):425–78. 10.2307/30036540. [Google Scholar]
- 24.Simães C, Couto A, Morais C, Gomes AR, Fontes L. Stress and cognitive appraisal in university students: Explaining burnout over time. Ansiedad y Estrés. 2024;30(2):63–72. 10.5093/anyes2024a9. [Google Scholar]
- 25.Kyriacou C. Teacher stress: From prevalence to resilience. In: Langan-Fox J, Cooper CL, editors. Handbook of stress in the occupations. Edward Elgar Publishing; 2011. pp. 161–73. 10.4337/9780857931153.00015. [Google Scholar]
- 26.Boswell WR, Olson-Buchanan JB, LePine MA. Relations between stress and work outcomes: The role of felt challenge, job control, and psychological strain. J Vocat Behav. 2004;64(1):165–81. 10.1016/S0001-8791(03)00049-6. [Google Scholar]
- 27.Cole DA, Maxwell SE. Testing mediational models with longitudinal data: Questions and tips in the use of structural equation modeling. J Abnorm Psychol. 2003;112(4):558–77. 10.1037/0021-843X.112.4.558. [DOI] [PubMed] [Google Scholar]
- 28.Tipton E. Stratified sampling using cluster analysis: A sample selection strategy for improved generalizations from experiments. Eval Rev. 2013;37(2):109–39. 10.1177/0193841X13516324. [DOI] [PubMed] [Google Scholar]
- 29.Xie Y, Zhang M. Synergy of higher education resources and digital infrastructure construction in China: Regional differences, dynamic evolution and trend forecasting. PLoS ONE. 2024;19(6):e0304613. 10.1371/journal.pone.0304613. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 30.Maslach C, Schaufeli WB, Leiter MP. Job burnout. Ann Rev Psychol. 2001;52:397–422. 10.1146/annurev.psych.52.1.397. [DOI] [PubMed] [Google Scholar]
- 31.Little RJA. A test of missing completely at random for multivariate data with missing values. J Am Stat Assoc. 1988;83(404):1198–202. 10.1080/01621459.1988.10478722. [Google Scholar]
- 32.Heinssen RK, Glass CR, Knight LA. Assessing computer anxiety: Development and validation of the Computer Anxiety Rating Scale. Comput Hum Behav. 1987;3(1):49–59. 10.1016/0747-5632(87)90010-0. [Google Scholar]
- 33.Ng W. Can we teach digital natives digital literacy? Comput Educ. 2012;59(3):1065–78. 10.1016/j.compedu.2012.04.016. [Google Scholar]
- 34.Gross JJ. Emotion regulation: Current status and future prospects. Psychol Inq. 2015;26(1):1–26. 10.1080/1047840X.2014.940781. [Google Scholar]
- 35.Hu LT, Bentler PM. Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Struct Equation Modeling: Multidisciplinary J. 1999;6(1):1–55. 10.1080/10705519909540118. [Google Scholar]
- 36.Podsakoff PM, MacKenzie SB, Lee JY, Podsakoff NP. Common method biases in behavioral research: A critical review of the literature and recommended remedies. J Appl Psychol. 2003;88(5):879–903. 10.1037/0021-9010.88.5.879. [DOI] [PubMed] [Google Scholar]
- 37.Kline RB. Principles and practice of structural equation modeling. 5th ed. Guilford Press; 2023.
- 38.Hamaker EL, Kuiper RM, Grasman RP. A critique of the cross-lagged panel model. Psychol Methods. 2015;20(1):102–16. 10.1037/a0038889. [DOI] [PubMed] [Google Scholar]
- 39.Huang FL, Li X. Using cluster-robust standard errors when analyzing group-randomized trials with few clusters. Behav Res Methods. 2022;54(3):1181–99. 10.3758/s13428-021-01627-0. [DOI] [PubMed] [Google Scholar]
- 40.Hayes AF. Introduction to mediation, moderation, and conditional process analysis: A regression-based approach. 3rd ed. Guilford Press; 2022.
- 41.George D, Mallery P. IBM SPSS statistics 26 step by step: A simple guide and reference. 16th ed. Routledge; 2020. 10.4324/9780429056765. [Google Scholar]
- 42.Cohen J. Statistical power analysis for the behavioral sciences. 2nd ed. Lawrence Erlbaum Associates; 1988.
- 43.Asparouhov T, Muthén B. A single-level random-effects cross-lagged panel model for longitudinal mediation analysis. Behav Res Methods. 2018;50(6):2118–38. 10.3758/s13428-017-0979-2. [DOI] [PubMed] [Google Scholar]
- 44.Schreiber JB, Nora A, Stage FK, Barlow EA, King J. Reporting structural equation modeling and confirmatory factor analysis results: A review. J Educational Res. 2006;99(6):323–38. 10.3200/JOER.99.6.323-338. [Google Scholar]
- 45.MacKinnon DP, Lockwood CM, Williams J. Confidence limits for the indirect effect: Distribution of the product and resampling methods. Multivar Behav Res. 2004;39(1):99–128. 10.1207/s15327906mbr3901_4. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 46.Ferguson CJ. An effect size primer: A guide for clinicians and researchers. Prof Psychology: Res Pract. 2009;40(5):532–8. 10.1037/a0015808. [Google Scholar]
- 47.Selig JP, Preacher KJ. Mediation models for longitudinal data in developmental research. Res Hum Dev. 2009;6(2–3):144–64. 10.1080/15427600902911247. [Google Scholar]
- 48.Preacher KJ, Hayes AF. Asymptotic and resampling strategies for assessing and comparing indirect effects in multiple mediator models. Behav Res Methods. 2008;40(3):879–91. 10.3758/BRM.40.3.879. [DOI] [PubMed] [Google Scholar]
- 49.Baron RM, Kenny DA. The moderator-mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations. J Personal Soc Psychol. 1986;51(6):1173–82. 10.1037/0022-3514.51.6.1173. [DOI] [PubMed] [Google Scholar]
- 50.Cheng G, Chau J. Exploring the relationships between learning styles, online participation, learning achievement and course satisfaction: An empirical study of a blended learning course. Br J Edu Technol. 2021;47(2):257–78. 10.1111/bjet.12243. [Google Scholar]
- 51.Bandura A. Social cognitive theory: An agentic perspective. Ann Rev Psychol. 2001;52:1–26. 10.1146/annurev.psych.52.1.1. [DOI] [PubMed] [Google Scholar]
- 52.Ryan RM, Deci EL. Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. Am Psychol. 2000;55(1):68–78. 10.1037/0003-066X.55.1.68. [DOI] [PubMed] [Google Scholar]
- 53.Deci EL, Ryan RM. The what and why of goal pursuits: Human needs and the self-determination of behavior. Psychol Inq. 2000;11(4):227–68. 10.1080/1047840X.2000.9674913. [Google Scholar]
- 54.Admiraal W, Lockhorst D, Smit B, Weijers S. The integrative model of behavior prediction to explain technology use in post-graduate teacher education programs in the Netherlands. Int J High Educ. 2013;2(4):172–8. 10.5430/ijhe.v2n4p172. [Google Scholar]
- 55.Zhao Y, Frank KA. Factors affecting technology uses in schools: An ecological perspective. Am Educ Res J. 2003;40(4):807–40. 10.3102/00028312040004807. [Google Scholar]
- 56.Skaalvik EM, Skaalvik S. Still motivated to teach? A study of school context variables, stress and job satisfaction among teachers in senior high school. Soc Psychol Educ. 2017;20(1):15–37. 10.1007/s11218-016-9363-9. [Google Scholar]