Abstract
Self-efficacy is relevant in explaining performance and well-being across different domains of human behaviour. Despite its relevance, no instruments assess self-efficacy in university teaching in virtual environments. Therefore, the objective of this study was to design the Self-Efficacy Scale for University Teaching in Virtual Environments (SSUTVE) and analyse its psychometric properties. Three studies were conducted to this end. First, based on grounded theory, 31 university professors were interviewed in depth, and the 10 categories that emerged were grouped into two dimensions of the construct. In the second study, 10 expert judges (university professors) evaluated the clarity, relevance, and pertinence of the items developed; in addition, 10 members of the target population assessed the clarity of the items. Subsequently, 33 items were accepted, and the degree of agreement was acceptable (the lower limit of the confidence interval of Aiken's V was above the expected value). The third study analysed the internal structure. A total of 554 Peruvian university professors participated, and the scale showed adequate fit indices for a structure of nine correlated factors: basic technological skills, safety in virtual classes, ethical-legal aspects, and guidance and/or advice in the use of technological resources (related to self-efficacy in digital competences), and planning, didactics, group management, mastery of the subject, and evaluation and feedback (related to self-efficacy in pedagogical competences). Additionally, the reliability of the scores and constructs was acceptable. It was concluded that the SSUTVE presents psychometric evidence of validity and reliability for Peruvian university professors working in virtual environments.
Keywords: Validity, Reliability, Psychometrics, Teaching, Higher education
1. Introduction
The use of virtual tools is one of the resources that support the teaching-learning process [1,2]; in the post-pandemic period, it has been implemented to ensure continuity in higher education [3,4] and has become widespread through the hybrid modality [5]. University students even show a more accepting attitude towards online classes after the COVID-19 pandemic than during the health emergency [6]. For these reasons, teachers need to develop the required skills and hold self-efficacy beliefs. Self-efficacy drives not only the choice of goals and commitments but also the effort and persistence invested in achieving them [7,8].
Self-efficacy refers to the belief that a person has in their ability to perform the necessary behavioural responses to succeed in a task [7,9]. Its importance lies in its influence on the capacity for adaptation and change in human beings [10,11], which is why it has been applied to different areas of human development [12]. It favours well-being [13] and is a predictor of action [12]. In the field of education, teachers' self-efficacy is their belief in their ability to provide academic instruction and build positive learning environments [10,14], including the virtual ones.
Teachers with self-efficacy in the use of technological tools have a better handling of them and a greater willingness to use them [1], even in situations of pressure, such as the accelerated learning and overload to which teachers were exposed during the health emergency of the COVID-19 pandemic [15,16]. Therefore, teaching self-efficacy in virtual environments is a very important resource for higher education teachers, especially nowadays, where the demand for virtual education has increased considerably [17]. However, there is a lack of instruments for its evaluation, because teaching self-efficacy scales have focused on special education [7,[18], [19], [20]], inclusive practices [[21], [22], [23], [24]], preschool children [25], mathematics [26], regular basic education [[27], [28], [29], [30], [31]], higher technical education [32], collective university teaching [33], physician training [34], science teaching [35,36], and self-efficacy for research [37].
The measure closest to self-efficacy in virtual teaching is an instrument that assesses self-efficacy in technology integration using five items [38]; however, it does not incorporate elements of teaching. Another measure assesses university teachers' attitudes towards the ethical use of information technologies in higher education [39]. A third instrument measures pre-service teachers' competences in the use of information and communication technology, including the dimensions of competences to support students in applying Information and Communication Technologies (ICTs) in class and competences for its instructional design [40].
Teaching self-efficacy scales have been designed for face-to-face work. However, in the post-pandemic era, higher education is delivered in a hybrid manner. Therefore, given the existing gaps and the need for a measurement instrument that meets current needs, the objective of this study is to document the process of construction and psychometric analysis of the Self-Efficacy Scale for University Teaching in Virtual Environments (SSUTVE) through three studies. Using a qualitative approach, the first study aims to unveil the competences of university teaching in virtual environments according to the experiences of Peruvian university teachers. The second study aims to obtain evidence of the content validity of the SSUTVE items. Finally, the third study aims to obtain evidence of validity based on the internal structure of the SSUTVE, as well as its reliability.
1.1. Review of literature
Social cognitive theory, on which self-efficacy is based, explains how a person constructs their beliefs based on reciprocal determinism between behaviour and the environment [41]. One of its central concepts is human agency, which is based on intentionality and anticipation. This means that people have the capacity to self-regulate based on their power to set purposes, plan their behaviours, and imagine their consequences [42].
Within this theory, self-efficacy is the main construct [10,41], having a reciprocal influence on each of the following elements: outcome expectations, goals and aspirations, affective reactions, and recognition of limitations and opportunities [11]. Self-efficacy develops through experiences of success, the actions of other people, verbal persuasion, and emotional states [43]. Moreover, these beliefs vary in generality, strength, and level; therefore, the importance assigned to them differs according to the scope of application and cognitive schemas [7].
Although the construct of general self-efficacy has been investigated [[44], [45], [46]], Bandura [7] held that self-efficacy beliefs should be understood in terms of the specific domains of behaviour in which they are applied. One of these domains is teacher self-efficacy, which is an important predictor of performance [[47], [48], [49]] and work engagement [50]. It is negatively related to students' inappropriate behaviour [51,52] and positively related to student enjoyment [53], learning engagement, and participation [54]. However, it is not related to students' academic outcomes [53,55]. The latter results, although controversial, are to be expected considering that academic performance is influenced by various factors [e.g. Refs. [[56], [57], [58], [59], [60], [61]]]. Therefore, the influence teachers may exert adds to the set of explanatory factors.
Self-efficacy beliefs in teaching encompass not only the achievement of academic goals for the attainment of expected competences in professional training but also the achievement of autonomy, autonomous motivation, and academic adjustment among students [62]. However, demands change in a virtual environment. This requires the management of virtual tools and a different pedagogy that directs students’ attention and motivation to learn [63]. Therefore, teacher self-efficacy in virtual environments requires further investigation. One study collected information on self-efficacy in online teaching during the pandemic. However, a two-item Likert scale was used as a measure, with no reports of validity or reliability and a predominantly qualitative methodology [63].
The construct most closely related to self-efficacy in virtual environments is teachers' information technology integration self-efficacy. A recent meta-analysis found a moderate relationship between teachers' information technology integration self-efficacy and Technological Pedagogical Content Knowledge (TPACK) [64].
Teacher self-efficacy in virtual environments is complemented by Davis' Technology Acceptance Model (TAM) [65] and TPACK. Research has reported that teachers' self-efficacy in using a learning management system [66] and TPACK [67] favours perceived ease of use, attitudes, and intention to use the technology later. The TAM explains technology acceptance for e-learning in students from the perspective of perceived usefulness and intention to use technology [68]. However, during the COVID-19 pandemic, there was low teacher efficacy in using technology, a lack of support for online instruction [69,70], and difficulties in motivating student participation [69].
Although no consensus exists on the definition of teaching self-efficacy [7,14], this study defines teaching self-efficacy in virtual environments as teachers' beliefs about their ability to effectively use digital technologies to plan and implement suitable teaching strategies and achieve online student learning. This refers to self-confidence in the skills and knowledge required to use virtual resources and promote quality online learning experiences.
For the structure of the self-efficacy construct in virtual environments, it is considered relevant to orient the dimensions and items towards the belief or confidence in the expected competences of university teachers. Therefore, study 1, based on grounded theory, was aimed at fulfilling the objective of revealing the competences of university teaching in virtual environments, according to the experience of Peruvian university teachers. Study 2 sought to obtain evidence of content validity through expert judgement to determine the representativeness, relevance and clarity of the items. In addition, the intelligibility of the items was also assessed by a group of participants from the target population. Finally, study 3 aimed to determine the internal structure of the constructed scale and its reliability. This approach will contribute to creating a measure that helps to understand how teachers perceive and evaluate their own abilities to teach in virtual environments, according to professional expectations.
2. Study 1
An in-depth understanding of human behaviour requires the use of different study methods. Among them, grounded theory allows consideration of previous existing theories, from incipient to advanced levels, to achieve greater solidity of the generated theory [71]. Therefore, the experience of teachers in their daily work can provide relevant information to confirm, expand, or correct theoretical contents related to teaching competences.
In the university environment, the development of integrative competences must respond to the demands of the globalised world [72] and be permanently updated to offer quality instruction [73]. This has led to the digitalisation of educational processes, requiring teachers to master digital competence [[74], [75]] as part of their pedagogical skills, according to the profile of a teacher of excellence [76].
Different studies have indicated that teachers' digital competence comprises three types of competences: generic, pedagogical, and transversal. Generic competences encompass the basic skills [77] through which teachers acquire expertise in information and communication technologies, including technology management [76,78], ethical, legal, and security aspects [78], and knowledge of policies regarding ICT use in education [79].
Digital pedagogical competences [80] are related to planning, curricular adaptation [76,81], creating experiences that facilitate learning, and evaluation [81]. Additionally, teacher capacity is involved in the development of students’ digital competences [80].
Transversal competences involve reflecting on teaching practice in both its formative and protective roles [76]. They imply a commitment to personal and professional development [78,80], the ability to address the affective dimension of students and promote their autonomy [81], and the ability to solve problems [79]. This type of competence was not considered in the present study because it overlaps with soft skills. Additionally, a previous study that built a measure of self-efficacy in the professional praxis of psychology practitioners initially included the socioemotional skills of social interaction [82]; however, subsequent confirmatory factor analysis excluded this indicator [83].
Based on the categories obtained in the qualitative phase of the study, the items on perceived self-efficacy for virtual teaching can be oriented. Therefore, the aim of this study is to reveal the competences of university teaching in virtual environments based on the experiences of Peruvian teachers.
2.1. Materials and methods
2.1.1. Participants
Thirty-one university teachers from Trujillo (Peru) were interviewed and there was a refusal rate of 14 % (5). Participants were selected by convenience sampling, with 94 % (27) working in private universities and 6 % (2) in public universities. They were mostly male (n = 19; 61.29 %), with ages ranging from 31 to 60 years (M = 43.97; SD = 8.15). Their experience in university teaching ranged from 1 to 26 years (M = 10.81; SD = 6.11), and in virtual environments, synchronous (75 %) or asynchronous (25 %), from 1 to 12 years (M = 3.24; SD = 2.21). A total of 77.42 % (24) were teaching at the undergraduate level, 9.68 % (3) at the graduate level, and 12.9 % (4) at both levels. The majority were psychologists (74.19 %, 23), followed by educators (12.9 %, 4), and, in the minority, from other professions (1 chemical engineer [3 %], 1 microbiologist [3 %], 1 economist [3 %] and 1 lawyer [3 %]). Participation was voluntary, anonymous and ad honorem.
2.1.2. Instrument
An in-depth interview guide was applied, which contained the following questions: 1) What skills or abilities do you consider a university teacher should have in order to effectively perform his/her work in a face-to-face manner?; 2) What attitudes do you consider a university teacher should have in order to effectively perform his/her work?; 3) What skills or abilities do you consider a university teacher should have in order to effectively perform in virtual environments?; and 4) If you had to choose eight necessary skills that a teacher should have in order to perform in virtual environments, what would they be?
The questions were selected in order to cover the largest number of responses regarding the skills expected of teachers in virtual environments, and then, based on this, to focus the items on the beliefs of confidence or mastery of each of the skills detected.
2.1.3. Procedure
The research project, which included the three studies, was approved by the Ethics Committee of the School of Psychology of the Universidad César Vallejo (DICTAMEN 131-CEI-EP-UCV-2022).
Informed consent was requested from the participating teachers through a Google form. The form outlined the objective of the interview, the estimated duration, and the voluntary and anonymous nature of participation. Each member of the research team interviewed participants individually. Participants were informed that each interview would be recorded using the Zoom platform and that participants should rename themselves with a pseudonym and turn off their cameras.
Each question was projected onto a slide and read aloud by the interviewer. In instances where the interviewee's answers were not sufficiently elaborate, the interviewer asked specific questions that allowed for more in-depth information to be gathered.
After the interviews were completed, each researcher manually transcribed the content. Subsequently, the research team conducted categorisation during a meeting.
2.2. Results
Based on the university teachers’ experiences, the perceived competences for teaching in virtual environments were categorised as follows: mastery of the subject, technological mastery, professional updating, planning, didactics, knowing how to teach, generating attention, group management, communication, empathy, respect and tolerance, and cognitive flexibility (Table 1). The table of specifications was then structured and the categories obtained were grouped into two dimensions: Self-efficacy in Digital Competences and Self-efficacy in Pedagogical Competences.
Table 1.
Categorisation of teaching competences in virtual environments, as perceived by university teachers.
| Categorisation | Sample of speech fragments |
|---|---|
| Mastery of the subject matter | |
| Technological expertise | |
| Professional updating | |
| Planning (design of sessions and learning resources) | |
| Didactics (use of methodology, strategies, design of material) | |
| Knowing how to teach (knowing how to use examples/exposition) | |
| To generate attention, motivation, and commitment in students | |
| Group management (discipline) | |
| Communication | |
| Empathy | |
| Respect and tolerance | |
| Cognitive flexibility | |
Indicators pertaining to Self-efficacy in Transversal Competences were excluded because they encompass soft skills. Items related to Self-efficacy in Research Competences were also excluded as specific tests are already available to measure this construct.
The categories of knowing how to teach and generating attention, motivation, and commitment in students were integrated into the didactics indicator. Competences reported in the literature, such as safety in virtual classes, ethical-legal aspects, and guidance and/or counselling in the use of technological resources, were included within the digital generic competences dimension. Evaluation and feedback were incorporated within the pedagogical competences dimension (Table 2).
Table 2.
Conceptual structure and definition of the indicators of the hypothetical model competences of university teaching self-efficacy in virtual environments.
| Dimension | Indicator | Argument for inclusion | Items |
|---|---|---|---|
| Self-efficacy in digital competences. Confidence in one's own ability to use and guide the use of basic technological resources, gamification, and security in virtual teaching-learning sessions. | Gamification. Confidence in the ability to use virtual recreational resources in a timely and varied manner to encourage students' participation, interaction, and attention. | Category obtained (Table 1), based on the responses of 26 interviewees. | (Stem for all items: As a university teacher in virtual environments, I AM CONFIDENT THAT I CAN …) Use different digital tools (Kahoot, Quizz, Mentimeter, etc.) to maintain participation during the session. Use interactive tools (Genially, Socrative, Mentimeter, etc.) to facilitate students' understanding of a particular topic. Design playful activities to dynamise class sessions. Use interactive tools to strengthen collaborative work among students during learning activities. |
| | Basic technological skills. Confidence in the ability to use platforms and programmes that enable the development of teaching-learning activities. | Category obtained (Table 1), based on the responses of 26 interviewees. | Develop creative presentations using a variety of virtual tools. Design audiovisual resources (PPTs, videos, audio clips, podcasts, etc.) using various programmes and/or virtual tools. Use office automation packages to implement, publish, and share materials (videos, documents, spreadsheets, etc.). Select from the Internet free-access programmes, audiovisuals, databases, etc., for the development of the sessions. |
| | Safety in virtual classes. Confidence in the ability to apply security measures to protect students' data and evaluations, as well as to prevent the intrusion of outsiders (hackers) that could disrupt the development of classes. | Category derived from theoretical information [78]. | Recognise the risks found on the Internet (hackers, spyware, viruses, etc.) that can affect the implementation and development of my classes. Implement actions (waiting rooms, login, password) to protect my class sessions. Execute measures to block intrusions (suspend participants) during the development of the sessions. Configure the evaluations to keep them hidden. Set up exclusive student access to their work material. |
| | Ethical and legal aspects. Confidence in the ability to design inclusive pedagogical resources according to the characteristics of the students, and to promote respect for copyright in the production of academic content. | Category derived from theoretical information [78,79]. | Facilitate accessible conditions for the learning of students with disabilities. Publish digital resources that cite the sources from which information was extracted. Review material produced by students with anti-plagiarism software. Use each student's information for academic purposes only. |
| | Guidance and/or advice in the use of technological resources. Confidence in the ability to inform and train students in the use of technological resources (platforms, programmes, and tools). | Category derived from theoretical information [75]; only six teachers mentioned this category. | Develop tutorials to guide students in the use of digital platforms or tools. Respond to students' doubts regarding the management of digital platforms or tools. Guide students towards the solution of technological problems that arise during the sessions. |
| Self-efficacy in pedagogical competences. Confidence in one's own ability to plan and implement didactic strategies, with mastery of the subject matter, constant updating, and group management, as well as to design evaluation and feedback methods. | Planning. Confidence in the ability to design learning processes (sessions, material, digital tools) according to the capabilities and competences of an educational model. | Category obtained (Table 1), based on the responses of 13 interviewees. | Design didactic sequences that promote the inclusion of students with disabilities within the classroom. Design didactic sequences that mobilise skills according to the competence of the course. Evaluate the syllabus of an assigned course according to the curricular plan and the expected professional competence. Prepare updated material according to the purpose of the sessions and the course competence. |
| | Didactics. Confidence in the ability to implement methodologies that facilitate teaching-learning and promote attention, motivation, commitment, and autonomy in students, based on a purpose. | Category obtained (Table 1), based on the responses of 17 interviewees. | Dynamise the learning space (active pauses, dynamics, narrative of experiences, etc.) to facilitate the teaching-learning process. Apply strategies to gather prior knowledge and generate new learning, according to the purpose of the session and the competence of the course. Use strategies that promote student reflection on the learning process. Provide precise instructions (information search, processes, and expected outcome) aimed at achieving academic products. |
| | Group management. Confidence in the ability to lead students during the development of the sessions in a climate of mutual respect. | Category obtained (Table 1), based on the responses of 10 interviewees. | Handle conflicts that arise between students during sessions. Guide students in meeting the goals of the session. Involve all students in learning activities. Remain calm in the face of tense situations generated by students' behaviour. |
| | Mastery of the subject matter. Confidence in one's own knowledge and experience of the subject matter to be taught. | Category obtained (Table 1), based on the responses of 18 interviewees. | Manage up-to-date information on the subject to be taught. Conduct training in the area (or speciality) related to the subject I teach. Participate as an expert or consultant in the area (or speciality) related to the subject I teach. Carry out research related to the contents of the subject I teach. |
| | Evaluation and feedback. Confidence in the ability to design instruments and collection and assessment processes, as well as to provide timely information, based on a learning purpose. | Category derived from theoretical information [75,81]; only six teachers mentioned this category. | Develop instruments to assess student learning. Use a variety of techniques and/or methods to assess students' expected learning. Systematise student assessment results for appropriate decision-making. Adapt evaluation instruments according to the characteristics of the students and the academic product. Inform students of their progress on assessment results. Guide the student to overcome difficulties encountered in the assessment of their abilities. |
Finally, the structure of the self-efficacy scale of perceived competences for university teaching in virtual environments was formed by Self-efficacy in Digital Competences (gamification, basic technological skills, safety in virtual classes, ethical-legal aspects, guidance and/or advice in the use of technological resources), and Self-efficacy in Pedagogical Competences (planning, didactics, group management, subject mastery, and evaluation and feedback). The definitions of the initial indicators and items are listed in Table 2.
2.3. Discussion
Teaching in virtual environments became a widespread containment measure during the COVID-19 health emergency. Its impact has generated changes in university higher education since the development of online courses continued during the post-pandemic period. This required teachers to acquire digital competences to enhance their self-efficacy, which is a relevant construct due to its relationship with performance [[47], [48], [49]], work engagement [50], student enjoyment [53], and student learning engagement and participation [54]. However, no instruments are available to measure teacher self-efficacy in virtual environments. A scale was constructed for this purpose. In the first phase, using a qualitative approach, the objective was to unveil the competences of university teaching in virtual environments based on the experiences of Peruvian teachers.
The categories that served as the basis for the development of the instrument items were derived from grounded theory. This method is recommended for psychometrics [84] and has been reported in other instrumental studies [[85], [86], [87]] because it conveniently combines the strengths of both qualitative and quantitative approaches to obtain in-depth content validity [88,89].
Using content analysis, the dimensions and indicators of Self-efficacy in Transversal Competences and Self-efficacy in Research Competences were excluded. The first dimension refers to soft skills and was excluded on the basis of a precedent: in the construction of a self-efficacy scale for the pre-professional practices of psychology students, socioemotional items were excluded during factor analysis because the scale was designed according to the competences expected of psychologists [83]. Moreover, specific measures and theories exist for each soft skill (e.g. cognitive flexibility [90,91], empathy [92,93], tolerance [94], and teacher communication [95]). Therefore, it would be advisable to use independent scales to measure each of these skills.
The dimension of Self-efficacy in Research Competences was excluded from the scale structure because specific tests are already available to measure that construct [37,96]. Furthermore, a study of the structure of a self-efficacy scale for the professional praxis of psychology practitioners [83] found no general latent factor integrating self-confidence in research ability with self-confidence in the different performance dimensions, reflecting the gap between professional praxis and basic science. This justified the use of independent scales for each construct.
The dimensions identified from the interviews included gamification, basic technological skills, safety in virtual classes, ethical-legal aspects, guidance and/or advice on the use of technological resources, all of which are included within digital competences. On the other hand, planning, didactics, group management, mastery of the subject, and evaluation and feedback were considered within pedagogical competence.
The dimensions extracted not only stem from the experiences of university teachers but are also supported by several studies. The massive incorporation of technology into educational processes in recent times demands changes in teaching methodology [97] that are innovative, maintain motivation [98], and promote respect, discipline, and self-determination [99], especially in the virtual environment.
Within digital competences, the dimensions of basic technological skills [100] and the skills to handle information and communication technologies [76,78] were considered. Gamification was separated as a dimension because it is a playful learning methodology requiring additional technological skills [101]. Additionally, the digitisation of daily life implies greater student participation in active, dynamic, and innovative technological learning strategies [97,102]. This is a challenge in virtual education because creativity is required to maintain students' attention and motivation [103]. During the COVID-19 pandemic, difficulty in motivating students to actively participate in the sessions was reported [69].
In the process of structuring self-efficacy towards digital competences, dimensions linked to safety and ethical-legal aspects were also included [78]. The use of virtual environments involves the exposure of personal information (name, ID, and phone number) and interactions with strangers, requiring the mastery of skills to prevent risks and protect users [104]. Additionally, virtual education provides students with more freedom [99], thus teachers must promote ethical behaviour, prevent plagiarism and fraud, and maintain control over the content [105]. Finally, within digital competence, the dimension of guidance and/or advice on the use of technological resources was incorporated. This dimension is associated with teachers’ ability to facilitate learning through technological skills in their students [70].
Pedagogical competence refers to the teacher's mastery of the interweaving of theory and practice, giving meaning to what they teach and creating conditions for learning in an equitable and reflective sense [106]. This is coherent with an educational model that determines the graduate profile of competent professionals [76]. Five competence dimensions were identified. The first is planning, which sets the course of the formative task, involves designing the teaching-learning process, and is essential for achieving the expected results [107,108]. The second dimension is didactics, encompassing the ability to select, adapt, or create specific, pertinent, and innovative strategies to optimise teachers' teaching and students' learning [[108], [109], [110]].
The third dimension, group management, is an important element in maintaining an adequate classroom climate according to the teachers interviewed. Compliance with orientations, prevention of conflicts, and establishing clear expectations reduce distractions and help focus on learning purposes [108,111]. The fourth dimension is mastery of the subject, which entails possessing disciplinary expertise, and staying updated and trained in coherence with the subject being taught [112], which allows the teacher to impart solid and quality knowledge. Finally, the fifth dimension of pedagogical competence is evaluation and feedback, which involves designing processes to assess learning [76], employing effective teaching methods or strategies [113], making relevant decisions, and providing educational quality.
Identifying emerging categories of teaching self-efficacy in virtual environments is relevant because it serves as a basis for instrumental studies by deepening and reaffirming theories. In addition, considering the perspective of the target population overcomes researchers’ biases in the process of constructing the instrument. This categorisation reveals elements that could inform teacher training programmes according to the competences expected of teachers in the digital era.
3. Study 2
After constructing the preliminary version of the instrument, based on the emerging dimensions and theory, it was necessary to obtain scientific evidence of the correspondence between the content of the scale and the construct it is intended to measure: teaching self-efficacy in virtual environments. For this purpose, it was considered appropriate to obtain evidence of content validity through expert judgement, to determine the representativeness of the items and the relevance of each part of the test in relation to the construct [114], as well as the diversity, clarity, and comprehensibility of the items [115]. In addition, scientific evidence indicates the importance of having the clarity of a test assessed by the people to whom the instrument will be addressed [116]. Therefore, the objective was to obtain evidence of the content validity of the SSUTVE, drawing not only on the perspective of the judges but also on that of the test's target population in the Peruvian context.
3.1. Materials and methods
3.1.1. Participants
Ten judges, selected by non-probabilistic convenience sampling, evaluated the content of the instrument; they were contacted by the researchers via e-mail. The inclusion criterion was experience in virtual teaching prior to the COVID-19 pandemic. Most were psychologists with doctoral degrees (70 %), and the panel was evenly split between men and women (50 % each). Fifty percent were university teachers at the undergraduate level, 40 % at both undergraduate and graduate levels, and 10 % at the postgraduate level. Their minimum university teaching experience was five years (M = 12.4, SD = 6.54), with at least three years of experience in virtual environments (M = 5.8, SD = 1.81), both synchronous and asynchronous. Most worked in private universities (80 %, 8). The judges evaluated the items according to the criteria of coherence, relevance, and clarity, on a scale from 1 (not at all coherent/relevant/clear) to 5 (fully coherent/relevant/clear). In addition, for each dimension, the judges answered whether the number of items was sufficient to represent it and could suggest the inclusion of any missing items or content. Participation was anonymous, voluntary, and without compensation of any kind.
In addition, 10 university teachers, drawn from the target population of the study, rated the clarity of the items, because the assessment of examinees does not necessarily coincide with that of judges [116]. These participants were also selected by non-probabilistic convenience sampling, and their participation was voluntary, anonymous, and uncompensated. They were between 35 and 61 years old (M = 47.1; SD = 10.24), and 70 % were women. The participants had been teaching in virtual environments, in the synchronous mode, since the pandemic; however, they had previous experience in university teaching.
3.1.2. Instrument
The guidelines established by Bandura [7] for constructing self-efficacy scales were followed. In addition, the emerging categories identified after qualitative analysis were considered, and the instrument was designed considering Bandura's social cognitive theory [9,42].
A scale was designed in a template that allows evaluation based on coherence, clarity, and sufficiency. The instructions are detailed below.
"On behalf of our research team, I cordially welcome you. In the context of the pandemic, teachers were faced with the need to develop the competences to carry out their work in virtual environments. Therefore, considering that self-efficacy is an important predictor of performance, we are constructing an instrument that measures self-efficacy for virtual teaching in university teachers.
Therefore, we ask you to please evaluate the items in terms of their consistency with the dimension and their clarity, highlighting the rating you consider most appropriate. You will find each dimension in a separate tab of this same [Excel] file. Likewise, if you consider it necessary, you may note any comments in this same format, as well as any potential items you would kindly suggest to us."
For each item, the judges checked a box between 1 (Not at all coherent/not at all clear) and 5 (Totally coherent/totally clear). The participating teachers had only the option to rate items according to the criterion of clarity. In addition, the judges could comment on the instructions and response alternatives of the scale.
3.1.3. Procedure
As mentioned above, the project obtained the approval of the Ethics Committee of the School of Psychology of the Universidad César Vallejo (DICTAMEN 131-CEI-EP-UCV-2022).
An e-mail containing the scoring template of the instrument was sent to the judges. Prior to this, they were informed about the research objectives, the estimated time required to complete the scale, and the anonymous, confidential, and voluntary nature of their participation. The researchers' e-mail addresses were also provided, enabling participants to seek clarification if needed.
Subsequently, university teachers (the potential study population) were requested to qualitatively and quantitatively evaluate the items based on their clarity.
The judges’ comments were considered, and redundant items were eliminated to ensure the diversity of item content.
3.1.4. Data analysis
Aiken's V [117] and confidence intervals [118] were applied to process the judges' ratings. Items with a lower limit of the confidence interval >.70 were accepted [119].
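The decision rule above can be sketched in a few lines of code. The following is a minimal Python illustration, not the authors' actual computation: Aiken's V for a set of judges' ratings, with the score confidence interval proposed by Penfield and Giacobbi (the function names are ours).

```python
import math

def aiken_v(ratings, lo=1, hi=5):
    """Aiken's V for one item: mean rating rescaled to [0, 1]."""
    k = hi - lo                       # width of the rating scale
    s = sum(r - lo for r in ratings)  # summed deviations from the scale floor
    return s / (len(ratings) * k)

def aiken_ci(ratings, lo=1, hi=5, z=1.96):
    """Score confidence interval for Aiken's V (Penfield & Giacobbi, 2004).

    Returns (V, lower, upper); an item is accepted when lower > .70.
    """
    n, k = len(ratings), hi - lo
    v = aiken_v(ratings, lo, hi)
    nk = n * k
    half = z * math.sqrt(4 * nk * v * (1 - v) + z ** 2)
    lower = (2 * nk * v + z ** 2 - half) / (2 * (nk + z ** 2))
    upper = (2 * nk * v + z ** 2 + half) / (2 * (nk + z ** 2))
    return v, lower, upper
```

For example, with 10 judges rating on a 1-5 scale, an item rated 5 by everyone yields V = 1 with a lower bound around .91, comfortably above the .70 cut-off.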
3.2. Results
The findings on the coherence and clarity of the items showed adequate values for the dimensions of self-efficacy in digital competences (min. V = .88 [.89, .99]) and self-efficacy in pedagogical competences (min. V = .88 [.89, .99]). Nevertheless, in response to the observations of the judges and the target population, 12 items were eliminated, 8 were modified, and 25 were maintained (Table 3).
Table 3.
Valuation of items with Aiken's V.
| Indicator | Original item | Coherence: Aiken's V (95 % CI) | Clarity: Aiken's V (95 % CI) | Decision | Item (version for evaluation) |
|---|---|---|---|---|---|
| **Self-efficacy in DIGITAL COMPETENCES** | | | | | |
| Gamification | | .98 [.89, .99]a | .93 [.82, .97]a; .95 [.85, .98]b | Maintained | 1. Use different digital tools (Kahoot, Quizz, Mentimeter, etc.) to maintain participation during the session. |
| | | .95 [.85, .98]a | .95 [.85, .98]a | Deleted to avoid redundancy | |
| | | .98 [.89, .99]a | .95 [.85, .98]a; 1 [.93, .99]b | Maintained | 2. Design playful activities to energise class sessions. |
| | | 1 [.93, .99]a | 1 [.93, .99]a; .90 [.79, .96]b | Deleted to avoid redundancy | |
| Basic technological skills | | .98 [.89, .99]a | .98 [.89, .99]a; 1 [.93, .99]b | Maintained | 3. Develop creative presentations using a variety of virtual tools. |
| | | .98 [.89, .99]a | .98 [.89, .99]a; .98 [.89, .99]b | Maintained | 4. Design audiovisual resources (PPTs, videos, audio clips, podcasts, etc.), using various programmes and/or virtual tools. |
| | | 1 [.93, .99]a | 1 [.93, .99]a; 1 [.93, .99]b | Maintained | 5. Use office automation packages to implement, publish, and share materials (videos, documents, spreadsheets, etc.). |
| | | .95 [.85, .98]a | .98 [.89, .99]a; .98 [.89, .99]b | Maintained | 6. Select free-access programmes, audiovisuals, databases, etc., from the Internet for the development of the sessions. |
| Safety in virtual classrooms | | 1 [.93, .99]a | 1 [.93, .99]a; 1 [.93, .99]b | Maintained | 7. Recognise the risks found on the Internet (hackers, spyware, viruses, etc.) that can affect the implementation and development of my classes. |
| | | .95 [.85, .98]a | .98 [.89, .99]a; 1 [.93, .99]b | Maintained | 8. Implement actions (waiting rooms, login, password) to protect my class sessions. |
| | | .95 [.85, .98]a | .98 [.89, .99]a; 1 [.93, .99]b | Maintained | 9. Execute measures to block intrusions (suspend participants) during the development of the sessions. |
| | | .88 [.89, .99]a | .88 [.89, .99]a | Deleted for not representing the dimension | |
| | | .90 [.79, .96]a | .90 [.79, .96]a | Deleted because the action is included in item 2 | |
| Ethical and legal aspects | | .93 [.82, .97]a | .93 [.82, .97]a; .93 [.82, .97]b | Deleted for not representing the dimension | |
| | | 1 [.93, .99]a | 1 [.93, .99]a; 1 [.93, .99]b | Maintained | 10. Publish digital resources that contain citations of the sources from which information was extracted. |
| | | .98 [.89, .99]a | .98 [.89, .99]a; 1 [.93, .99]b | Maintained | 11. Review material produced by students with anti-plagiarism software. |
| | | 1 [.93, .99]a | 1 [.93, .99]a | Deleted for not representing the dimension | |
| Guidance and/or advice on the use of technological resources | | 1 [.93, .99]a | 1 [.93, .99]a | Deleted for little relevance | |
| | | 1 [.93, .99]a | 1 [.93, .99]a; 1 [.93, .99]b | Modified | 12. Respond to students' doubts regarding the use of digital platforms or tools, within the deadlines established by the institution. |
| | | 1 [.93, .99]a | 1 [.93, .99]a; 1 [.93, .99]b | Maintained | 13. Guide students in solving technological problems that arise during the sessions. |
| | | 1 [.93, .99]a | 1 [.93, .99]a; 1 [.93, .99]b | Maintained | 14. Develop guides or instructions for the optimal use of digital platforms, tools, or programmes. |
| **Self-efficacy in PEDAGOGICAL COMPETENCES** | | | | | |
| Planning | | 1 [.93, .99]a | .95 [.85, .98]a; 1 [.93, .99]b | Maintained | 15. Design didactic sequences that promote the inclusion of students with disabilities in the classroom. |
| | | .98 [.89, .99]a | .98 [.89, .99]a; .98 [.89, .99]b | Maintained | 16. Design didactic sequences that mobilise skills according to the course competence. |
| | | .90 [.89, .99]a | 1 [.93, .99]a; .98 [.89, .99]b | Modified | 17. Evaluate the syllabus of an assigned course, according to the curricular plan and the expected professional competence. |
| | | .98 [.89, .99]a | .98 [.89, .99]a; .98 [.89, .99]b | Maintained | 18. Prepare updated material, according to the purpose of the sessions and course competence. |
| | | .98 [.89, .99]a | .93 [.82, .97]a; .98 [.89, .99]b | Maintained | 19. Adapt, if necessary, methodological strategies according to the learning needs and interests of the students, the demands of the environment, and the competence of the course. |
| Didactics | | .98 [.89, .99]a | .98 [.89, .99]a; 1 [.93, .99]b | Maintained | 20. Dynamise the learning space (active pauses, dynamics, narration of experiences, etc.) to facilitate the teaching-learning process. |
| | | .98 [.89, .99]a | .98 [.89, .99]a; .98 [.89, .99]b | Maintained | 21. Apply strategies to gather knowledge and generate new learning, according to the purpose of the session and competence of the course. |
| | | .98 [.89, .99]a | .98 [.89, .99]a; .98 [.89, .99]b | Maintained | 22. Use strategies that promote student reflection on the learning process. |
| | | .98 [.89, .99]a | .98 [.89, .99]a; 1 [.93, .99]b | Maintained | 23. Provide precise instructions (information search, processes, and expected outcome) aimed at achieving academic products. |
| Group management | | .88 [.76, .94]a | .88 [.76, .94]a; 1 [.93, .99]b | Maintained | 24. Handle conflicts that arise among students during the sessions. |
| | | 1 [.93, .99]a | 1 [.93, .99]a; 1 [.93, .99]b | Modified (participants' suggestion) | 25. Direct students in meeting the goals of the session. |
| | | 1 [.93, .99]a | 1 [.93, .99]a; 1 [.93, .99]b | Maintained | 26. Involve all students in learning activities. |
| | | .88 [.76, .94]a | .95 [.85, .98]a | Deleted to avoid redundancy | |
| | | .95 [.85, .98]a | .95 [.85, .98]a | Deleted for not representing the dimension | |
| Subject-matter mastery | | .88 [.76, .94]a | .88 [.76, .94]a; 1 [.93, .99]b | Modified (participants' suggestion) | 27. Manage up-to-date information on the subject matter to be taught. |
| | | .85 [.73, .92]a | .95 [.85, .98]a | Deleted for little relevance | |
| | | .98 [.89, .99]a | .98 [.89, .99]a; 1 [.93, .99]b | Maintained | 28. Participate as an expert or consultant in the area (or speciality) related to the subject I teach. |
| | | .88 [.76, .94]a | .98 [.89, .99]a | Deleted because related to another indicator | |
| | | .90 [.79, .96]a | 1 [.93, .99]a; 1 [.93, .99]b | Maintained | 29. Write journalistic notes or articles on topics of my speciality. |
| Evaluation and feedback | | .98 [.89, .99]a | .98 [.89, .99]a | Modified | 30. Develop instruments to assess student learning. |
| | | .98 [.89, .99]a | .98 [.89, .99]a | Deleted for redundancy with previous item | |
| | | .98 [.89, .99]a | .98 [.89, .99]a | Deleted because, in the university environment, the systematisation process in virtual teaching is consolidated by the digital platforms to be managed (Canvas, Blackboard, etc.) | |
| | | .95 [.85, .98]a | .95 [.85, .98]a; 1 [.93, .99]b | Modified | 31. Adapt evaluation instruments according to the characteristics of the students and the academic product. |
| | | .98 [.89, .99]a | .98 [.89, .99]a; 1 [.93, .99]b | Modified | 32. Give students feedback on their progress based on assessment results. |
| | | 1 [.93, .99]a | 1 [.93, .99]a; 1 [.93, .99]b | Modified | 33. Provide personalised feedback to students at academic risk to help them overcome their difficulties. |
Note. a: Judges' evaluation. b: Participating teachers' evaluation.
3.3. Discussion
Prior to the COVID-19 pandemic, university teaching primarily occurred face-to-face. Therefore, it would be expected that self-efficacy is linked to pedagogical competences. However, the new social demands driven by the health emergency required the development of digital competences in teachers. In this context, the content domain of the SSUTVE was constructed and specified based on emerging categories found through qualitative methodology, derived from the perceptions of Peruvian university teachers, as reported in Study 1. Thus, the objective was to obtain evidence of the content validity of the SSUTVE through expert judgement.
The qualitative and quantitative analyses applied in Studies 1 and 2 are recommended for constructing psychological tests [114,115]. This approach enabled the identification of 10 factors and 33 items that constituted the initial structure of the SSUTVE. These factors and items were accepted by both the judges for their theoretical coherence and clarity, and by the target population for their clarity. The factors included gamification, basic technological skills, safety in virtual classes, ethical-legal aspects, guidance, and/or advice in the use of technological resources (related to self-efficacy in digital competences), and planning, didactics, group management, subject mastery, and evaluation and feedback (related to self-efficacy in pedagogical competences).
In this study, the perceptions of both the judges [114] and the target population [116] regarding item clarity were similar, given the similarity in the characteristics of both groups. Previous studies have found that the assessment of item clarity by the target population does not always coincide with that of judges [82,116]. Therefore, the agreement observed in this report enhances the robustness of the instrument. The assessment of item wording and test format complemented the evaluation of the SSUTVE content [114].
4. Study 3
To ensure the quality of psychological assessment instruments, it is necessary to guarantee that the interpretations derived from them are valid, enabling their use as intended [114]. Thus, now that the SSUTVE has been approved by the judges and the target population, it is necessary to test the instrument empirically to determine its internal structure [114,115]. Therefore, the objectives were to obtain evidence of validity based on the internal structure of the SSUTVE and to determine its reliability.
4.1. Materials and methods
4.1.1. Participants
The sample, selected through non-probabilistic intentional sampling, consisted of 554 Peruvian university teachers who taught in the virtual mode due to the COVID-19 pandemic or the design of their academic programme. Sixty-two percent (342) were male, aged between 25 and 75 years (M = 49.4; SD = 10), and 87 % taught at the undergraduate level (482). In total, 83.57 % (463) worked at a private university, 2.53 % (14) at a state university, and 13.9 % (77) at both. The teachers belonged to the field of education (22 %), followed by economics (18 %), technological sciences (16 %), psychology (12 %), among others (Table 4). The average teaching experience was 10.2 years (SD = 7.87) and 3.21 years (SD = 2.76) in virtual environments. Participation was voluntary, anonymous and without compensation of any kind.
Table 4.
Profession of the study participants.
| Science and technology nomenclature.a | Profession | f | % |
|---|---|---|---|
| Agricultural sciences | Environmental scientist | 1 | .18 |
| Zootechnician | 1 | .18 | |
| Sciences of life | Biologist | 7 | 1.26 |
| Microbiologist | 2 | .36 | |
| Sciences of arts and humanities | Architect | 9 | 1.62 |
| Economic sciences | Administrator | 59 | 10.65 |
| Accountant | 39 | 7.04 | |
| Juridical sciences and law | Lawyer | 51 | 9.21 |
| Medical sciences | Sports scientist | 1 | .18 |
| Nurse | 17 | 3.07 | |
| Medical doctor | 19 | 3.43 | |
| Nutritionist | 3 | .54 | |
| Obstetrician | 2 | .36 | |
| Odontologist | 4 | .72 | |
| Technological sciences | Engineer | 88 | 15.88 |
| Physics | Physicist | 6 | 1.08 |
| Mathematics | Economist | 24 | 4.33 |
| Statistician | 4 | .72 | |
| Pedagogy | Educator | 122 | 22.02 |
| Translator and interpreter | 2 | .36 | |
| Psychology | Psychologist | 69 | 12.46 |
| Chemistry | Chemist - pharmacist | 4 | .72 |
| Sociology | Communication scientist | 15 | 2.71 |
| Social scientist | 3 | .54 | |
| Tourism professional | 2 | .36 | |
| Total | 554 | 100 |
The nomenclature corresponds to the United Nations Educational, Scientific and Cultural Organisation. (https://skos.um.es/unesco6/view.php?fmt=1).
4.1.2. Instrument
The SSUTVE was administered after being corrected based on the evaluation by the judges and the target population (Study 2). The instrument consisted of 33 items.
The instructions state: “Below you will find situations that university teachers find difficult when teaching in virtual environments. Indicate how confident you feel about performing each of the following activities, choosing a number between 1 and 10, according to the following scale: 1 ‘Not at all confident that I can do it’, 5 ‘Moderately confident that I can do it’, and 10 ‘Totally confident that I can do it’.”
For example: “As a university teacher in virtual environments, I am confident that I can …
… develop tutorials to guide students in the use of digital platforms.”
If a participant feels somewhat confident that they can do it, they would mark the number 3. Conversely, if they feel almost sure that they can do it, they would select the numbers 8 or 9 (See appendix).
4.1.3. Procedure
The study employed an instrumental design [120]. Approval for the project was obtained (DICTAMEN 131-CEI-EP-UCV-2022, Resolution of the Vice-Rectorate for Research No. 457-2022-VI-UCV). Permission was also granted by the Vice-Rectorate for Research of a private Peruvian university with campuses in different cities across the country (Oficio Múltiple No. 169-2022-VI-UCV). This department managed the dissemination of the survey link by institutional e-mail, allowing willing teachers to participate using a Google form, after providing informed consent. The scale was also disseminated through social networks, WhatsApp messaging, and emails to teachers from other Peruvian universities who agreed to participate.
4.1.4. Data analysis
4.1.4.1. Estimation and software
To analyse the internal structure of the instrument, confirmatory factor analysis (CFA) was used with the weighted least squares method of estimation with mean and variance adjusted (WLSMV), considering the matrix of polychoric correlations. These procedures were performed using the Mplus v. 7 software [121].
4.1.4.2. Evidence of validity in relation to the internal structure
Measurement models. A measurement model of 10 oblique factors was analysed, based on the review of the available literature and on the content analyses carried out in Studies 1 and 2.
Preliminary analysis. Before conducting the CFA, we analysed whether the items had a reasonable approximation to univariate normality, which was assessed according to the magnitude of asymmetry and kurtosis; specifically, if they were between −1.5 and 1.5 [122]. Multivariate normality was analysed using the multivariate kurtosis coefficient (G2) of Mardia [123], expecting values less than 70 [124].
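These preliminary checks are straightforward to reproduce. Below is a minimal sketch (our own helper functions, using NumPy and SciPy, not the authors' code) of the univariate screening against the ±1.5 bound and of Mardia's multivariate kurtosis coefficient, which under multivariate normality approaches p(p + 2) for p variables; the cut-off of 70 cited in the text is a rule of thumb from [124].

```python
import numpy as np
from scipy import stats

def screen_univariate(data, bound=1.5):
    """Flag columns (items) whose skewness (g1) or excess kurtosis (g2)
    lies within +/- bound; returns a boolean array, one entry per item."""
    g1 = stats.skew(data, axis=0)
    g2 = stats.kurtosis(data, axis=0)  # Fisher definition: normal = 0
    return (np.abs(g1) <= bound) & (np.abs(g2) <= bound)

def mardia_kurtosis(data):
    """Mardia's multivariate kurtosis b2,p: the mean of the squared
    Mahalanobis distances; approximately p*(p + 2) under normality."""
    centred = data - data.mean(axis=0)
    s_inv = np.linalg.inv(np.cov(data, rowvar=False, bias=True))
    d2 = np.einsum("ij,jk,ik->i", centred, s_inv, centred)
    return float(np.mean(d2 ** 2))
```

Applied to item scores bounded at 10 with means near 9, as in Table 5, this kind of screening typically flags strong negative skew and heavy kurtosis, which motivates the choice of the WLSMV estimator.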
Evaluation of the measurement models. The proposed measurement model was evaluated based on three criteria. First, the magnitude of various fit indexes, such as CFI (>.90 [125]), RMSEA, whose confidence interval (CI) must have an upper bound less than .10 [126], and WRMR (<1 [127]). The second criterion considered the influence of the latent variable on the items, focusing on the magnitude of the factor loadings (>.50 [128]).
The third criterion considered two aspects. First, the average variance extracted per factor (AVE; >.37 [129]) was assessed as an indicator of convergent internal validity. AVE represents the average of the communalities of the items and indicates the factor's presence in the measure. Second, the degree of discriminant internal validity among factors was assessed through the descriptive examination of the interfactor correlations (ϕ). Values greater than .85 indicate a low empirical difference between these dimensions [130]. Finally, to conclude whether factors can be interpreted separately, the AVE of a factor is expected to be greater than the squared interfactor correlation between two factors (ϕ2; shared variance among factors) [131,132].
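The convergent and discriminant criteria reduce to simple arithmetic on the standardised loadings and interfactor correlations. A minimal sketch (helper names are ours):

```python
import numpy as np

def ave(loadings):
    """Average variance extracted: the mean of the squared standardised
    loadings of a factor (i.e. the average item communality)."""
    lam = np.asarray(loadings, dtype=float)
    return float(np.mean(lam ** 2))

def empirically_distinct(ave_a, ave_b, phi):
    """Two factors can be interpreted separately when each factor's AVE
    exceeds phi**2, the variance the two factors share."""
    return ave_a > phi ** 2 and ave_b > phi ** 2
```

For instance, the six F1 loadings later reported in Table 6 (.798-.915) give an AVE of about .733, matching Table 7; and with the F1-F2 correlation of .852 (ϕ² = .726), both AVEs (.733 and .749) exceed the shared variance, so the two factors remain distinguishable.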
4.1.5. Reliability
The reliability of the scores was assessed with Cronbach's α coefficient (>.70 [133]). Construct reliability was estimated with the ω coefficient (>.70 [134]).
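Both coefficients can be illustrated compactly. The sketch below (our own functions, assuming standardised items for ω) computes Cronbach's α from raw item scores and McDonald's ω from the standardised factor loadings of a congeneric one-factor model:

```python
import numpy as np

def cronbach_alpha(data):
    """Cronbach's alpha from an n-by-k matrix of item scores."""
    k = data.shape[1]
    item_vars = data.var(axis=0, ddof=1)
    total_var = data.sum(axis=1).var(ddof=1)
    return float((k / (k - 1)) * (1 - item_vars.sum() / total_var))

def mcdonald_omega(loadings):
    """McDonald's omega from standardised loadings: common variance over
    common plus unique variance (unit item variances assumed)."""
    lam = np.asarray(loadings, dtype=float)
    common = lam.sum() ** 2
    unique = float(np.sum(1 - lam ** 2))
    return common / (common + unique)
```

As a check against the reported results: the two standardised loadings of the ethical-legal factor in Table 6 (.877 and .791) give ω ≈ .821, the value shown for F3 in Table 8.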
4.2. Results
4.2.1. Evidence of validity in relation to internal structure
4.2.1.1. Preliminary analysis
Descriptively, although most of the items showed acceptable skewness, kurtosis exceeded the proposed limits for several items (Table 5). In addition, Mardia's coefficient suggests that the data did not approach multivariate normality (G2 = 930.636).
Table 5.
Descriptive statistics of the items.
| | M | SD | g1 | g2 | | M | SD | g1 | g2 |
|---|---|---|---|---|---|---|---|---|---|
| Item 1 | 8.132 | 2.086 | −1.197 | .727 | Item 18 | 8.718 | 1.527 | −1.645 | 3.467 |
| Item 2 | 7.946 | 2.078 | −1.076 | .451 | Item 19 | 8.986 | 1.261 | −1.638 | 3.565 |
| Item 3 | 8.338 | 1.813 | −1.303 | 1.291 | Item 20 | 8.810 | 1.358 | −1.516 | 2.979 |
| Item 4 | 8.527 | 1.669 | −1.314 | 1.358 | Item 21 | 8.865 | 1.281 | −1.410 | 2.298 |
| Item 5 | 8.469 | 1.795 | −1.253 | 1.005 | Item 22 | 8.865 | 1.302 | −1.486 | 2.814 |
| Item 6 | 8.606 | 1.588 | −1.293 | 1.308 | Item 23 | 8.894 | 1.289 | −1.535 | 3.092 |
| Item 7 | 8.078 | 2.034 | −1.032 | .301 | Item 24 | 8.897 | 1.255 | −1.468 | 2.585 |
| Item 8 | 8.747 | 1.574 | −1.486 | 1.847 | Item 25 | 9.119 | 1.197 | −2.016 | 5.575 |
| Item 9 | 8.522 | 1.726 | −1.441 | 1.879 | Item 26 | 9.199 | 1.076 | −1.792 | 4.266 |
| Item 10 | 8.543 | 1.718 | −1.416 | 1.711 | Item 27 | 9.119 | 1.083 | −1.689 | 4.041 |
| Item 11 | 8.294 | 1.876 | −1.170 | .783 | Item 28 | 9.186 | 1.069 | −1.740 | 4.122 |
| Item 12 | 8.588 | 1.596 | −1.388 | 1.863 | Item 29 | 8.944 | 1.346 | −1.632 | 2.947 |
| Item 13 | 8.415 | 1.713 | −1.279 | 1.379 | Item 30 | 8.287 | 1.868 | −1.199 | .977 |
| Item 14 | 7.940 | 2.035 | −1.015 | .491 | Item 31 | 8.821 | 1.396 | −1.565 | 3.223 |
| Item 15 | 7.848 | 2.110 | −1.040 | .580 | Item 32 | 8.874 | 1.380 | −1.653 | 3.620 |
| Item 16 | 8.457 | 1.630 | −1.318 | 1.910 | Item 33 | 9.112 | 1.182 | −1.753 | 3.951 |
Note: M: Mean; SD: Standard deviation; g1: Asymmetry; g2: Kurtosis.
4.2.1.2. Evaluation of measurement models
The 10-factor oblique model obtained acceptable fit indexes (CFI = .965; RMSEA = .104, CI90 % .100, .107; WRMR = 1.647). However, the interfactor correlation between the first original factor (gamification) and the second (basic technological skills) exceeded unity. Therefore, it was decided to merge them into a factor called basic technological skills. Subsequently, the new nine-factor oblique model obtained favourable fit indexes (CFI = .973; RMSEA = .090, CI 90 % .087, .093; WRMR = 1.433), and factor loadings exceeded expectations in all cases (λrange = .798 - .963) (Table 6).
Table 6.
Factorial parameters of the oblique model.
| Self-efficacy in digital competences | Self-efficacy in pedagogical competences | ||
|---|---|---|---|
| F1 | F5 | ||
| Item 1 | .798 | Item 15 | .791 |
| Item 2 | .807 | Item 16 | .888 |
| Item 3 | .896 | Item 17 | .869 |
| Item 4 | .865 | Item 18 | .910 |
| Item 5 | .849 | Item 19 | .918 |
| Item 6 | .915 | F6 | |
| F2 | Item 20 | .922 | |
| Item 7 | .812 | Item 21 | .936 |
| Item 8 | .912 | Item 22 | .935 |
| Item 9 | .869 | Item 23 | .917 |
| F3 | F7 | ||
| Item 10 | .877 | Item 24 | .916 |
| Item 11 | .791 | Item 25 | .963 |
| F4 | Item 26 | .939 | |
| Item 12 | .931 | F8 | |
| Item 13 | .924 | Item 27 | .957 |
| Item 14 | .882 | Item 28 | .853 |
| Item 29 | .805 | ||
| F9 | |||
| Item 30 | .949 | ||
| Item 31 | .932 | ||
| Item 32 | .959 | ||
| Item 33 | .901 | ||
Note: F1: Basic technological skills; F2: Safety in virtual classes; F3: Ethical-legal aspects; F4: Guidance and/or advice on the use of technological resources; F5: Planning; F6: Didactics; F7: Group management; F8: Mastery of the subject; F9: Evaluation and feedback.
Regarding convergent internal validity, the AVE exceeded the expected value, with the minimum value almost doubling the minimum acceptable value (.697). On the other hand, for internal discriminant validity, some cases suggested overlap among factors (ϕ > .85). However, in most cases, the AVE exceeded the ϕ2, providing evidence in favour of the empirical difference among dimensions and allowing for an independent interpretation of each of the scores (Table 7).
Table 7.
Convergent and discriminant internal validity.
| F1 | F2 | F3 | F4 | F5 | F6 | F7 | F8 | F9 | |
|---|---|---|---|---|---|---|---|---|---|
| AVE | .733 | .749 | .697 | .833 | .768 | .860 | .883 | .764 | .875 |
| F1 | 1.000 | .726 | .733 | .750 | .706 | .650 | .472 | .521 | .543 |
| F2 | .852 | 1.000 | .857 | .780 | .733 | .663 | .552 | .552 | .546 |
| F3 | .856 | .926 | 1.000 | .845 | .826 | .757 | .548 | .672 | .627 |
| F4 | .866 | .883 | .919 | 1.000 | .854 | .702 | .516 | .534 | .566 |
| F5 | .840 | .856 | .909 | .924 | 1.000 | .885 | .701 | .764 | .755 |
| F6 | .806 | .814 | .870 | .838 | .941 | 1.000 | .812 | .780 | .785 |
| F7 | .687 | .743 | .740 | .718 | .837 | .901 | 1.000 | .927 | .817 |
| F8 | .722 | .743 | .820 | .731 | .874 | .883 | .963 | 1.000 | .837 |
| F9 | .737 | .739 | .792 | .752 | .869 | .886 | .904 | .915 | 1.000 |
Note: F1: Basic technological skills; F2: Safety in virtual classes; F3: Ethical-legal aspects; F4: Guidance and/or advice on the use of technological resources; F5: Planning; F6: Didactics; F7: Group management; F8: Subject mastery; F9: Evaluation and feedback; AVE: average variance extracted; below the diagonal: interfactor correlations (ϕ); above the diagonal: variance shared between factors (ϕ2).
4.2.1.3. Reliability
In all cases (except for the ethical-legal aspects factor), the reliability coefficients for both scores and constructs were excellent (Table 8).
Table 8.
Reliability.
| α | ω | |
|---|---|---|
| F1 | .913 | .943 |
| F2 | .849 | .899 |
| F3 | .737 | .821 |
| F4 | .891 | .937 |
| F5 | .884 | .943 |
| F6 | .938 | .961 |
| F7 | .919 | .958 |
| F8 | .823 | .906 |
| F9 | .921 | .966 |
Note: F1: Basic technological skills; F2: Safety in virtual classes; F3: Ethical-legal aspects; F4: Guidance and/or advice on the use of technological resources; F5: Planning; F6: Didactics; F7: Group management; F8: Mastery of the subject; F9: Evaluation and feedback.
5. Discussion
With a preliminary structure of the SSUTVE established through the categorisation obtained from grounded theory and the judges' assessments, it was necessary to determine the underlying factors, that is, the structure of the test, from the scale scores. The procedure followed adhered to the recommendations for the construction of psychological assessment instruments [114,115], from the theoretical foundation to the evaluation of the items using different methodologies (expert judgement, factor analysis, etc.). Therefore, the objective of this study was to establish evidence of validity based on the internal structure of the SSUTVE and to assess its reliability. To this end, this discussion integrates the findings of the three studies.
The structure of the SSUTVE in a sample of Peruvian university teachers working in the virtual modality comprised 33 items distributed across nine correlated factors and demonstrated adequate internal structure, convergent validity, and discriminant internal validity. These factors confirmed the theoretical expectations and findings reported in Study 1, which utilised grounded theory methodology. In Study 2, where the judges and target population were involved, some items from the initial set were eliminated due to various reasons (e.g. lack of representativeness), compromising the construct's representativeness. This resulted in dimensions with four, three, or two items. Subsequently (Study 3), the first two dimensions (gamification and basic technological skills) were merged due to high interfactorial correlation. This could be attributed to the current trend of integrating gamification into teaching to address students' learning needs [135], such as maintaining motivation and attention in virtual environments [69,97,102,103]. Therefore, gamification constitutes an element of teachers' basic technological tools.
The SSUTVE factors are based on the teachers' experiences (Study 1) and confirm the theoretical expectations: basic technological skills [76,78,100,103], safety in virtual classes [78], ethical-legal aspects [104,105], guidance and/or advice in the use of technological resources [80], planning [107,108], didactics [[108], [109], [110]], group management [108,111], subject mastery [112], and evaluation and feedback [76,113]. It is important to mention that although the construction of a new scale was reported, it was decided to start with a CFA because the dimensions were theoretically specified in previous studies (Studies 1 and 2), which determined the items belonging to each dimension [[136], [137], [138]].
The factors related to self-efficacy in pedagogical competences of the SSUTVE were similar to those of the Collective Self-efficacy Scale [33]. They coincide with some factors: planning and design of the teaching-learning process, active involvement in student learning, assessment of student learning, and the teaching role. However, the SSUTVE additionally includes group management and subject matter mastery. Conversely, the Collective University Teaching Self-Efficacy Scale contains factors such as interaction and creation of a positive climate in the classroom, department, and faculty, research and publication of scientific knowledge, as well as professional and pedagogical training and updates. We consider that part of the training and updating dimension is immersed in the content of some items of the SSUTVE subject matter mastery factor. However, as previously noted, the Higher Education Research Self-Efficacy Scales require a separate measure.
The SSUTVE factors related to self-efficacy in digital competences share similarities with those of a previous instrumental study that measured the ICT competences of future teachers [40], namely basic technological skills and guidance and/or advice in the use of technological resources. However, that earlier study did not include factors considered in this research, such as safety in virtual classes and ethical-legal aspects, which are indispensable for protecting information and for training students in academic integrity.
The reliability of eight of the nine SSUTVE factors was excellent (>.85 [133]), and that of the ethical-legal aspects factor was acceptable (>.70 [133]). This result indicates good internal consistency and low error variance in the scores. However, further studies involving test-retest measurements are needed to estimate the temporal stability of the scores.
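Internal-consistency thresholds like those above can be checked directly against item-level data. The sketch below computes coefficient alpha in plain Python on made-up scores; it illustrates one common estimator and is not necessarily the exact coefficient reported in this study.

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Coefficient alpha from one score list per item, aligned across
    respondents; population variances are used throughout."""
    k = len(items)
    if k < 2:
        raise ValueError("alpha requires at least two items")
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent sums
    item_variance_sum = sum(pvariance(scores) for scores in items)
    return k / (k - 1) * (1 - item_variance_sum / pvariance(totals))

# Five hypothetical respondents answering three items on the 1-10 scale
items = [
    [7, 8, 6, 9, 5],
    [6, 8, 7, 9, 6],
    [7, 9, 6, 8, 5],
]
print(round(cronbach_alpha(items), 3))  # 0.932
```

When items are essentially parallel, as in this toy sample, alpha approaches 1; values above .85 and .70 correspond to the "excellent" and "acceptable" benchmarks cited in the text [133].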
As the SSUTVE is a recently constructed instrument, obtaining new evidence of validity and reliability in populations from other contexts is essential. It is also necessary to perform additional procedures, such as testing measurement invariance across groups (e.g. by sex). Despite the diversity of professions among the participants in Study 3, further research could include an assessment of the Scale's content by teachers in virtual environments who belong to professional fields not represented in that sample, such as Philosophy, Ethics, History, Astronomy and Astrophysics, and Earth and Space Sciences.
Another limitation of this study was its use of non-probabilistic sampling, which restricts external validity. Regarding internal structure, a study implementing hierarchical models could help clarify the nature of the association among the dimensions [139]. It would also be beneficial to diversify the strategies for providing evidence of validity. Using external measures that assess related or theoretically relevant constructs (e.g. teacher self-efficacy) could be convenient because, although the structural configuration (internal structure) is important, understanding how the assessed construct relates to others is equally important [140].
As a practical implication, the SSUTVE can serve as a diagnostic measure for designing plans to improve teaching competence in virtual environments, which will contribute to strengthening teacher performance and, in turn, teachers' self-efficacy. In a previous study [141], for example, a hybrid teaching programme strengthened pre-service teachers' self-efficacy in instructional strategies, classroom management, and student participation. Teachers' belief in their mastery of the teaching-learning process in virtual environments will allow them to identify what they need to improve in order to support students in achieving academic objectives. This aspect is relevant because relationships have been documented between self-efficacy and ease of use of technologies, attitude [1], and effort and persistence [7,8]. Nevertheless, it is suggested that the application of the SSUTVE be accompanied by a measurement of teaching competences in virtual environments, which, in addition to providing evidence of convergent validity, will enrich the information collected for planning training.
Finally, strengthening teaching self-efficacy would contribute to the perception of ease of use of technologies, positive attitudes, and the intention to continue using virtual tools [66] in the post-pandemic period. This could facilitate international virtual exchange in higher education, foster the development of intercultural awareness, and promote skills and knowledge of a global ecology [4].
6. Conclusions
The SSUTVE presents a multidimensional structure of nine correlated factors and 33 items that adequately represent the domain of the construct. The content-based evidence obtained from qualitative and quantitative methodologies, as well as the internal structure, convergent and discriminant internal validity, and reported reliability support the theoretical expectations. Thus, the scale constitutes a diagnostic measure for university teachers in virtual environments and contributes to identifying strengths and potential training areas for university systems.
CRediT authorship contribution statement
Gina Chávez-Ventura: Writing – review & editing, Writing – original draft, Supervision, Resources, Project administration, Methodology, Investigation, Formal analysis, Conceptualization. Tania Polo-López: Writing – original draft, Visualization, Validation, Software, Resources, Methodology, Investigation, Conceptualization. Lilia Zegarra-Pereda: Writing – original draft, Visualization, Validation, Resources, Methodology, Investigation, Conceptualization. Orlando Balarezo-Aliaga: Writing – original draft, Visualization, Validation, Resources, Methodology, Investigation, Conceptualization. Candy Calderón-Valderrama: Writing – original draft, Visualization, Validation, Resources, Methodology, Investigation, Conceptualization. Sergio Dominguez-Lara: Writing – review & editing, Validation, Supervision, Software, Methodology, Formal analysis, Data curation.
Informed consent statement
Informed consent was obtained from all subjects involved in the study.
Institutional review board statement
The study was conducted in accordance with the Declaration of Helsinki, and approved by the Ethics Committee of Universidad César Vallejo (March 21, 2022) for studies involving humans.
Data availability
The database is available on the Open Science Framework, registration p468x: https://osf.io/p468x/?view_only=75ceefc49c16481ca7e0fa2f4669f49a
Declaration of generative AI in scientific writing
No generative AI was used in writing this text.
Financial support
This research received no external funding; however, the article processing charges (APC) and translation costs were covered by Universidad César Vallejo.
Declaration of competing interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Appendix.
SSUTVE SCALE
(Spanish version).
| A continuación, encontrará situaciones que les resultan difíciles a los docentes universitarios durante la enseñanza en entornos virtuales. Indique qué tan seguro(a) se siente usted de realizar cada una de las siguientes actividades, escribiendo el número del 1 al 10, de acuerdo a la siguiente escala: 1 = De ninguna manera lo puedo hacer; 5 = Moderadamente seguro(a) de que lo puedo hacer; 10 = Totalmente seguro(a) de que lo puedo hacer. |
| Como docente universitario(a) en entornos virtuales, confío en que puedo … | 1 (De ninguna manera lo puedo hacer) | 2 | 3 | 4 | 5 (Moderadamente seguro(a) de que lo puedo hacer) | 6 | 7 | 8 | 9 | 10 (Totalmente seguro(a) de que lo puedo hacer) |
|---|---|---|---|---|---|---|---|---|---|---|
| 1. Utilizar distintas herramientas digitales (Kahoot, Quizz, Mentimeter, etc) para mantener la participación durante la sesión. | ||||||||||
| 2. Diseñar actividades lúdicas para dinamizar las sesiones de clases. | ||||||||||
| 3. Elaborar presentaciones creativas utilizando diversas herramientas virtuales. | ||||||||||
| 4. Diseñar recursos audiovisuales (PPTs, videos, audio clips, podcast, etc.), utilizando diversos programas y/o herramientas virtuales. | ||||||||||
| 5. Utilizar paquetes de ofimática para implementar, publicar y compartir materiales (vídeos, documentos, hojas de cálculo, etc.). | ||||||||||
| 6. Seleccionar idóneamente de la internet, programas de acceso libre, audiovisuales, base de datos, etc., para el desarrollo de las sesiones. | ||||||||||
| 7. Reconocer los riesgos que se encuentran en la Internet (hackers, spyware, virus, etc.) que pueden afectar la implementación y desarrollo de mis clases. | ||||||||||
| 8. Implementar acciones (salas de espera, registro para ingreso, contraseña) para proteger mis sesiones de clase. | ||||||||||
| 9. Ejecutar medidas para bloquear las intromisiones (suspender a los participantes) durante el desarrollo de las sesiones. | ||||||||||
| 10. Publicar recursos digitales que contengan las citas de las fuentes de donde se extrajo información. | ||||||||||
| 11. Revisar el material elaborado por los estudiantes con software antiplagios. | ||||||||||
| 12. Responder a las dudas de los estudiantes respecto al manejo de plataformas o herramientas digitales, en los plazos establecidos por la institución. | ||||||||||
| 13. Orientar a los estudiantes en la solución de los problemas tecnológicos, durante las sesiones. | ||||||||||
| 14. Elaborar guías o instructivos para el uso óptimo de las plataformas, herramientas o programas digitales. | ||||||||||
| 15. Diseñar secuencias didácticas que promuevan la inclusión de estudiantes con necesidades educativas especiales dentro del aula. | ||||||||||
| 16. Diseñar secuencias didácticas que movilicen capacidades en función a las competencias del curso. | ||||||||||
| 17. Estructurar el sílabo de un curso asignado, acorde al plan curricular y a la competencia profesional esperada. | ||||||||||
| 18. Preparar material actualizado, acorde al propósito de las sesiones y competencia del curso. | ||||||||||
| 19. Adaptar, si es necesario, estrategias metodológicas en función de las necesidades e intereses de aprendizaje de los estudiantes, la demanda del entorno y la competencia del curso. | ||||||||||
| 20. Dinamizar el espacio de aprendizaje (pausas activas, dinámicas, narración de experiencias, etc.) para facilitar el proceso de enseñanza - aprendizaje. | ||||||||||
| 21. Aplicar estrategias para recoger saberes y generar nuevos aprendizajes, de acuerdo al propósito de la sesión y competencia del curso. | ||||||||||
| 22. Utilizar estrategias que promuevan en los estudiantes la reflexión del proceso de aprendizaje. | ||||||||||
| 23. Brindar instrucciones precisas (búsqueda de información, procesos y resultado esperado) encaminadas a la consecución de productos académicos. | ||||||||||
| 24. Manejar con calma las discusiones/altercados que se presenten entre estudiantes, durante las sesiones. | ||||||||||
| 25. Orientar a los estudiantes en el cumplimiento de las metas de la sesión. | ||||||||||
| 26. Involucrar a todos los estudiantes en las actividades de aprendizaje. | ||||||||||
| 27. Usar información actualizada de la materia a impartir. | ||||||||||
| 28. Participar como experto o consultor en el área (o especialidad) relacionada a la materia que enseño. | ||||||||||
| 29. Redactar notas o artículos periodísticos sobre temas de mi especialidad. | ||||||||||
| 30. Elaborar instrumentos pertinentes para evaluar el aprendizaje de los estudiantes. | ||||||||||
| 31. Adaptar instrumentos de evaluación, de acuerdo a la naturaleza del curso y del producto académico. | ||||||||||
| 32. Retroalimentar a los estudiantes sobre sus avances en los resultados de la evaluación. | ||||||||||
| 33. Retroalimentar de forma personalizada a los estudiantes con riesgo académico para que superen sus dificultades. |
Self-Efficacy Scale for University Teaching in Virtual Environments, SSUTVE (English version)
| Below, you will find scenarios that university teachers find difficult when teaching in virtual environments. Indicate how confident you feel about performing each of the following activities by writing a number from 1 to 10, according to the following scale: 1 = No way can I do it; 5 = Moderately confident that I can do it; 10 = Absolutely sure that I can do it. |
| As a university teacher in virtual environments, I am confident that I can … | 1 (No way can I do it) | 2 | 3 | 4 | 5 (Moderately confident that I can do it) | 6 | 7 | 8 | 9 | 10 (Absolutely sure that I can do it) |
|---|---|---|---|---|---|---|---|---|---|---|
| 1. Use different digital tools (Kahoot, Quizz, Mentimeter, etc.) to maintain participation during sessions. | ||||||||||
| 2. Design playful activities to energise class sessions. | ||||||||||
| 3. Develop creative presentations using a variety of virtual tools. | ||||||||||
| 4. Design audiovisual resources (PPTs, videos, audio clips, podcasts, etc.), using various programmes and/or virtual tools. | ||||||||||
| 5. Use office automation packages to implement, publish and share materials (videos, documents, spreadsheets, etc.). | ||||||||||
| 6. Appropriately select free-access programmes, audiovisual resources, databases, etc., from the Internet for the development of sessions. | ||||||||||
| 7. Recognise risks found on the Internet (hackers, spyware, viruses, etc.) that can affect the implementation and development of my classes. | ||||||||||
| 8. Implement actions (waiting rooms, login, password) to protect my class sessions. | ||||||||||
| 9. Execute measures to block intrusions (suspending participants) during the development of the sessions. | ||||||||||
| 10. Publish digital resources that cite sources from which information was obtained. | ||||||||||
| 11. Review material produced by students with anti-plagiarism software. | ||||||||||
| 12. Respond to students' doubts regarding the use of digital platforms or tools, within the deadlines established by the institution. | ||||||||||
| 13. Guide students in solving technological problems during sessions. | ||||||||||
| 14. Develop guides or instructions for the optimal use of digital platforms, tools, or programmes. | ||||||||||
| 15. Design didactic sequences that promote the inclusion of students with special educational needs in the classroom. | ||||||||||
| 16. Design didactic sequences that mobilise skills according to course competences. | ||||||||||
| 17. Structure the syllabus of an assigned course according to the curricular plan and the expected professional competence. | ||||||||||
| 18. Prepare updated material, according to the purpose of the sessions and course competences. | ||||||||||
| 19. Adapt methodological strategies as necessary, according to the learning needs and interests of the students, the demands of the environment and course competences. | ||||||||||
| 20. Energise the learning space (active breaks, group dynamics, sharing of experiences, etc.) to facilitate the teaching–learning process. | ||||||||||
| 21. Apply strategies to elicit prior knowledge and generate new learning, according to the purpose of the session and course competences. | ||||||||||
| 22. Use strategies that promote student reflection on the learning process. | ||||||||||
| 23. Provide precise instructions (information search, processes and expected outcomes) aimed at achieving academic products. | ||||||||||
| 24. Calmly handle discussions/altercations that arise among students during sessions. | ||||||||||
| 25. Direct students in achieving the goals of the session. | ||||||||||
| 26. Engage all students in learning activities. | ||||||||||
| 27. Use up-to-date information on the subject matter taught. | ||||||||||
| 28. Participate as an expert or consultant in the area (or speciality) related to the subject I teach. | ||||||||||
| 29. Write journalistic notes or articles on topics in my speciality. | ||||||||||
| 30. Develop relevant instruments to assess student learning. | ||||||||||
| 31. Adapt assessment instruments according to the nature of the course and the academic product. | ||||||||||
| 32. Provide feedback to students on their progress based on assessment results. | ||||||||||
| 33. Offer personalised feedback to students at academic risk to help them overcome their difficulties. |
References
- 1.John S. The integration of information technology in higher education: a study of faculty's attitude towards IT adoption in the teaching process. Contad. Adm. 2015;60(S1):230–252. doi: 10.1016/j.cya.2015.08.004. [DOI] [Google Scholar]
- 2.Martínez-Galiano J.M., et al. Metodología basada en tecnología de la información y la comunicación para resolver los nuevos retos en la formación de los profesionales de la salud. Educ. médica. 2016;17(1):20–24. doi: 10.1016/j.edumed.2016.02.004. [DOI] [Google Scholar]
- 3.Mad Daud S.H., Ibrahim Teo N.H., Nat Zain N.H. E-JAVA chatbot for learning programming language: a post-pandemic alternative virtual tutor. Int. J. Emerg. Trends Eng. Res. 2020;8(7):3290–3298. doi: 10.30534/ijeter/2020/67872020. [DOI] [Google Scholar]
- 4.Weaver G., McDonald P., Louie G., Woodman T. Future potentials for international virtual exchange in higher education post COVID-19: a scoping review. Educ. Sci. 2024;14:232. doi: 10.3390/educsci14030232. [DOI] [Google Scholar]
- 5.Su L., Gu X. Optimization path of college network classroom in teaching practice in post-epidemic era. Advances in Transdisciplinary Engineering. 2024;48:559–567. [Google Scholar]
- 6.Putilovskaya T.S., Zubareva E.V., Tuchkova I.G. In: Proceedings of the International Scientific Conference “Smart Nations: Global Trends in the Digital Economy. Ashmarina S.I., Mantulenko V.V., Vochozka M., editors. vol. 398. Springer; Cham: 2022. Changes in the students' attitude to E-learning in the post-pandemic period. (Lecture Notes in Networks and Systems). [DOI] [Google Scholar]
- 7.Bandura A. Guide for constructing self-efficacy scales. Self-Efficacy Beliefs of Adolescents. 2006;5(1):307–337. [Google Scholar]
- 8.Schunk D.H., DiBenedetto M.K. Chapter four - self-efficacy and human motivation. Elliot A., editor. Adv. Motiv. Sci. 2021;8:153–179. doi: 10.1016/bs.adms.2020.10.001. [DOI] [Google Scholar]
- 9.Bandura A. Self-efficacy: toward a unifying theory of behavioral change. Psychol. Rev. 1977;84(2):191–215. doi: 10.1037/0033-295X.84.2.191. [DOI] [PubMed] [Google Scholar]
- 10.Bandura A., Freeman W.H., Lightsey R. Self-efficacy: the exercise of control. J. Cognit. Psychother. 1999;13(2):158–166. doi: 10.1891/0889-8391.13.2.158. [DOI] [Google Scholar]
- 11.Bandura A. Toward a psychology of human agency: pathways and reflections. Perspect. Psychol. Sci. 2018;13(2):130–136. doi: 10.1177/1745691617699280. [DOI] [PubMed] [Google Scholar]
- 12.Stajkovic A.D., Luthans F. Self-efficacy and work-related performance: a meta-analysis. Psychol. Bull. 1998;124(2):240–261. doi: 10.1037/0033-2909.124.2.240. [DOI] [Google Scholar]
- 13.Zhang R., et al. The association of self-efficacy with well-being and ill-being: the possible mediator of coping and moderator of gender. Curr. Psychol. 2023;42(32):27998–28006. doi: 10.1007/s12144-022-03791-8. [DOI] [Google Scholar]
- 14.Tschannen-Moran M., Woolfolk Hoy A. Teacher efficacy: capturing an elusive construct. Teach. Teach. Educ. 2001;17(7):783–805. doi: 10.1016/S0742-051X(01)00036-1. [DOI] [Google Scholar]
- 15.Minihan E., Adamis D., Dunleavy M., Martin A., Gavin B., McNicholas F. COVID-19 related occupational stress in teachers in Ireland. IJEDRO. 2022;3 doi: 10.1016/j.ijedro.2021.100114. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 16.Tang Y.-C., Guo D.-Y. Technostress among elementary school teachers in kaohsiung city during emergency remote teaching. Bull. Educ. Psychol. 2023;55(2):319–344. doi: 10.6251/BEP.202312_55(2).0005. [DOI] [Google Scholar]
- 17.Varas-Meza H., Suárez-Amaya W., López-Valenzuela C., Valdés-Montesinos M. Educación virtual: factores que influyen en su expansión en América Latina. Utopía Prax. Latinoam. 2020;25(13):21–38. doi: 10.5281/zenodo.4292698. [DOI] [Google Scholar]
- 18.Canabarro R., et al. Translation and transcultural adaptation of the self-efficacy scale for teachers of students with autism: autism self-efficacy scale for teachers (ASSET) Rev. Bras. Educ. Espec. 2018;24(2):221–240. doi: 10.1590/S1413-65382418000200006. [DOI] [Google Scholar]
- 19.Lu M.-H., et al. Psychometric properties of the teachers' sense of efficacy scale for Chinese special education teachers. J. Psychoeduc. Assess. 2021;39(2):212–226. [Google Scholar]
- 20.Valls M., et al. Qualités psychométriques de la version francophone de la Teachers' Sense of Efficacy Scale (TSES-12f) Eur. Rev. Appl. Psychol. 2020;70(3) doi: 10.1016/j.erap.2020.100551. [DOI] [Google Scholar]
- 21.Alnahdi G. Rasch validation of the Arabic version of the teacher efficacy for inclusive practices (TEIP) scale. Stud. Educ. Evaluation. 2019;62:104–110. doi: 10.1016/j.stueduc.2019.05.004. [DOI] [Google Scholar]
- 22.Cardona-Molto M.C., et al. The Spanish version of the teacher efficacy for inclusive practice (TEIP) scale: adaptation and psychometric properties. Eur. J. Educ. Res. 2020;9(2):809–823. doi: 10.12973/eu-jer.9.2.809. [DOI] [Google Scholar]
- 23.Martins B.A., Chacon M.C.M. Escala de eficácia docente para práticas inclusivas: Validação da teacher efficacy for inclusive practices (teip) scale. Rev. Bras. Educ. Espec. 2020;26(1):1–16. doi: 10.1590/s1413-65382620000100001. [DOI] [Google Scholar]
- 24.Li C., et al. Psychometric properties of the physical educators' self-efficacy toward including students with disabilities— autism among Chinese preservice physical education teachers. Adapt Phys Activ Q. 2018;35(2):159–174. doi: 10.1123/apaq.2017-0086. [DOI] [PubMed] [Google Scholar]
- 25.Moen L., Sheridan S.M. Evaluation of the psychometric properties of the teacher efficacy for promoting partnership measure among a sample of head start educators. J. Psychoeduc. Assess. 2020;38(4):507–518. doi: 10.1177/0734282919863574. [DOI] [Google Scholar]
- 26.Cetinkaya A., Erbas A.K. Psychometric properties of the Turkish adaptation of the mathematics teacher efficacy belief instrument for in-service teachers. Span J Psychol. 2011;14(2):956–966. doi: 10.5209/rev_SJOP.2011.v14.n2.41. [DOI] [PubMed] [Google Scholar]
- 27.Avanzi L., et al. Cross-validation of the Norwegian teacher's self-efficacy scale (NTSES) Teach. Teach. Educ. 2013;31:69–78. doi: 10.1016/j.tate.2013.01.002. [DOI] [Google Scholar]
- 28.Dominguez-Lara S., Fernández-Arata M., Merino-Soto C., Navarro-Loli J.S., Calderón-De La Cruz G. G. Escala de Autoeficacia Docente: Análisis estructural e invarianza de medición en docentes peruanos de escuelas públicas. RACC. 2019;11(3):61–72. doi: 10.32348/1852.4206.v11.n3.24624. [DOI] [Google Scholar]
- 29.Sánchez-Rosas J., Dyzenchauz M., Dominguez-Lara S., Hayes A. Collective teacher self-efficacy scale for elementary school teachers. E-IJI.NET. 2022;15(1):985–1002. doi: 10.29333/iji.2022.15156a. [DOI] [Google Scholar]
- 30.Schmitz G.S., Schwarzer R. Selbstwirksamkeitserwartung von Lehrern: Längsschnittbefunde mit einem neuen Instrument. Zeitschrift für Padagogische Psychol. 2000;14(1):12–25. doi: 10.1024//1010-0652.14.1.12. [DOI] [Google Scholar]
- 31.Oginga K., et al. The culturally responsive classroom management self-efficacy scale: development and initial validation. Urban Educ. 2015:1–27. doi: 10.1177/0042085915602534. [DOI] [Google Scholar]
- 32.Clément L., et al. Validation canadienne-française de l’Échelle des déterminants du sentiment d’efficacité personnelle des enseignants (EDSEPE) Psychol. Travail Organ. 2018;24(2):126–143. doi: 10.1016/j.pto.2018.02.002. [DOI] [Google Scholar]
- 33.Chan D.W. General, collective, and domain-specific teacher self-efficacy among Chinese prospective and in-service teachers in Hong Kong. Teach. Teach. Educ. 2008;24:1057–1069. doi: 10.1016/j.tate.2007.11.010. [DOI] [Google Scholar]
- 34.Dybowski C., et al. Psychometric properties of the newly developed physician teaching self-efficacy questionnaire (PTSQ) BMC Med. Educ. 2016;16(1):1–11. doi: 10.1186/s12909-016-0764-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 35.Moslemi N., Mousavi A. A psychometric re-examination of the science teaching efficacy and beliefs instrument (STEBI) in a canadian context. Educ. Sci. 2019;9(1):17. doi: 10.3390/educsci9010017. [DOI] [Google Scholar]
- 36.Slater E., et al. The validity of the science teacher efficacy belief instrument (STEBI-B) for postgraduate, pre-service, primary teachers. Heliyon. 2021;7(9) doi: 10.1016/j.heliyon.2021.e07882. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 37.Dominguez-Lara S. Construcción de una escala de autoeficacia para la investigación: primeras evidencias de validez. RIDU. 2017;11(2):309–322. doi: 10.19083/ridu.11.514. [DOI] [Google Scholar]
- 38.Paetsch J., Franz S., Wolter I. Changes in early career teachers' technology use for teaching: the roles of teacher self-efficacy, ICT literacy, and experience during COVID-19 school closure. Teach. Teach. Educ. 2023;135 doi: 10.1016/j.tate.2023.104318. [DOI] [Google Scholar]
- 39.Mâță L., Clipa O., Tzafilkou K. The development and validation of a scale to measure university teachers' attitude towards ethical use of information technology for a sustainable education. Sustainability. 2020;12(15):6268. doi: 10.3390/su12156268. [DOI] [Google Scholar]
- 40.Tondeur J., et al. Developing a validated instrument to measure preservice teachers' ICT competencies: meeting the demands of the 21st century. Br. J. Educ. Technol. 2017;48(2):462–472. https://eric.ed.gov/?id=EJ1130843 [Google Scholar]
- 41.Bandura A. Social cognitive theory of self-regulation. Organ. Behav. Hum. Decis. Process. 1991;50(2):248–287. [Google Scholar]
- 42.Bandura A. Social cognitive theory: an agentic perspective. Psychology: The Journal of the Hellenic Psychological Society. 2020;12(3):313–333. doi: 10.12681/psy_hps.23964. [DOI] [Google Scholar]
- 43.Bandura A. In: Encyclopedia of Human Behavior. Ramachaudran V.S., editor. Academic Press; New York: 1994. Self-efficacy; pp. 71–81. [Google Scholar]
- 44.Bonsaksen T., Steigen A.M., Stea T.H., Kleppang A.L., Lien L., Leonhardt M. Negative social media-related experiences and lower general self-efficacy are associated with depressive symptoms in adolescents. Front. Public Health. 2023:10. doi: 10.3389/fpubh.2022.1037375. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 45.Luszczynska A., Gutiérrez-Doña B., Schwarzer R. General self‐efficacy in various domains of human functioning: evidence from five countries. Int. J. Psychol. 2005;40(2):80–89. doi: 10.1080/00207590444000041. [DOI] [Google Scholar]
- 46.Scholz U., Doña B.G., Sud S., Schwarzer R. Is general self-efficacy a universal construct? EJPA. 2006;18(3) doi: 10.1027//1015-5759.18.3.242. [DOI] [Google Scholar]
- 47.Woodcock S., Hitches E., Manning A. ‘The hardest part is…’: teacher self-efficacy and inclusive practice. IJEDRO. 2023;5 doi: 10.1016/j.ijedro.2023.100289. [DOI] [Google Scholar]
- 48.Dai W. An empirical study on English preservice teachers' digital competence regarding ICT self-efficacy, collegial collaboration and infrastructural support. Heliyon. 2023;9(9) doi: 10.1016/j.heliyon.2023.e19538. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 49.Jiang H., Chugh R., Zhai X., Wang K., Wang X. Longitudinal analysis of teacher self-efficacy evolution during a STEAM professional development program: a qualitative case study. Humanit Soc Sci Commun. 2024;11:1162. doi: 10.1057/s41599-024-03655-5. 2024. [DOI] [Google Scholar]
- 50.Johnson J. Teacher self-efficacy and teacher work engagement for expats at international k12 schools in China: a correlation analysis. Int J Educ Res Open. 2022;3 doi: 10.1016/j.ijedro.2022.100176. [DOI] [Google Scholar]
- 51.Finch J.E., Akhavein K., Patwardhan I., Clark C.A.C. Teachers' self-efficacy and perceptions of school climate are uniquely associated with students' externalizing and internalizing behavior problems. J. Appl. Dev. 2023;85(101512) doi: 10.1016/j.appdev.2023.101512. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 52.Kengatharan N., Gnanarajan A.H. Teacher self-efficacy and student misbehaviour: the moderating role of gender–classroom management. Int. J. Educ. Manag. 2023;37(2):507–525. doi: 10.1108/ijem-04-2022-0141. [DOI] [Google Scholar]
- 53.Lazarides R., Schiefele U. Addressing the reciprocal nature of effects in teacher motivation research: a study on relations among teacher motivation, student-reported teaching, and student enjoyment and achievement. Learn. Instr. 2024;90(101862) doi: 10.1016/j.learninstruc.2023.101862. [DOI] [Google Scholar]
- 54.Emiru E.K., Gedefaw M.T. The effect of teacher self-efficacy on learning engagement of secondary school students. Cogent Educ. 2024;11(1) doi: 10.1080/2331186x.2024.2308432. [DOI] [Google Scholar]
- 55.Jerrim J., Sims S., Oliver M. Teacher self-efficacy and pupil achievement: much ado about nothing? International evidence from TIMSS. Teach. Teach. 2022;29(2):507–525. doi: 10.1080/13540602.2022.2159365. [DOI] [Google Scholar]
- 56.Chaabane S., Chaabna K., Khawaja S., Aboughanem J., Mittal D., Mamtani R., Cheema S. Sleep disorders and associated factors among medical students in the Middle East and North Africa: a systematic review and meta-analysis. Sci. Rep. 2024;14:4656. doi: 10.1038/s41598-024-53818-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 57.Duko B., Assefa Dachew B., Newnham E., Tesfay Gebremedhin A., Tessema G., Einarsdottir K., Alati R., Pereira G. The effect of maternal prenatal tobacco smoking on offspring academic achievement: a systematic review and meta-analysis. Addict. Behav. 2024;153 doi: 10.1016/j.addbeh.2024.107985. [DOI] [PubMed] [Google Scholar]
- 58.Kocsis Á., Molnár G. Factors influencing academic performance and dropout rates in higher education. Oxf. Rev. Educ. 2024:1–19. doi: 10.1080/03054985.2024.2316616. [DOI] [Google Scholar]
- 59.Li J., Xue E., Li C., He Y. Investigating latent interactions between students' affective cognition and learning performance: meta-analysis of affective and cognitive factors. Behav. Sci. 2023;13(7):555. doi: 10.3390/bs13070555. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 60.Ozyildirim G., Karadağ E. The effect of peer bullying on academic achievement: a meta‐analysis study related to results of TIMSS and PIRLS. Psychol. Sch. 2024 doi: 10.1002/pits.23159. [DOI] [Google Scholar]
- 61.Wang F., Ni X., Zhang M., Zhang J. Educational digital inequality: a meta-analysis of the relationship between digital device use and academic performance in adolescents. Comput. Educ. 2024;213 doi: 10.1016/j.compedu.2024.105003. [DOI] [Google Scholar]
- 62.Girelli L., Alivernini F., Luicidi F., Cozzolino M., Savarese G., Sibilio M., Salvatore S. Autonomy supportive contexts, autonomous motivation, and self-efficacy predict academic adjustment of first-year university students. Front. Educ. 2018;3:1–11. doi: 10.3389/feduc.2018.00095. article 95. [DOI] [Google Scholar]
- 63.Rodari Meisner J., McKenzie J.M. Teacher perceptions of self-efficacy in teaching online during the COVID-19 pandemic. Athens J. Educ. 2023;10(1):49–66. doi: 10.30958/aje.10-1-3. [DOI] [Google Scholar]
- 64.Zeng Y., Wang Y., Li S. The relationship between teachers' information technology integration self-efficacy and TPACK: a meta-analysis. Front. Psychol. 2022;13 doi: 10.3389/fpsyg.2022.1091017. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 65.Davis F.D. Perceived usefulness, perceived ease of use and user acceptance of information technology. MIS Q. 1989;13(3):319–339. [Google Scholar]
- 66.Wang C.-H., Cardullo V., Burton M., Salisbury-Glennon J., Serafini A. Teaching online during COVID-19: teacher self-efficacy and the extended technology acceptance model. Journal of Educators Online. 2023;20(2) doi: 10.3102/1679196. [DOI] [Google Scholar]
- 67.Joo Y.J., Park S., Lim E. Factors influencing preservice teachers' intention to use technology: TPACK, teacher self-efficacy, and Technology Acceptance Model. Educ. Technol. Soc. 2018;21(3):48–59. [Google Scholar]
- 68.Natasia S.R., Wiranti Y.T., Parastika A. Acceptance analysis of NUADU as e-learning platform using the Technology Acceptance Model (TAM) approach. Procedia Comput. Sci. 2022;197:512–520. doi: 10.1016/j.procs.2021.12.168. [DOI] [Google Scholar]
- 69.Srivastava M., Kumar V.V., Mehrotra R. Examining the psychological role of self-efficacy in teachers' adaption to remote learning during the covid-19 pandemic. JRTDD. 2023;6(7):238–247. https://jrtdd.com/index.php/journal/article/view/787 [Google Scholar]
- 70.Yin H., Han J., Perron B.E. Why are Chinese university teachers (not) confident in their competence to teach? The relationships between faculty-perceived stress and self-efficacy. Int. J. Educ. Res. 2020;100 doi: 10.1016/j.ijer.2019.101529. [DOI] [Google Scholar]
- 71.Bonilla-García M.A., López Suárez A.D. Ejemplificación del proceso metodológico de la teoría fundamentada. Cinta Moebio. 2016;57:305–315. doi: 10.4067/S0717-554X2016000300006. [DOI] [Google Scholar]
- 72.Auziņa A. Teacher competences for facing challenges of globalisation in education. J. Educ. Cult. Soc. 2018;2:24–37. doi: 10.15503/JECS20182.24.37. [DOI] [Google Scholar]
- 73.Ramírez-García, et al. Las Competencias Docentes Genéricas en los Grados de Educación. Visión del Profesorado Universitario. Estud. Pedagog. 2018;44(2):259–277. doi: 10.4067/S0718-07052018000200259. [DOI] [Google Scholar]
- 74.Acevedo-Duque Á., et al. Competencias del docente en educación online en tiempo de COVID-19: Universidades Públicas de Honduras. Rev. Cien. Soc. 2020;26(2):206–224. https://www.redalyc.org/jatsRepo/280/28064146014/html/index.html [Google Scholar]
- 75.Núñez-Canal M., et al. New challenges in higher education: a study of the digital competence of educators in Covid times. Technol. Forecast. Soc. Change. 2022;174 doi: 10.1016/j.techfore.2021.121270. [DOI] [Google Scholar]
- 76.Villarroel V.A., Bruna D.V. Competencias pedagógicas que caracterizan a un docente universitario de excelencia: Un estudio de caso que incorpora la perspectiva de docentes y estudiantes. Form. Univ. 2017;10(4):75–96. doi: 10.4067/S0718-50062017000400008. [DOI] [Google Scholar]
- 77.Baartman L.K.J., Bastiaens T.J., Kirschner P.A., van der Vleuten C.P.M. Evaluating assessment quality in competence-based education: a qualitative comparison of two frameworks. Educ. Res. Rev. 2007;2(2):114–129. doi: 10.1016/j.edurev.2007.06.001. [DOI] [Google Scholar]
- 78.Usart Rodríguez M., et al. Validación de una herramienta para autoevaluar la competencia digital docente. Educ. XX1. 2021;24(1):353–373. doi: 10.5944/educXX1.27080. [DOI] [Google Scholar]
- 79.Çebi, Özdemir T.B., Reisoğlu İ., Çolak C. From digital competences to technology integration: Re-formation of pre-service teachers' knowledge and understanding. Int. J. Educ. Res. 2022;113 doi: 10.1016/j.ijer.2022.101965. [DOI] [Google Scholar]
- 80.Cabero-Almenara J., Barroso-Osuna J., Rodríguez-Gallego M., Palacios-Rodríguez A. La Competencia Digital Docente. El caso de las universidades andaluzas [Digital teaching competence: the case of Andalusian universities]. Aula Abierta. 2020;49(4):363–372. doi: 10.17811/rifie.49.4.2020.363-372. [DOI] [Google Scholar]
- 81.García-Cabrero B., Serrano E.L., Cisneros-Cohernour E., Arroyo G.C., Vigil M.H.G. Las competencias docentes en entornos virtuales: un modelo para su evaluación [Teaching competences in virtual environments: a model for their evaluation] RIED. Rev. Iberoam. Educ. Distancia. 2018;21(1):343–365. doi: 10.5944/ried.21.1.18816. [DOI] [Google Scholar]
- 82.Chávez-Ventura G., Santa-Cruz-Espinoza H., Merino-Soto C., Osorio-Guzmán M., Jaime-Salas J., Risueño A. Estudio Intercultural de una Batería Sociocognitiva de Autoeficacia Vocacional. Rev. Iberoam. 2020;1(54):131–145. doi: 10.21865/RIDEP54.1.11. [DOI] [Google Scholar]
- 83.Chávez-Ventura G., Osorio-Guzmán M., Santa-Cruz-Espinoza H., Prado-Romero C. Estructura Interna de la Batería Sociocognitiva de Autoeficacia Vocacional [Internal Structure of the Socio-Cognitive of the Vocational Self-Efficacy Battery] Rev. Iberoam. 2022;1(67):59–74. doi: 10.21865/RIDEP67.1.05. [DOI] [Google Scholar]
- 84.de la Espriella R., Gómez Restrepo C. Teoría fundamentada. Rev. Colomb. Psiquiatr. 2020;49(2):127–133. doi: 10.1016/j.rcp.2018.08.002. http://www.scielo.org.co/pdf/rcp/v49n2/0034-7450-rcp-49-02-127.pdf [DOI] [PubMed] [Google Scholar]
- 85.Cohen M.L., Kisala P.A., Dyson-Hudson T.A., Tulsky D.S. Measuring pain phenomena after spinal cord injury: development and psychometric properties of the SCI-QOL Pain Interference and Pain Behavior assessment tools. J. Spinal Cord Med. 2018;41(3):267–280. doi: 10.1080/10790268.2017.1279805. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 86.Jahromi M.F., Dehghani A. Development and validation of the social responsibility questionnaire for Iranian nurses. Nursing and Midwifery Studies. 2020;9(3):171–177. doi: 10.4103/nms.nms_40_19. [DOI] [Google Scholar]
- 87.Zarei F., Solhi M., Merghati-Khoei E., Taghdisi M.H., Shojaeizadeh D., Taket A.R., et al. Development and psychometric properties of social exclusion questionnaire for Iranian divorced women. Iran. J. Public Health. 2017;46(5):640–649. [PMC free article] [PubMed] [Google Scholar]
- 88.Pedrosa I., Suárez-Álvarez J., García-Cueto E. Evidencias sobre la Validez de Contenido: Avances Teóricos y Métodos para su Estimación [Content Validity Evidences: Theoretical Advances and Estimation Methods] Acción Psicol. 2013;10(2):3–20. doi: 10.5944/ap.10.2.11820. [DOI] [Google Scholar]
- 89.Urrutia M., et al. Métodos óptimos para determinar validez de contenido. Educ Med Super. 2014;28(3):547–558. http://scielo.sld.cu/pdf/ems/v28n3/ems14314.pdf [Google Scholar]
- 90.Horne K., Chen T., Irish M. Development of the Flexibility in Daily Life scale to measure multidimensional cognitive and behavioural flexibility in health and disease. Br. J. Clin. Psychol. 2024:1–15. doi: 10.1111/bjc.12505. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 91.Martin M., Rubin R. A new measure of cognitive flexibility. Psychol. Rep. 1995;76(2):623–626. doi: 10.2466/pr0.1995.76.2.623. [DOI] [Google Scholar]
- 92.Collins M., Warburton W.A., Bussey K., Sweller N. Factor structure and psychometric properties of the digital communication empathy scale (DCES) Int. J. Hum. Comput. Stud. 2024;183 doi: 10.1016/j.ijhcs.2023.103183. [DOI] [Google Scholar]
- 93.Grainger S.A., McKay K.T., Riches J.C., Chander R.J., Cleary R., Mather K.A., Kochan N.A., Sachdev P.S., Henry J.D. Measuring empathy across the adult lifespan: a comparison of three assessment types. Assessment. 2023;30(6):1870–1883. doi: 10.1177/10731911221127902. [DOI] [PubMed] [Google Scholar]
- 94.Ho M.Y., Liang S., Hook J.N. Development and validation of the forbearance scale. J. Pers. Assess. 2023;105(6):779–788. doi: 10.1080/00223891.2022.2153691. [DOI] [PubMed] [Google Scholar]
- 95.Matos D.A.S., Leite W.L., Brown G.T.L., Cirino S.D. An analysis of the factorial structure of the Teacher Communication Behavior Questionnaire with Brazilian high school science students. Psicol. Teor. Pesqui. 2014;30(2):223–234. doi: 10.1590/S0102-37722014000200012. [DOI] [Google Scholar]
- 96.Amador-Campos J.A., et al. Mentoring and research self-efficacy of doctoral students: a psychometric approach. Educ. Sci. 2023;13(4) doi: 10.3390/educsci13040358. [DOI] [Google Scholar]
- 97.Aldalur I., Perez A. Gamification and discovery learning: motivating and involving students in the learning process. Heliyon. 2023;9(1) doi: 10.1016/j.heliyon.2023.e13135. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 98.Murillo-Zamorano L.R., et al. Gamification in higher education: the ECOn+ star battles. Comput. Educ. 2023;194 doi: 10.1016/j.compedu.2022.104699. [DOI] [Google Scholar]
- 99.Parra Castrillón E. Análisis sobre comportamientos éticos en la educación virtual. Revista Interamericana de Investigación Educación y Pedagogía RIIEP. 2021;14(2):113–140. doi: 10.15332/25005421.6059. [DOI] [Google Scholar]
- 100.Cruz Rodriguez E.D.C. Importancia del manejo de competencias tecnológicas en las prácticas docentes de la Universidad Nacional Experimental de la Seguridad (UNES) Rev. Educ. 2018;43(1):196–218. doi: 10.15517/revedu.v43i1.27120. [DOI] [Google Scholar]
- 101.Prieto-Andreu, Gómez-Escalonilla-Torrijos J.D., Said-Hung E. Gamificación, motivación y rendimiento en educación: Una revisión sistemática. Rev. Electrón. Educ. 2022;26(1):1–23. doi: 10.15359/ree.26-1.14. [DOI] [Google Scholar]
- 102.Fortuna M., et al. Does gamification mediate the relationship between digital social capital and student Performance? A survey-based study in Spain. Int. J. Manag. Educ. 2023;21(3) doi: 10.1016/j.ijme.2023.100846. [DOI] [Google Scholar]
- 103.Addimando L. Distance learning in pandemic age: lessons from a (no longer) emergency. Int. J. Environ. Res. Public Health. 2022;19 doi: 10.3390/ijerph192316302. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 104.De la Rosa Rodríguez P.I. Aplicaciones educativas digitales y la falta de seguridad de los datos personales de sus usuarios. RIDE. Revista Iberoamericana para la Investigación y el Desarrollo Educativo. 2021;12(23):e02. doi: 10.23913/ride.v12i23.980. [DOI] [Google Scholar]
- 105.Zapata-Ros M. Gestión del aprendizaje y web social en la Educación Superior en línea [Learning management and the social web in online higher education]. Revista de Educación a Distancia. 2018;57(7):1–34. doi: 10.6018/red/57/7. [DOI] [Google Scholar]
- 106.Merellano-Navarro E., et al. Resignificando el saber pedagógico: una mirada desde la práctica docente. Educ. Pesqui. 2019;45 doi: 10.1590/S1678-4634201945192146. [DOI] [Google Scholar]
- 107.Diaz Carriazo, et al. Planificación educativa como herramienta fundamental para una educación con calidad. Utopía Prax. Latinoam. 2020;25(3):87–95. doi: 10.5281/zenodo.3907048. [DOI] [Google Scholar]
- 108.Komariah, et al. Developing an educational and cognitive competence model for future teacher's for independent work – the case of Indonesia. Int. J. Instr. 2023;16(3):149–170. doi: 10.29333/iji.2023.1639a. [DOI] [Google Scholar]
- 109.Casasola Rivera W. El papel de la didáctica en los procesos de enseñanza y aprendizaje universitarios. Rev. Comun. 2020;29(1):38–51. doi: 10.18845/rc.v29i1-2020.5258. [DOI] [Google Scholar]
- 110.Mâţă L., Suciu A.I. Curricular innovative model focused on developing pedagogical competences of teachers of language and communication. Procedia Soc. Behav. Sci. 2011;12:274–282. doi: 10.1016/j.sbspro.2011.02.036. [DOI] [Google Scholar]
- 111.Hernández-Infante R.C., Infante-Miranda M.E. La clase en la educación superior, forma organizativa esencial en el proceso de enseñanza-aprendizaje. Educación y Educadores. 2017;20(1):27–40. doi: 10.5294/edu.2017.20.1.2. [DOI] [Google Scholar]
- 112.Alonso Martin P. El perfil del buen docente universitario desde una perspectiva del alumnado. Educ. Pesqui. 2019;45 doi: 10.1590/S1678-4634201945196029. [DOI] [Google Scholar]
- 113.Løkse M., Låg T., Solberg M., Andreassen H.N., Stenersen M. Teaching Information Literacy in Higher Education: Effective Teaching and Active Learning. Elsevier Ltd.; 2017. pp. 81–145. [DOI] [Google Scholar]
- 114.Sireci S., Faulkner-Bond M. Validity evidence based on test content. Psicothema. 2014;26(1):100–107. doi: 10.7334/psicothema2013.256. [DOI] [PubMed] [Google Scholar]
- 115.Muñiz J., Fonseca-Pedrero E. Diez pasos para la construcción de un test. Psicothema. 2019;31(1):7–16. doi: 10.7334/psicothema2018.291. [DOI] [PubMed] [Google Scholar]
- 116.Merino-Soto C. Percepción de la claridad de los ítems: Comparación del juicio de estudiantes y jueces-expertos [Perceived item clarity: comparing the judgments of students and expert judges]. Revista Latinoamericana de Ciencias Sociales, Niñez y Juventud. 2016;14(2):1469–1477. doi: 10.11600/1692715x.14239120615. [DOI] [Google Scholar]
- 117.Escurra Mayaute L.M. Cuantificación de la validez de contenido por criterio de jueces. Revista de Psicología. 1988;6(1–2):103–111. doi: 10.18800/psico.198801-02.008. [DOI] [Google Scholar]
- 118.Penfield R., Giacobbi P. Applying a score confidence interval to Aiken's item content-relevance index. Meas. Phys. Educ. Exerc. Sci. 2004;8(4):213–225. doi: 10.1207/s15327841mpee0804_3. [DOI] [Google Scholar]
- 119.Merino-Soto C., Fernández-Arata M. Ítem único de burnout en estudiantes de educación superior: Estudio de validez de contenido [Single burnout item in higher-education students: a content validity study]. Educ. Med. 2017;18(3):195–198. doi: 10.1016/j.edumed.2016.06.019. [DOI] [Google Scholar]
- 120.Ato M., et al. Un sistema de clasificación de los diseños de investigación en Psicología. An. Psicol. 2013;29(3):1038–1059. doi: 10.6018/analesps.29.3.178511. [DOI] [Google Scholar]
- 121.Muthén L.K., Muthén B.O. 7 ed. Muthén & Muthén; Los Angeles, CA: 1998-2015. Mplus User's Guide. Los Angeles, CA. [Google Scholar]
- 122.Ferrando J., Anguiano-Carrasco C. Factor analysis as a research technique in psychology. Papeles del Psicólogo. 2010;31(1):18–33. [Google Scholar]
- 123.Mardia K. Measures of multivariate skewness and kurtosis with applications. Biometrika. 1970;57:519–530. doi: 10.2307/2334770. [DOI] [Google Scholar]
- 124.Ayán M.N.R., Díaz M.Á.R. Atenuación de la asimetría y de la curtosis de las puntuaciones observadas mediante transformaciones de variables: Incidencia sobre la estructura factorial [The reduction of skewness and kurtosis of observed variables by data transformation: Effect on factor structure] Psicologica. 2008;29(2):205–227. [Google Scholar]
- 125.McDonald R.P., Ho M.-H.R. Principles and practice in reporting structural equation analyses. Psychol. Methods. 2002;7(1):64–82. doi: 10.1037/1082-989x.7.1.64. [DOI] [PubMed] [Google Scholar]
- 126.Bentler P.M. Comparative fit indexes in structural models. Psychol. Bull. 1990;107(2):238–246. doi: 10.1037/0033-2909.107.2.238. [DOI] [PubMed] [Google Scholar]
- 127.DiStefano C., et al. Examination of the weighted root mean square residual: evidence for trustworthiness? Struct. Equ. Modeling. 2018;25(3):453–466. doi: 10.1080/10705511.2017.1390394. [DOI] [Google Scholar]
- 128.Moral de la Rubia J. Revisión de los criterios para validez convergente estimada a través de la Varianza Media Extraída. Psychologia. 2019;13(2):25–41. doi: 10.21500/19002386.4119. [DOI] [Google Scholar]
- 129.Dominguez-Lara S. Propuesta de puntos de corte para cargas factoriales: una perspectiva de fiabilidad de constructo. Enferm. clín. 2018;28(6):401–402. doi: 10.1016/j.enfcli.2018.06.002. [DOI] [PubMed] [Google Scholar]
- 130.Henseler J., Ringle C.M., Sarstedt M. A new criterion for assessing discriminant validity in variance-based structural equation modeling. J. Acad. Market. Sci. 2015;43(1):115–135. doi: 10.1007/s11747-014-0403-8. [DOI] [Google Scholar]
- 131.Fornell C., Larcker D.F. Evaluating structural equation models with unobservable variables and measurement error. J. Market. Res. 1981;18(1):39–50. doi: 10.1177/002224378101800104. [DOI] [Google Scholar]
- 132.Voorhees M., Brady M.K., Calantone R., Ramirez E. Discriminant validity testing in marketing: an analysis, causes for concern, and proposed remedies. J. Acad. Mark. Sci. 2015;44(1):119–134. doi: 10.1007/s11747-015-0455-4. [DOI] [Google Scholar]
- 133.Ponterotto J., Charter R. Statistical extensions of Ponterotto and Ruckdeschel's (2007) reliability matrix for estimating the adequacy of internal consistency coefficients. Percept Motor Skills. 2009;108(3):878–886. doi: 10.2466/PMS.108.3.878-886. [DOI] [PubMed] [Google Scholar]
- 134.Hunsley J., Mash E.J. In: A Guide to Assessments that Work. Hunsley J., Mash E.J., editors. Oxford University Press; United Kingdom: 2008. Developing criteria for evidence-based assessment: an introduction to assessments that work; pp. 3–14. [DOI] [Google Scholar]
- 135.Bolaños Jurado J.P. La gamificación como herramienta para la enseñanza y aprendizaje. Horizontes. Revista de Investigación en Ciencias de la Educación. 2023;7(30):1846–1853. doi: 10.33996/revistahorizontes.v7i30.633. [DOI] [Google Scholar]
- 136.Creswell J.W., Creswell J.D. fifth ed. Sage; 2018. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches. [Google Scholar]
- 137.Ghiyasvandian S., Matourypour P. Dimensioning the instrumentation: exploratory or confirmatory factor analysis? Rev. Bras. Enferm. 2017;70:233–234. doi: 10.1590/0034-7167-2016-0183. [DOI] [PubMed] [Google Scholar]
- 138.Grimaccia E., Naccarato A. Confirmatory factor analysis to validate a new measure of food insecurity: perceived and actual constructs. Qual. Quant. 2020;54:1211–1232. doi: 10.1007/s11135-020-00982-y. [DOI] [Google Scholar]
- 139.Reise S.P., Bonifay W.E., Haviland M.G. Scoring and modeling psychological measures in the presence of multidimensionality. J. Pers. Assess. 2013;95:129–140. doi: 10.1080/00223891.2012.725437. [DOI] [PubMed] [Google Scholar]
- 140.Dominguez-Lara S., Merino-Soto C. Cognitive Emotional Regulation Questionnaire-18 en universitarios: Evidencias de validez convergente y discriminante. Rev. Iberoam. 2018;2(47):171–184. doi: 10.21865/RIDEP47.2.12. [DOI] [Google Scholar]
- 141.Lee Y.-J., Davis R. Preservice teachers' self-efficacy through hybrid field practicum in a Korean teacher education program. Int. J. Instr. 2023;16(4):349–366. doi: 10.29333/iji.2023.16421a. [DOI] [Google Scholar]
Associated Data
Data Availability Statement
The database is available on the Open Science Framework with registration number p468x: [https://osf.io/p468x/?view_only=75ceefc49c16481ca7e0fa2f4669f49a].
