Education and Information Technologies. 2020 Nov 17;26(3):2879–2896. doi: 10.1007/s10639-020-10387-x

Pre-service teacher’s self-perception of digital literacy: The case of Israel

Yehuda Peled 1,
PMCID: PMC7669487  PMID: 33223926

Abstract

Teachers meet society’s literacy needs: they help students acquire the essential skills and competencies required for successful social integration. Hence the need to identify teachers’ digital readiness. The purpose of this study is to assess the level of digital literacies and the digital readiness of students majoring in education. The research method includes a questionnaire comprising 54 items. The sample consists of 1265 students. The results show that more than half of the participants report an overall high level of literacy in all areas. Their sense of readiness for teamwork and their ethical readiness are high. Nonetheless, a low sense of readiness is found for basic and advanced order readiness. The practical implications of these findings are crucial, as they can assist faculty and educational policymakers in identifying the strengths and weaknesses of students’ digital literacies.

Keywords: Digital literacies, Digital readiness, Students majoring in education, Higher education

Introduction

Today’s information and communications technology (ICT) systems constitute part of our daily life and workplaces (Bresnahan and Yin 2017). One’s effective involvement in society depends on one’s digital skills, which are correlated with one’s educational level (Peromingo and Pieterson 2018). Both education and workplace requirements entail applicable technological knowledge. Workers are constantly required to use relevant technology and update their digital skills (Peromingo and Pieterson 2018). Digital competency brings changes and challenges to educators, to their required skills, forms of learning, and educational environment. As facilitators and mediators of knowledge and skills, their duty is not only to be competent in transferring subject knowledge, but also to prepare their students for twenty-first century competencies (Madalińska-Michalak et al. 2018): for digital literacy and readiness (Godbey 2018).

The research literature on Digital Literacy (DL) is vast. DL is defined as ‘the confident, critical and creative use of ICT to achieve goals related to work, employability, learning, leisure, inclusion and/or participation in society’ (Ala-Mutka 2011). As a recent study explains, twenty-first century educators persistently and increasingly stress the importance of ICT literacy (a synthesis of information literacy [information], internet literacy [communication], and computer literacy [technology]) and inquire into how it can be formally and informally acquired to facilitate students’ effective integration into today’s highly technology-dependent society (Lau and Yuen 2014). This notwithstanding, there is a research gap concerning pre-service teachers’ perception and use of digital literacy, and their subsequent readiness to include digital environments in the transfer of subject knowledge.

Filling this research gap is crucial, as it can have direct consequences for related policy approaches and subsequent measurements. In fact, some researchers hold that being digitally literate is crucial for acquiring other key competencies such as language, mathematics, learning to learn, and cultural awareness, all of which ensure that modern citizens participate actively in society and the economy (Ala-Mutka 2011).

This study fills the above-mentioned gap. It presents a new digital literacy scale, the Seven Domains of Digital Literacy (SDDL), and discusses the digital readiness of pre-service, undergraduate students to include digital environments in their professional duties. An additional objective of this study is to contribute to the research literature, both theoretically and practically.

The present study’s model (SDDL) and its conclusions contribute to existing theory and practice in this domain. It is a tool which can help determine how teachers should be educated in undergraduate programs. Assessing the digital literacy levels and readiness of undergraduates beforehand can help instructors improve their educational curricula.

Theoretical framework

Defining digital literacy

The scholarly research on educationally contextualized literacies develops constantly (Oliveira et al. 2019), as literacies’ functions and forms are determined by the constant change of society and its technology (Leu et al. 2017). The old meaning of literacy, i.e., the ability to read and write to meet society’s standards and expectations (McArthur et al. 2018), has become obsolete. The state-of-the-art literature understands the general concept of literacy as the knowledge and skills needed for contemporary socio-cultural interactions, which comprise both digital tools (e.g., touch-screen tablets) and non-digital tools (e.g., paper books) (Leu et al. 2017). In other words, being digitally literate still presupposes the old literate skills associated with reading, e.g., understanding a printed text (McArthur et al. 2018). The different information and communication technologies (ICT) have added another layer to the twenty-first century literacy requirements, as these are understood not only as a set of needed skills, but also as a set of technologically mediated practices within society (McArthur et al. 2018).

Thus, digital literacy, a term which emerged in the 1990s and was popularized by Paul Gilster (1997) (McArthur et al. 2018), refers, on the one hand, to a set of skills, attitudes and knowledge needed to access digital information effectively, efficiently, and ethically (Julien 2018). On the other hand, it stresses the digital tools available to communicate with others, to create meaning, and to evaluate digital content (Neumann et al. 2017). Nonetheless, researchers tend to disagree as to which digital skills are most important for commanding today’s required proficiency. Some educational researchers characterize digital literacy by categorizing its skills into information access, online participation, computer ability, search engine skills, and the skills required to evaluate found information (McArthur et al. 2018). Others divide the digital skills into operational, mobile, navigation, social, and creative domains (Peromingo and Pieterson 2018). The definition of digital literacy employed for the purposes of this study is: ‘the confident, critical and creative use of ICT to achieve goals related to work, employability, learning, leisure, inclusion and/or participation in society’ (Ala-Mutka 2011).

Defining digital readiness

ICT readiness describes people’s preparedness for using the digital environment, primarily for learning and studying purposes (Becker 2018). It involves the self-perception of technologically related skills, attitudes, competencies and knowledge intended to meet the expectations of specific contexts (Hong and Kim 2018). In other words, it encompasses active participation, the application of digital media, and the overcoming of old studying and learning patterns.

Teachers’ professional development in the digital age

Teachers play a key role in students’ achievements in the use of technology, and technological competence constitutes an essential requisite for effective teaching (Drossel and Eickelmann 2017; Instefjord and Munthe 2016). Urged by educational reforms, educators are under constant pressure to improve, innovate, and display higher skills before their students (Priestley 2011), including the use of technology in the teaching context (Gudmundsdottir and Hatlevik 2018). Educators around the world and in Israel are trying to adapt their educational systems to the changes that characterize current societies (Tsybulsky and Levin 2017; Sjöberg 2018). Accordingly, the constant search for effective means of achieving teachers’ ongoing professional development has become a global concern (Bautista and Ortega-Ruiz 2017). Scholars and policymakers increasingly focus on identifying and cultivating teachers who can act as leaders of their own self-learning as well as educators of others, with the aim of leveraging their abilities to guide the learning processes of their colleagues (Katzenmeyer and Moller 2009).

Teachers’ professional development in Israel

Israel is a country of immigrants. Since the Second World War, the Jewish population has increased from 716,000 to 7 million (Israel Central Bureau of Statistics 2017); in addition, there are approximately 2 million Arabic speakers. The influx of immigrants continues to this day. The mother tongue of 50% of the Israeli population is not Hebrew (the official language). This has implications for those who attend tertiary education, as studies are usually conducted in Hebrew. Part of the online research is conducted in English and in other languages, thus suiting student preferences. The Seven Domains of Digital Literacy (SDDL), the measuring tool of this study, has been tested in Israel and in several other countries, such as the USA (Shannon 2017), Japan, the Philippines, Korea, South Africa, Croatia, Indonesia, Kenya and Qatar (Peled, unpublished work), with similar results. Hence, it can be concluded that language does not play an influential role here.

Israel’s Ministry of Education has launched a national computerization program for adapting the education system to the twenty-first century (Ministry of Education 2014). The program promotes ICT integration in schools, with the aim of turning them into computerized organizations. It stresses the implementation of innovative pedagogies as well as the development of DL. Israeli colleges of education train their student-teachers to teach twenty-first century skills (Naifeld and Simon 2017). Thus, media and digital literacy education is fundamentally implicated in the practice of Israeli K-12 education (Alt and Raichel 2018). Moreover, as the research literature makes clear, instructing teacher trainees to teach digital skills is a challenge. For example, Davidson and Glassner (2016) inquire how teachers can be trained to advance life competencies and skills. Shamir-Inbal and Blau (2016) report that course tasks intended to develop digital literacy skills do not help students develop them. In this context, the results reported here point to a consensus among learners concerning the added value of collaboration in learning processes and outcomes. According to our research findings, digital platforms support successful collaboration, which demands further development of socio-emotional thinking skills. ICT contexts provide a platform for sharing information, thoughts and comments concerning learning outcomes created by peers. They also constitute a forum for writing texts which serve as extensions of pre-existing course materials. This suggests that there is a need to acquire new social norms concerning online interactions.

Although some claim that Israeli teachers have a low level of digital literacy (Aram and Sverdlov 2017), there are reports of continuous change, attested in Israeli schools by teachers’ growing understanding that their role extends beyond pure knowledge transfer to the implementation of additional skills. This is, however, a slow and tedious process (Blau et al. 2016; Redmond and Peled 2018).

Considering the above, we present a conceptual framework which assesses pre-service educators’ digital readiness and digital literacy, as this is crucial for increasing educators’ awareness and digital competences and for facilitating accurate digital literacy for future generations. Our study is based on the Seven Domains of Digital Literacy (SDDL) model, developed and validated by Kurtz and Peled (2016a), which enables the identification of levels of digital readiness and competence.

The seven domains of digital literacy

The seven domains of digital literacy (SDDL), which were assembled and tested by Kurtz and Peled (2016b), are: information collection, information evaluation, information management, information processing, teamwork, integrity awareness, and social responsibility. These domains represent the basis for one’s ability and preparedness to manage complex digital environments (Horrigan 2016). A short description of the seven DLDs follows:

Information collection is the digital skill of gathering and locating information effectively and efficiently in an electronic context. It is the ability to recognize information needs and to access, understand and use information by employing the Internet, professional organization databases and search engines (Catts and Lau 2008; Nelson et al. 2011; Mioduser et al. 2008; Gilster 1997; Lau and Yuen 2014; Ala-Mutka 2011).

Information evaluation stands for the attitude towards the retrieved information, which determines its worthiness. It is the ability to evaluate the quality, reliability, relevance, timeliness, completeness, credibility, usefulness, and efficiency of digital resources (Eshet-Alkali and Amichai-Hamburger 2004; Brouwer 1996; Jenkins 2009; Lau and Yuen 2014; Nelson et al. 2011).

Information management denotes data organization and storage for later fruitful use. It is the ability to save, retrieve and tag digital information, including knowledge about copyright and plagiarism issues (Dudeney et al. 2014; Nelson et al. 2011; Mioduser et al. 2008). More specifically, it represents the ability to protect personal data and information from threats such as unauthorized access, destruction, identity theft, impersonation, unauthorized alteration of data, or fictitious creation (Lau and Yuen 2014; Nelson et al. 2011).

Information processing relates to the subsequent preparation and arrangement of the information for use in its format: text, sound, image, etc. It is the ability to use ICT to design or create new information from information already acquired (Lau and Yuen 2014).

Teamwork refers to the work done by several peers in the process of learning, while sharing information, communicating and participating in given tasks, with each party learning, collaborating, and creating a single joint item. Stated differently, it is the ability to work with others (instructor and peers) toward a common intended learning goal through discourse, collaboration, cooperation, RBL and PBL (Jung and Latchem 2011; Harasim 2012; Panitz 1999; Jenkins 2009; Nelson et al. 2011).

Integrity awareness relates to the ethical use of gathered information; it involves integrity, honesty and fairness in searching for and collecting information, as well as in creating new knowledge based on it.

Social responsibility refers to the quality of being a moral and reliable person, involving proper behaviour in the digital context. In other words, it represents understanding the social and ethical implications and consequences of the use of digital resources.

As these seven domains have been shown to represent the various facets of digital literacy, their levels point to one’s digital readiness.

Purpose of the study

The research was designed to address the gap in the research literature by implementing an empirical measurement of student teachers’ perceptions of their level of digital literacy and digital readiness. The contribution of this study is the presentation of the findings of a survey that examines the digital literacy and digital readiness of a representative sample of students from five colleges in Israel. The choice of pre-service teachers, i.e., undergraduate students who are in their basic training stage, is important in light of systemic reforms in Israel and worldwide which promote the adaptation of educational systems to the digital age (Tsybulsky and Levin 2017). In Israel, studies have been conducted to assess the digital skills of various population groups (Eshet-Alkalai and Chajut 2010). Most of them examine specific issues related to the ethnic digital divide in the country (Lissitsa and On 2014), the use of the Internet by different ethnic groups (Lissitsa and Chachashvili-Bolotin 2014), or the development of a DL measuring tool (Kurtz and Peled 2016a). Nonetheless, none has examined the broad aspect of DL as the current research does.

The purpose of this research is to gauge the DL and DR of education students, both undergraduate and graduate, studying to become teachers in Israeli colleges. More specifically, this research employs a valid and reliable measure of digital literacy, the Self-Report Digital Literacies (SRDL) questionnaire, which is based on previous research by Kurtz and Peled (2016a), who, as noted earlier, identified seven digital literacy domains (SDDLs).

Objective and research questions

This study has several objectives. It investigates pre-service teachers’ self-perception of ICT (measuring students’ self-perception of the SDDLs) and its subsequent integration into their professional practices. It investigates pre-service teachers’ ICT readiness and compares it to their actual ICT knowledge. It presents a structural model which predicts pre-service teachers’ preparedness to teach while embracing digital literacy practices.

Research questions

  1. What is the perceived level of digital literacy of students?

  2. Are there differences in digital literacy types and in students’ digital readiness?

  3. Do background characteristics predict the level of digital readiness?

Method

Survey instrument

The questionnaire used in this study, the Self-Report Digital Literacies (SRDL), consists of 54 statements (see Appendix 1), which are divided into seven domains (see Table 1). The sample comprises 1265 students.

Table 1.

Origin of Domains and Reliability of Questionnaire

Domain Literacy Source Number of items Cronbach’s alpha
Information Collection The ability to Recognize information needs, access, understand and use effectively and efficiently information using Internet, professional organization databases and search engines (Gilster 1997; Lau and Yuen 2014; Catts and Lau 2008; Nelson et al. 2011; Mioduser et al. 2008; Ala-Mutka 2011) 12 0.94
Information Evaluation The ability to evaluate the quality, reliability, relevance, timeliness, completeness, credibility, usefulness, and efficiency of digital resources (Eshet-Alkali and Amichai-Hamburger 2004; Lau and Yuen 2014; Nelson et al. 2011; Brouwer 1996; Jenkins 2009) 5 0.93
Information Management The ability to save, retrieve and to tag digital information while including knowledge about copyright and plagiarism issues (Dudeney et al. 2014; Nelson et al. 2011; Mioduser et al. 2008) 3 0.75
Information Processing The ability to use ICT to design or create new information from information already acquired (Lau and Yuen 2014) 8 0.91
Teamwork The ability to work with others (instructor and peers) toward a common intended learning goal through discourse, collaboration, cooperation, RBL and PBL (Jung and Latchem 2011; Harasim 2012; Jenkins 2009; Nelson et al. 2011; Panitz 1999) 8 0.91
Integrity Awareness Maintain digital integrity & ethical standards 15 0.93
Social Responsibility Understanding the social and ethical implications/ consequences of the use of digital resources. (Nelson et al. 2011) 3 0.88

Development of the research instrument

The research instrument, the Self-Report Digital Literacies (SRDL) questionnaire, was initially developed by Kurtz and Peled (2016b) in a two-phase process. Phase 1: based on an exhaustive literature review, a list of DLDs and PSs was compiled (see Table 1) and distributed for pre-validation review and comments to six expert researchers in the educational technology field and to seven graduate students of ICT studying at the College of Academic Studies in Israel. The experts and the students were asked to provide a critical review of the DLDs and PSs. More specifically, they were requested to respond to open-ended questions concerning the fitness, appropriateness, missing items, revision, rephrasing, and clarity of the items. Their comments were analyzed by the research team to determine what revisions of the DLDs (if any) were to be included in the survey. Based on respondent input, a final set of seven DLDs and sixty-four Likert-type items (rated from 1 to 5) was listed. This 64-item survey was administered to 1889 students at the Western Galilee College in Israel. The analysis of the data showed that 10 items had low compatibility and were accordingly excluded. Phase 2: the remaining 54 PSs were retested for reliability, showing relatively high Cronbach’s alpha values, as can be seen in Table 1. The current report relates to phase 2 of the research.
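To make the reliability analysis above concrete, the following is a minimal sketch (not the authors’ original analysis code) of how a Cronbach’s alpha value such as those in Table 1 could be computed for the items of a single domain. It assumes a `responses` matrix of 1-5 Likert ratings with one row per respondent and one column per item; the simulated data are purely illustrative.

```python
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) matrix of Likert ratings."""
    k = responses.shape[1]                         # number of items in the domain
    item_vars = responses.var(axis=0, ddof=1)      # per-item variances
    total_var = responses.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative call with simulated (random, hence low-alpha) data:
# 1265 respondents answering a hypothetical 12-item domain on a 1-5 scale.
rng = np.random.default_rng(0)
fake_domain = rng.integers(1, 6, size=(1265, 12)).astype(float)
print(round(cronbach_alpha(fake_domain), 2))
```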

The seven domains (and the number of statements pertaining to each domain in parentheses) are:

(a) Information collection (12), which refers to questions such as how to define the objective of a search, how to search effectively, and how to distinguish between different types of searches, sources and information; (b) information evaluation (5), which includes questions on judging the information gathered and assessing its credibility; (c) information management (3), which involves questions on personal storage for later retrieval; (d) information processing (8), which concerns assessing, interpreting, analysing and synthesizing information from multiple sources for later use; (e) teamwork (8), which addresses the participation of different peers in a study task; (f) integrity awareness (15), which asks about the ethical, moral and social consequences of the use, or misuse, of digital information; and (g) social responsibility (3), which gathers information on proper behaviour in the social digital environment. All items are rated on a 5-point Likert scale from 1 (not at all) to 5 (to a very large extent).

A further evaluation of students’ digital readiness is based on Horrigan’s (2016) work. For the purpose of this study, we categorize the original seven domains of the digital literacy questionnaire (Kurtz and Peled 2016a) into four types of digital readiness: (a) basic order readiness (information collection, information processing); (b) advanced order readiness (information management, information evaluation); (c) preparedness for teamwork (teamwork); and (d) ethical readiness (integrity awareness, social responsibility). The background characteristics of the participants are examined using the following questions: (a) school of study; (b) degree; (c) gender; (d) age; and (e) sense of control of Internet technologies.
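The grouping of the seven domains into four readiness types can be expressed compactly in code. The sketch below is illustrative only: the paper defines which domains belong to which readiness type, but the aggregation rule is an assumption here (each readiness score is taken as the mean of its constituent domain scores, with domain scores on the 1-5 Likert scale).

```python
from statistics import mean

# Mapping of readiness types to SDDL domains, following Horrigan (2016) as used above.
READINESS_TYPES = {
    "basic":    ["information_collection", "information_processing"],
    "advanced": ["information_management", "information_evaluation"],
    "teamwork": ["teamwork"],
    "ethical":  ["integrity_awareness", "social_responsibility"],
}

def readiness_scores(domain_scores: dict) -> dict:
    """Collapse seven SDDL domain scores into four readiness scores (assumed: simple means)."""
    return {rtype: round(mean(domain_scores[d] for d in domains), 2)
            for rtype, domains in READINESS_TYPES.items()}

# Hypothetical respondent whose domain scores equal the sample means reported in Table 2.
respondent = {
    "information_collection": 4.06, "information_processing": 4.04,
    "information_management": 3.88, "information_evaluation": 3.88,
    "teamwork": 4.22, "integrity_awareness": 4.24, "social_responsibility": 4.61,
}
print(readiness_scores(respondent))
```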

Procedure/information collection and participants

We used an online questionnaire to anonymously collect information from five teachers’ colleges/colleges of education in the north, centre, and south of Israel. Recruitment was done with the assistance of the institutions’ managements, which sent the questionnaire via email to their students. A total of 1265 students filled out the questionnaire: 481 students (38.0%) belonged to college A, 375 (29.6%) to college B, 165 (13%) to college C, 133 (10.5%) to college D, and 90 (7.1%) to college E; 21 students (1.8%) did not mark their college. The response rate was 37.2% in college A, 75% in college B, 16.5% in college C, 4% in college D, and 16.6% in college E. Of the respondents, 57.5% were undergraduate students and 38.5% were graduate students. Most of the respondents (79%) were women. The average age was 33.4 (SD = 10.4; M = 32).

Results

The analysis of students’ responses to the Self-Report Digital Literacies (SRDL) questionnaire shows the following findings. Concerning (a) information collection (12), the results show that most respondents feel confident in their ability to collect and retrieve digital information. More than 70% answered that they know how to collect information, know how to search effectively, and can define the objective of a search. Furthermore, only one third of the respondents expressed a lack of knowledge in identifying file types and defining different types of digital environments (for example blogs, pictures, video clips). Overall, these findings indicate that most participants possess basic information research and retrieval skills.

Concerning (b) information evaluation (5), most of the respondents self-reported a positive ability to validate online information: 77% reported being able to judge the retrieved information and more than 60% reported being able to assess the accuracy of information. Only 20% admitted being unable to determine the specific information required for a specific task. This last finding is disturbing, as it means that some students perform learning tasks without knowing what is expected of them in terms of allocating relevant learning information.

As to (c) information management (3), the findings show that most of the participants know how to manage their collected data, although only 65% reported being able to retrieve stored information. Overall, these findings indicate that most participants identify themselves as digitally literate in the information management domain.

Undergraduate students’ self-reports on (d) information processing (8) show that only 67% can analyse data from multiple sources and synthesize it, while 24% reported a lower ability in this respect and 9% reported no such ability at all.

Answers on (e) teamwork (8) show positive preparedness for joint tasks (84% of respondents), and 82% of the participants reported readiness to share their thoughts and insights with their peers. On the other hand, 46% claim that they prefer to work independently and 30% say they do not like to work with peers on a joint task.

Most of the respondents reported (f) integrity awareness (15). Most of the participants were aware of copyright matters and indicated that they would not misuse retrieved information. On the other hand, almost half of the respondents reported that they do not cite their sources or are not aware of the Creative Commons concept, and 30% admitted having downloaded many music or movie files illegally.

Concerning (g) social responsibility (3), 88% of the respondents acknowledged understanding the dangers of the digital environment (for example, cyberbullying). Furthermore, more than 70% of the participants understand the social and ethical consequences of their online activities and follow the rules of discourse and proper behaviour in social networks. Yet, despite this high social awareness, a relatively high percentage would not take any action when coming across an inappropriate dialogue online: they would neither report it (29%) nor comment on it (21%).

Thus, our SDDL model addresses the present study’s research questions. The perceived level of digital literacy among undergraduate students (RQ1) is high (see Table 1). This notwithstanding, only half of the respondents reported high levels in information evaluation and information processing (see Table 2).

Table 2.

Descriptive statistics, Reliability and Pearson Correlations between Students’ Digital Literacy Types (N = 1265)*

Digital Literacy Type        Average (SD)   Cronbach’s Alpha    1     2     3     4     5     6     7
1. Information Processing    4.04 (.73)     .94
2. Information Evaluation    3.88 (.78)     .88                 .76
3. Information Collection    4.06 (.90)     .75                 .76   .76
4. Information Management    3.88 (.79)     .91                 .52   .48   .50
5. Teamwork                  4.22 (.73)     .91                 .51   .46   .50   .41
6. Integrity Awareness       4.24 (.75)     .93                 .57   .52   .56   .45   .58
7. Social Responsibility     4.61 (.68)     .89                 .40   .39   .43   .41   .53   .66

* Correlations’ significance: p < .001 (two-tailed). ** Likert scale from 1 (not at all) to 5 (to a very large extent)

The nature of the relationships between digital literacy types (RQ2) was examined using Pearson correlation coefficients. Positive, moderate and significant relationships were found between the types of literacy (Table 2). A strong, positive correlation was also found between the three literacies that complement students’ online behaviour: collecting, evaluating, and processing digital information.
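As an illustration of this analysis, the sketch below (not the original analysis script) computes a Pearson correlation matrix of the kind shown in Table 2, assuming per-respondent domain scores are held in a pandas DataFrame with one column per literacy type; the data here are simulated.

```python
import numpy as np
import pandas as pd

# Simulated per-respondent domain scores (1-5 scale) for three of the literacy types.
rng = np.random.default_rng(1)
scores = pd.DataFrame(
    rng.integers(1, 6, size=(1265, 3)).astype(float),
    columns=["collection", "evaluation", "processing"],
)

# Pairwise Pearson correlations between the domain scores (compare with Table 2).
print(scores.corr(method="pearson").round(2))
```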

Concerning students’ background characteristics (RQ3) as ICT readiness predictors, the analysis shows that only one variable, the degree studied for, was significant, explaining 19% of the variance in the overall level of readiness. T-tests for independent samples indicated that there was no significant difference between undergraduate and graduate students in basic and advanced readiness. However, graduate students reported higher readiness for teamwork (F(1,170) = −3.527, p < .005) and higher ethical readiness (F(1,180) = −6.477, p < .005) compared to undergraduate students.
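The group comparison described above can be sketched as follows. This is an assumption-laden illustration, not the authors’ code: the group sizes roughly mirror the reported shares of undergraduate and graduate respondents, the scores are simulated, and Welch’s unequal-variance t-test is used because the paper does not state which t-test variant was applied.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Simulated teamwork-readiness scores for the two groups (sizes roughly match the sample).
undergrad = rng.normal(4.1, 0.9, size=727)
graduate = rng.normal(4.4, 0.8, size=487)

# Independent-samples t-test (Welch's variant, an assumption here).
t_stat, p_value = stats.ttest_ind(undergrad, graduate, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```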

These results raise a new question: to what extent are students digitally ready to teach in the digital age? For this purpose, and based on Horrigan (2016), the following four-type categorization of digital readiness was analysed: (a) basic order readiness (information collection, information processing); (b) advanced order readiness (information management, information evaluation); (c) preparedness for teamwork (teamwork); and (d) ethical readiness (integrity awareness, social responsibility). The general results are presented in Table 3. Students reported a high level of readiness for teamwork and for ethical conduct, both of which are relevant to offline as well as online environments. On the other hand, the participants reported a medium to low level of preparedness in the types relevant to the online environment only: basic and advanced order readiness.

Table 3.

Digital Readiness of Students for Education (N = 1265)*

Readiness Average (SD) Cronbach’s Alpha
1. Basic 3.96 (.71) .84
2. Advanced 3.97 (.73) .86
3. Teamwork 4.22 (.91) .91
4. Ethics 4.43 (.65) .94

Note. * Likert scale from 1 (not at all) to 5 (to a very large extent)

Furthermore, the findings show a distinction between the types of preparedness that developed in the digital sphere and those that existed in the pre-digital sphere. Of the four types of digital readiness, students report a high level of readiness for teamwork and ethical conduct, which are relevant to both offline and online environments, but a medium to low level of preparedness in the types relevant to the online environment only, namely basic and advanced order readiness. These findings suggest that pre-service teachers are in a transition phase from digital immigrants to digital natives (Prensky 2001). Ten years ago, Lei (2009) pointed out that digital-immigrant preservice teachers lack the knowledge, skills, and experiences necessary to integrate technology into classrooms in a way that assists their teaching and helps their students’ learning. The fact that teachers fully recognize the importance of doing so might suggest that noticeable changes should have been perceptible a decade later. Yet Li et al. (2016) found that despite digital-native teachers’ great comfort with basic technology, they have not yet integrated it effectively into their teaching; more effective training is still needed for them to better integrate technology into the classroom. In addition, some digital-immigrant teachers lack basic technology skills and therefore need more hands-on practice in basic technology operations. Kurniawati et al. (2018) found that both digital-native and digital-immigrant teachers were at an adaptation stage in terms of digital literacy, as reflected in their use of digital media to assist students’ learning. This evidence supports our findings and leads to the conclusion that it may take another decade to witness truly digital-native teachers.

Discussion

The purpose of this study is to assess pre-service teachers’ self-reported level of digital literacy and readiness, focusing on Israeli undergraduate college students majoring in education. In this study, we have expanded the areas of DL testing into seven different domains of literacy (SDDL): (a) information collection (12), (b) information evaluation (5), (c) information management (3), (d) information processing (8), (e) teamwork (8), (f) integrity awareness (15), and (g) social responsibility (3), as opposed to previous studies that have only examined specific issues such as computer literacy (Wilkinson 2006) or information literacy (Buzzetto-Hollywood et al. 2018).

Although undergraduates perceive themselves as digitally oriented and prepared, they lack the critical means to analyse and judge gathered information and to manage and retrieve it. This conclusion may help instructors, institutions, and policymakers adapt teaching curricula accordingly in order to fill this gap. In other words, there is a gap between self-perception and the actual implementation of digital information, which may require the construction of specific training and development processes. The lack of proper skills may further produce a deficit in the instruction of future generations and may also affect their future employment integration in modern society.

Summing up, the twenty-first century world of work and academia (specifically teaching) demands that higher education institutions train graduates who can integrate into the workforce. This study offers a comprehensive tool for assessing the different dimensions of DL and for further understanding DR. The SDDL research tool helps assess undergraduates’ DL levels. Accordingly, instructors may design their courses for the development of pre-service teachers’ digital skills, which in turn should facilitate their optimal integration as leaders in Israel’s education system. It is important to point out that pre-service teachers (undergraduate students) lack both the experience in teamwork that in-service teachers (graduate students) have and their understanding of ethical issues related to online activity. Thus, it seems that novice teachers currently lack some of the important twenty-first century skills they are expected to teach their future students.

The primary revelation of the COVID-19 period is the importance of digital readiness and a high level of digital literacy. While the pandemic is disrupting socio-economic activities, it is, fortunately, happening at a time of rapid digitalization. The future of education was already changing before COVID-19. In 2010, the Israeli Ministry of Education launched the National ICT Program to promote teaching and learning in schools using information and communication technologies and their assimilation into the curriculum. In light of the National ICT Program, new programs were introduced into the teachers’ college curricula to prepare pre-service and in-service teachers to teach according to the program’s objectives. The pandemic has accelerated the pace, need, and uptake of technology in teaching. The sudden necessity for online teaching revealed the need for digital readiness. Reports from schools indicate that many teachers were not ready and did not have the relevant digital literacy to change their teaching methods. The Self-Report Digital Literacies (SRDL) questionnaire used in this research is a simple-to-use research tool that can help teacher training colleges and schools identify their teachers’ strengths and weaknesses concerning digital readiness and digital literacy. They can thus make the necessary changes to their course plans to accommodate the challenges generated by the COVID-19 lockdown.

Practical implications

The results show that students studying for a degree in education are only partially prepared for optimal functioning in the world of advanced technologies and in the era of the ‘knowledge society’. The present research findings emphasize the need to train teaching staff in higher education institutions in order to give them the best preparation as teachers of the future. It seems that if training for the digital age is not done in a planned and methodical manner, the digital gap will widen, which may affect the performance of graduates in society. Even though prior studies indicate that participants tend to overestimate their DL skills, pre-service teachers are not very confident of their basic DL skills. This lack of confidence, which may itself rest on potential overestimation, calls for additional education in teachers’ DL. In addition, the findings of the study, which offer an updated rating for the examination of DL types, can serve as a tool for teaching staff in institutions of higher education. They can also assist policymakers in developing the types of DL required of students in institutions of higher learning.

Limitations and further research

The objective of this research is to draw conclusions from a sample (random or chosen). Nonetheless, a major limitation of the study stems from the characteristics of the research population, which only includes students in education. In further studies, the sample and the populations examined may be expanded to other academic institutions and fields of study. Another limitation is that this is a self-report, correlational study. The literature on information literacy assessment repeatedly shows that self-reporting is not a substitute for the examination of people’s actual information management skills (Mahmood 2016). A major complaint against self-assessment is the lack of validity of this measure, as people tend to inflate their information skills. In this context, a recent study has found that participants are overconfident in reporting their competencies compared to their actual performance. Such behaviour is referred to as the Dunning-Kruger effect (Schlösser et al. 2013). One potential risk of this effect is that people with below-proficient skills are unlikely to obtain assistance if they do not recognize their skills’ limitations (Gross and Latham 2012); these individuals are not motivated to undergo training and may be disengaged from classes.

In addition, the questionnaire does not include an actual examination of digital skills and a comparison thereof to perceived skills. It is recommended to incorporate a practical knowledge test that includes scenarios requiring respondents to demonstrate their DL in practice. As part of this research team’s ongoing work, we intend to add a practical knowledge test capable of deepening our understanding of the field.

Appendix 1: Digital literacies survey

Instructions: This questionnaire is designed to learn about your digital literacies by using the following scale: 1. Strongly disagree; 2. Somewhat disagree; 3. Neither disagree nor agree; 4. Somewhat agree; 5. Strongly agree.

1. Data Collection

1. I know when I need to look for information
2. I am able to identify information for research
3. I am able to collect information from the web
4. I can define the objective of the search
5. I can articulate what information I need
6. I know how to search effectively
7. I can define research terms
8. I can distinguish between types of search
9. I can retrieve information from various sources
10. I am able to collect information from databases
11. I am able to re-locate information
12. I can re-locate a specific web page

2. Evaluation of Data

1. I am able to judge the degree to which information is practical or satisfies the needs of the task
2. I am able to determine the information required for a specific task
3. I am able to assess the accuracy of information
4. I am able to assess the credibility of information
5. I am aware of the difference in credibility of information from various sources

3. Data Management

1. When I store a file, I give it a specific name
2. I store my files in designated folders
3. I tag my information

4. Data processing

1. I am able to interpret information from multiple sources
2. I am able to analyse information from multiple sources
3. I am able to synthesize information from multiple sources
4. I am able to write an appropriate response to a post
5. I am able to use ICT to design or create new information from information already acquired
6. I am able to visually organize data for learning purposes
7. I can represent knowledge in a variety of ways such as PPT, website, blogs, etc.
8. I am aware of the difference in written, graphic or video representations

5. Teamwork

1. During the preparation of a joint task I know how to fit in among team members
2. During the preparation of a joint task I share my thoughts and insights with my peers
3. During the preparation of a joint task I know that I have an influence on the work process
4. During the preparation of a joint task I know what is expected of me
5. While performing a joint task I feel that my contribution to the team is meaningful
6. My peers are aware of my abilities and of what I can contribute
7. I have no reservation regarding joint tasks
8. I like to work with my peers on a joint task

6. Integrity awareness

1. I understand the ethical consequences of the use of technology
2. I understand the social consequences of the use of technology
3. I do not acquire digital information, files, programs, databases, etc., via illegal means
4. I do not use technology for purposes that are intimidating or threatening
5. I am aware of the prohibition of illegal file download
6. I am aware of copyright issues.
7. I am aware of appropriate acknowledgement of sources I use
8. I am aware of the danger of being online to my data
9. I am aware of cyberbullying issues
10. I am aware of identity theft issues
11. I am aware of e-theft issues
12. I am aware of the danger from my online activities
13. I am aware of the influence my online data has
14. I am able to identify/avoid online fraud or identity theft situation
15. I am able to protect myself from online predators

7. Social Responsibility

1. I adhere to the rules of discourse and proper behavior in social networks
2. I make sure not to reveal information about organizations without consent
3. I make sure not to hurt others – people and organizations – online

Footnotes

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  1. Ala-Mutka, K. (2011). Mapping digital competence: Towards a conceptual understanding. Institute for Prospective Technological Studies, 60. http://ftp.jrc.es/EURdoc/JRC67075_TN.pdf.
  2. Alt, D., and Raichel, N. (2018). Digital media literacy skills for building democratic citizenship. In Lifelong citizenship, 43–67. Leiden: Brill.
  3. Aram, D., and Sverdlov, A. (2017). The Early Education System. In The Routledge International Handbook of Early Literacy Education: A Contemporary Guide to Literacy Teaching and Interventions in a Global Context, edited by Natalia Kucirkova, Catherine E. Snow, Vibeke Grøver, and Catherine McBride. London: Routledge.
  4. Bautista A, Ortega-Ruiz R. Teacher professional development: International perspectives and approaches. Psychology, Society, & Education. 2017;7(3):240–251. doi: 10.25115/psye.v7i3.1020. [DOI] [Google Scholar]
  5. Becker, B.W. (2018). Information literacy in the digital age: Myths and principles of digital literacy. School of Information Student Research Journal 7 (2)
  6. Blau I, Peled Y, Nusan A. Technological, Pedagogical and Content Knowledge in One-to-One Classroom: Teachers Developing ‘Digital Wisdom’. Interactive Learning Environments. 2016;24(6):1215–1230. doi: 10.1080/10494820.2014.978792. [DOI] [Google Scholar]
  7. Bresnahan T, Yin PL. Adoption of new information and Communications Technologies in the Workplace Today. Innovation Policy and the Economy. 2017;17(1):95–124. doi: 10.1086/688846. [DOI] [Google Scholar]
  8. Brouwer P. “Hold on a minute Here: What happened to critical thinking in the information age?” Journal of Educational Technology Systems 25 (2) Los Angeles: SAGE Publications Sage CA; 1996. pp. 189–197. [Google Scholar]
  9. Buzzetto-Hollywood N, Wang H, Elobeid M, Elobaid M. Addressing information literacy and the digital divide in higher education. Interdisciplinary Journal of E-Skills and Lifelong Learning. 2018;14:77–93. doi: 10.28945/4029. [DOI] [Google Scholar]
  10. Catts, R., and Lau, J. (2008). Towards information literacy indicators. Unesco.
  11. Davidson, R., and Glassner, A. (2016). Cross-border collaborative learning in the professional development of teachers: Case study - online course for the professional development of teachers in a digital age. In Educational Leadership and Administration: Concepts, Methodologies, Tools, and Applications, 3–4:1348–1379. IGI Global. 10.4018/978-1-5225-1624-8.ch063.
  12. Drossel, K., and Eickelmann, B. (2017). Teachers’ Participation in Professional Development Concerning the Implementation of New Technologies in Class: A Latent Class Analysis of Teachers and the Relationship with the Use of Computers, ICT Self-Efficacy and Emphasis on Teaching ICT Skills. Large-Scale Assessments in Education 5 (1). SpringerOpen: 1–13. 10.1186/s40536-017-0053-7.
  13. Dudeney G, Hockly N, Pegrum M. Digital literacies: Research and resources in language teaching. London: Routledge; 2014. [Google Scholar]
  14. Eshet-Alkalai Y, Chajut E. You Can Teach Old Dogs New Tricks: The Factors That Affect Changes over Time in Digital Literacy. Journal of Information Technology Education: Research. 2010;9:173–181. doi: 10.28945/1186. [DOI] [Google Scholar]
  15. Eshet-Alkali Y, Amichai-Hamburger Y. Experiments in digital literacy. CyberPsychology and Behavior. 2004;7(4):421–429. doi: 10.1089/cpb.2004.7.421. [DOI] [PubMed] [Google Scholar]
  16. Gilster P. Digital Literacy. New York: John Wiley & Sons; 1997. [Google Scholar]
  17. Godbey, S. (2018). Testing Future Teachers: A Quantitative Exploration of Factors Impacting the Information Literacy of Teacher Education Students. College and Research Libraries 79 (5). University of Nevada, Las Vegas Libraries: 611–623.
  18. Gross M, Latham D. What’s skill got to do with it?: Information literacy skills and self-views of ability among first-year college students. Journal of the American Society for Information Science and Technology. 2012;63(3):574–583. doi: 10.1002/asi.21681. [DOI] [Google Scholar]
  19. Gudmundsdottir, G.B., and Hatlevik, O.E. (2018). Newly Qualified Teachers’ Professional Digital Competence: Implications for Teacher Education. European Journal of Teacher Education 41 (2). Taylor & Francis: 214–231. 10.1080/02619768.2017.1416085.
  20. Harasim L. Learning theory and online technologies. Learning theory and online technologies. London: Routledge; 2012. [Google Scholar]
  21. Hong, A.J., and Kim, H.J. (2018). College Students’ Digital Readiness for Academic Engagement (DRAE) Scale: Scale Development and Validation. Asia-Pacific Education Researcher 27 (4). Asia-Pacific Education Researcher: 303–312. 10.1007/s40299-018-0387-0.
  22. Horrigan, J.B. (2016). Digital readiness gaps. Pew Research Center. ERIC.
  23. Instefjord, E., and Munthe, E. (2016). Preparing Pre-Service Teachers to Integrate Technology: An Analysis of the Emphasis on Digital Competence in Teacher Education Curricula. European Journal of Teacher Education 39 (1). Taylor & Francis: 77–93. 10.1080/02619768.2015.1100602.
  24. Israel Central Bureau of Statistics. (2017). Jews by continent of origin, Continent of Birth and Period of Immigration.
  25. Jenkins, H., (2009). Confronting the challenges of participatory culture: Media education for the 21st century. Mit Press.
  26. Julien H. Digital literacy in theory and practice. In: Khosrow-Pour M, editor. Encyclopedia of information science and technology. Pennsylvania: IGI Global; 2018. pp. 22–32. [Google Scholar]
  27. Jung I, Latchem C. A model for E-education: Extended teaching spaces and extended learning spaces. British Journal of Educational Technology. 2011;42(1):6–18. doi: 10.1111/j.1467-8535.2009.00987.x. [DOI] [Google Scholar]
  28. Katzenmeyer M, Moller G. Awakening the sleeping Giant: Helping teachers develop as leaders. Thousand Oaks: Corwin Press; 2009. [Google Scholar]
  29. Kurniawati N, Maolida EH, Anjaniputra AG. The Praxis of Digital Literacy in the EFL Classroom: Digital-Immigrant vs Digital-Native Teacher. Indonesian Journal of Applied Linguistics. 2018;8(1):28–37. doi: 10.17509/ijal.v8i1.11459. [DOI] [Google Scholar]
  30. Kurtz G, Peled Y. Digital learning literacies: A validation study. Issues in Informing Science and Information Technology. 2016;13:145–158. doi: 10.28945/3479. [DOI] [Google Scholar]
  31. Kurtz, G., & Peled, Y. (2016b). Digital Learning Literacies – A Validation Study. In Proceedings of the 2016 InSITE Conference, 13:910. 10.28945/3480.
  32. Lau, W.W.F., and Yuen, A.H.K. (2014). Developing and Validating of a Perceived ICT Literacy Scale for Junior Secondary School Students: Pedagogical and Educational Contributions. Computers and Education 78. Elsevier: 1–9. 10.1016/j.compedu.2014.04.016.
  33. Lei, J. (2009). “Digital natives as Preservice teachers: What technology preparation is needed?” Journal of Computing in Teacher Education 25 (3). Journal of computing in teacher education: 87–97. 10.1093/jahist/jaq049.
  34. Leu DJ, Kinzer CK, Coiro J, Castek J, Henry LA. New literacies: A dual-level theory of the changing nature of literacy, instruction, and assessment. Journal of Education. 2017;197(2):1–18. doi: 10.1177/002205741719700202. [DOI] [Google Scholar]
  35. Li, Y., Wu, S., and Liao, Q. (2016). Differences in information technology literacy between digital immigrant teachers and digital native ones. Distance Education in China 12 (8).
  36. Lissitsa S, Chachashvili-Bolotin S. Use of the internet in capital enhancing ways: Ethnic differences in Israel and the role of language proficiency. International Journal of Internet Science. 2014;9(1):9–30. [Google Scholar]
  37. Lissitsa S, On AL. Gaps close, gaps open: A repeated cross-sectional study of the scope and determinants of the ethnic digital divide. International Journal of Electronic Governance. 2014;7(1):56–71. doi: 10.1504/IJEG.2014.065080. [DOI] [Google Scholar]
  38. Madalińska-Michalak, J., O’Doherty, T., and Flores, M.A. (2018). Teachers and teacher education in uncertain times. European Journal of Teacher Education. Taylor & Francis. 10.1080/02619768.2018.1532024.
  39. Mahmood K. Do people overestimate their information literacy skills? A systematic review of empirical evidence on the Dunning-Kruger effect. Communications in Information Literacy. 2016;10(2):199–213. doi: 10.15760/comminfolit.2016.10.2.24. [DOI] [Google Scholar]
  40. McArthur, T., Lam-McArthur, J., and Fontaine, L. (2018). Digital Literacy. In The Oxford Companion to the English Language, edited by Tom McArthur, Jacqueline Lam-McArthur, and Lise Fontaine, 2nd ed.
  41. Ministry of Education . The National Plan for adapting the education system to the 21st century – Vision and rationale [Hebrew] Jerusalem: Ministry of Education; 2014. [Google Scholar]
  42. Mioduser, D., Nachmias, R., and Forkosh-Baruch, A. (2008). New Literacies for the Knowledge Society. In International Handbook of Information Technology in Primary and Secondary Education, 23–42. Springer. 10.1007/978-0-387-73315-9_2.
  43. Naifeld E, Simon E. Teaching students’ understanding of innovative pedagogy. European Scientific Journal, ESJ. 2017;13(4):15–26. doi: 10.19044/esj.2017.v13n4p15. [DOI] [Google Scholar]
  44. Nelson K, Courier M, Joseph G. Teaching tip: An investigation of digital literacy needs of students. Journal of Information Systems Education. 2011;22(2):113. [Google Scholar]
  45. Neumann MM, Finger G, Neumann DL. A conceptual framework for emergent digital literacy. Early Childhood Education Journal. 2017;45(4):471–479. doi: 10.1007/s10643-016-0792-z. [DOI] [Google Scholar]
  46. Oliveira, C., Lopes, J., and Spear-Swerling, L. (2019). Teachers’ Academic Training for Literacy Instruction. European Journal of Teacher Education 42 (3). Taylor & Francis: 315–334. 10.1080/02619768.2019.1576627.
  47. Panitz, T. (1999). The case for student centered instruction via collaborative learning paradigms. ERIC.
  48. Peromingo M, Pieterson W. The New World of work and the need for digital empowerment. Forced Migration Review. 2018;58:32–33. [Google Scholar]
  49. Prensky, M. (2001). Digital Natives, Digital Immigrants Part 1. On the Horizon 9 (5). MCB UP Ltd: 1–6. 10.1108/10748120110424816.
  50. Priestley, M. (2011). Schools, teachers, and curriculum change: A balancing act? Journal of Educational Change 12 (1). Springer Netherlands: 1–23.
  51. Redmond, P., and Peled, Y. (2018). Exploring TPACK among pre-service teachers in Australia and Israel. British Journal of Educational Technology, September. 10.1111/bjet.12707.
  52. Schlösser, T., Dunning, D., Johnson, K.L., and Kruger, J. (2013). How Unaware Are the Unskilled? Empirical Tests of the ‘Signal Extraction’ Counterexplanation for the Dunning–Kruger Effect in Self-Evaluation of Performance. Journal of Economic Psychology 39. Elsevier: 85–100.
  53. Shamir-Inbal, T., and Blau, I. (2016). Digital literacy skills and the challenge of collaborative culture in higher education: From individual psychological ownership to co-ownership. In Proceedings of the 8th Annual International Conference on Education and New Learning Technologies-EDULEARN2016, The International Academy of Technology, Education and Development IATED, Barcelona, Spain, 9012–9013.
  54. Shannon, S.K. (2017). A mixed methods exploratory study of digital literacies in higher education. Boise State University Theses and Dissertations, no. December. Boise State University.
  55. Sjöberg, L. (2018). The Shaping of Pre-Service Teachers’ Professional Knowledge Base through Assessments. European Journal of Teacher Education 41 (5). Taylor & Francis: 604–619. 10.1080/02619768.2018.1529751.
  56. Tsybulsky D, Levin I. Inquiry-based science education and the digital research triad. In: Levin I, Tsybulsky D, editors. Digital tools and solutions for inquiry-based STEM learning. Hershey: IGI Global; 2017. pp. 140–165. [Google Scholar]
  57. Wilkinson, K. (2006). Students Computer Literacy: Perception versus Reality. Delta Pi Epsilon Journal 48 (2). Delta Pi Epsilon National Office: 108–120.
