PLOS ONE. 2021 Oct 1;16(10):e0258137. doi: 10.1371/journal.pone.0258137

Online college English education in Wuhan against the COVID-19 pandemic: Student and teacher readiness, challenges and implications

Cuiying Zou, Ping Li, Li Jin*
Editor: Di Zou
PMCID: PMC8486088  PMID: 34597337

Abstract

Online education, including college English education, has developed rapidly in China over the recent decade. Aspects such as e-readiness and the benefits and challenges of online education have been well researched under normal circumstances, but large-scale, fully online language teaching in an emergency may tell a different story. A survey of 2310 non-English-major college students and 149 English teachers from twelve higher education institutions of three types in Wuhan was conducted to evaluate their readiness for online English education during the COVID-19 pandemic, to identify the challenges they encountered and to draw implications for future online college English education. Quantitative statistics gathered using two readiness scales adapted from previous studies showed that both cohorts were slightly below the ready level for the unexpected online transition of college English education: the overall level of readiness was 3.68 out of 5 for students and 3.70 for teachers. Individual differences were explored and reported. An analysis of qualitative results identified six categories of challenges encountered by the students: technical challenges and challenges concerning the learning process, the learning environment, self-control, efficiency and effectiveness, and health. Although the students reported the highest level of readiness in technology access, they were most troubled by technical problems during online study. Among the three types of challenges facing teachers, they were most frustrated by pedagogical ones, especially students' disengagement in online class. The survey offers insights for the development of online college English education. Institutions should take the initiative and continue promoting online college English education, because a majority of the respondents reported their willingness and intention to continue learning or teaching English in online or blended courses in the post-pandemic period. Institutions should remove technical barriers for teachers and students, assess the readiness levels of both cohorts before launching English courses online, and arrange proper training for the instructors involved, especially on pedagogical issues. Language teachers should pay special attention to student engagement and communication in online courses.

Introduction

The COVID-19 pandemic has caused enormous transformations in various aspects of society since its outbreak. One of its greatest impacts is the disruption of education. In China, the spring semester of 2020 for schools at all levels was first postponed [1] and then moved online [2], which was to some extent a quick fix. At the same time, the recent decade has witnessed enormous advancements in online education in China, especially at the tertiary level, which laid a solid foundation for the online transition. In response to the Ministry of Education's (MOE) instructions on deploying online teaching at Higher Education Institutions (HEIs) to enable students to resume their studies remotely, by February 2nd, 2020, 22 online platforms in China were providing 24,000 HEI courses free of charge, covering 12 disciplines at the undergraduate level and 18 disciplines at the higher vocational education level. Teachers were encouraged to teach online using various online resources, including Massive Open Online Course (MOOC) platforms [2]. With everything moved online, College English, a compulsory course for first- and second-year non-English majors at HEIs offering degree programs, was no exception. This pandemic-prompted nationwide online college English education was unprecedented in China. English teachers were forced to make what was thought to be impossible possible at short notice (mostly with two weeks' preparation). This online teaching in the face of COVID-19, or online triage [3], was carried out without sufficient needs analysis or readiness evaluation on either the learning or the teaching side. It was different from planned and well-prepared online education.

Online education offers numerous advantages over traditional face-to-face (F2F) education, including flexibility, accessibility, independence, interactivity, multimodality, cost-effectiveness, ubiquitous learning, convenience and learner-centeredness [4]. It has been gaining popularity in recent decades. Even before COVID-19, education technology was already seeing high growth and adoption, with global edtech investments reaching US$18.66 billion in 2019 and the overall market for online education projected to reach $350 billion by 2025 [5]. Then, with the sudden closure of schools, colleges and universities to ensure the safety of communities in many parts of the world, shifting online became the only possible way to keep education going. Various technological tools have seen a surge in usage since the outbreak of the pandemic.

In language education, however, online instruction has only begun in the last decade to enjoy the popularity already experienced in other disciplines. The term online language learning (OLL) can refer to a number of learning arrangements: a Web-facilitated class, a blended or hybrid course, or a fully virtual or online course [6]. These delivery formats, which can be synchronous, asynchronous or both, utilize a variety of technologies, including (desktop) videoconferencing, computer-mediated communication (CMC) tools and Web 2.0 technologies [6–8]. Online language education has affordances that differ from those of F2F courses. First of all, it is flexible and adaptive and allows for enhanced, individualized and authentic materials. Secondly, it can take advantage of communicative tasks and multilingual communities. Lastly, it can also foster and take advantage of autonomous learning and learner corpora [9]. Some studies have demonstrated the effectiveness of online language learning [10–12], and others explored themes such as online language learning assessment [13–15], learners' perspectives [16, 17] and language teachers' professional development [18–20]. More recent delivery modes of online language education include formal online language courses, virtual worlds, Language Massive Open Online Courses (LMOOCs), online language learning communities and mobile apps for language learning [21]. In China, LMOOCs, especially for English, though in their infancy, have been gaining popularity in recent years. A search for the keyword "英语" (English) on the top five MOOC platforms in China returned a total of 1391 courses, including general English courses such as College English and English Writing as well as English for more specific purposes. Many HEIs are developing MOOCs and Small Private Online Courses (SPOCs) for online English education, but few of them implement fully online teaching.

Generally speaking, for college English courses in Wuhan, online learning ordinarily acts as a complement to classroom teaching. Learning platforms bundled with textbooks and self-developed MOOCs or SPOCs are the mainstream tools for English teachers implementing online or blended teaching [22, 23]. Problems may occur, but usually at low frequency, and they cause little anxiety because classroom teaching remains available. During the pandemic, however, online English learning was the only, and compulsory, means rather than a complementary one. Problems occurred frequently, especially at the beginning, and caused anxiety among students and teachers. Some problems might be specific to the pandemic context; others might be common even in non-pandemic periods. Therefore, now that the semester has ended smoothly and successfully, lessons can be drawn for the future development of online college English education. This research aims to draw implications for the development of online college English education by measuring the readiness levels of students and teachers for the online transition and probing into the problems they met in this particular context. The following research questions were explored:

  1. To what extent were college students and teachers ready for online English education during the pandemic semester? Are there any individual differences?

  2. What challenges did teachers and students encounter during the pandemic semester?

  3. What were teachers’ and students’ perceptions towards future online English education?

Literature review

Technology acceptance model

The technology acceptance model (TAM) was put forward by Davis [24], who defined the construct as a set of technology-related attitudes and beliefs that explain a person's intentions to use, and actual use of, technology. The model was designed to explain technology acceptance across a broad range of information technologies, and it suggests that a number of factors influence users' attitudes towards technology and their decisions about how and when to use a new technology, notably perceived usefulness (PU) and perceived ease of use (PEOU). According to Davis [24], PU is the degree to which a person believes that using a particular system would enhance his or her performance, and PEOU is the degree to which a person believes that using a particular system would be free of effort.

The TAM has been frequently tested empirically and proved a useful theoretical model for understanding and explaining use behavior in information systems across many contexts and fields [25]. One of these contexts is education [25–28]. TAM was found to be the most common theoretical framework in research on online learning acceptance [29]. Closely related to the concept of technology acceptance is the readiness to use technology in learning or teaching, which can be called online learning/teaching readiness and will be discussed in the following sections.

Student readiness for online English learning

Warner, Christie and Choy [30] defined readiness for online learning, or e-readiness, as a measure of students' inclination toward online delivery modes versus F2F instruction, their competence and tendency to utilize electronic communication, and their ability to undertake self-directed learning. Several studies have underscored the significance of e-readiness in online learning from different perspectives. Moftakhari [31] claimed that the success of online learning relies entirely on learners' and teachers' readiness levels, which seems an absolute claim. Piskurich [32] believed a low readiness level to be the main reason for failure in online learning. Students' e-learning readiness has been statistically shown to be a significant predictor of their satisfaction with online instruction [33–35]. Therefore, assessing student readiness for online learning is highly relevant before delivering a course fully online or in hybrid form, and promoting student readiness is essential for successful online learning experiences [36]. A typical readiness assessment evaluates students' ability to adapt to technological challenges, collaborative learning and training, and synchronous and asynchronous self-paced learning [37]. A number of studies have developed and validated online learning readiness scales covering similar but not identical dimensions [38–41]. Computer self-efficacy, one main component of online learning readiness, is defined as individuals' perceptions of using a given technology and their ability to use it [40]. Knowles [42] defined self-directed learning, another component, as a process in which individuals take the initiative, with or without the help of others, in diagnosing their learning needs, formulating learning goals, identifying human and material resources for learning, choosing and implementing appropriate learning strategies, and evaluating learning outcomes. Learner control, also an important factor in online learning readiness, is defined as the degree to which a learner can direct his or her own learning experience and process [43]. Motivation can be divided into two categories: intrinsic motivation, which refers to doing something because it is inherently interesting or enjoyable, and extrinsic motivation, which refers to doing something because it leads to a separable outcome [44]. Motivation towards online learning means having an intrinsic and extrinsic desire to use online learning. Another essential dimension for overcoming the limitations of online communication is online communication self-efficacy [40], defined as how well learners can express their feelings online and how well they understand the language and culture of the communication [45].

Online language learning differs from online learning of other subjects. Unlike in other subjects, the language is both the medium of instruction and the subject matter. In the learning process, learners are expected to listen, speak, read and write in the language they are learning. Therefore, whether the online learning environment provides opportunities for learners to use the language, and whether learners feel free to use it online, determines the success of a language course. Tylor and David [46] claimed that little attention had been paid to learner preparedness for online language learning and developed a self-assessment survey tool with indicators of learner autonomy, computer self-efficacy, attitude towards online learning, motivation and English language self-efficacy. It shed some light for researchers in this area, but further validation of their model is needed. Marzieh and Salman [47] identified the factors affecting e-learning acceptance and readiness in the context of foreign language learning. Their findings indicated complex relationships between perceived usefulness, perceived ease of use, e-learning motivation, online communication self-efficacy, and language learners' acceptance of and readiness for e-learning. One limitation of their study is its relatively small sample size. Mehran et al [48] reviewed the limited number of studies on readiness for online language learning and summarized a set of influencing factors that fall into two general categories: demographic variables, which incorporate gender, age, grade, nationality, field of study and technological accessibility/ownership, versus non-demographic variables, which encompass learner autonomy, motivation, learning style, attitude toward e-learning, language self-efficacy, technological acumen and online communication skills. Based on these studies, they carried out a survey and found students at Osaka University unwilling to take online courses, either fully online or blended. Further studies are still needed, because nationality is a variable [46]. To the best of the researchers' knowledge, no study has assessed Chinese college students' readiness for online English learning. Fully online teaching of college English was implemented because of the pandemic, and the researchers took this opportunity to carry out the study in order to gain a rough idea of college students' readiness for online English learning.

Teacher readiness for online language teaching

For instructors, teaching online requires a reconstruction of their roles, responsibilities and practices [36]. Teacher readiness, or faculty readiness, refers to the willingness to prepare, effectively design and facilitate courses within an online environment [49]. Adnan's [34] study proved a significant relationship between faculty readiness and satisfaction. Understanding the level of teacher readiness for online teaching in an institution is a key component in the journey to successfully facilitating online courses and programs [49]. In one study, online instructors' e-readiness was evaluated on three scales, technical readiness, lifestyle readiness and pedagogical readiness for the e-learning environment, and the surveyed cohort was found to be more ready technically than in the lifestyle and pedagogical dimensions [50]. These studies all focused on online teaching in general. Specific to language teaching, however, online teachers need different skills than those trained to teach languages in a F2F classroom, and they also require different skills than online teachers of other subjects [18]. Hampel and Stickler's research was one of the earliest comprehensive studies focusing on the pedagogical aspects of online language teaching. They specified seven levels of skills required for online language teaching in a pyramid, from lower-level skills such as basic ICT competence, specific technical competence for the software, and dealing with the constraints and possibilities of the medium, to higher-level skills such as online socialization, facilitating communicative competence, creativity and choice, and one's own style [18]. Compton [19] reviewed Hampel and Stickler's skill pyramid critically and proposed an alternative framework for online language teaching skills consisting of three dimensions: technology in online language teaching, pedagogy of online language teaching and evaluation of online language teaching. These studies are mainly theoretical, focusing on explaining why certain skills are important without actually measuring whether language teachers are ready for online language teaching. Building on these previous findings, this research measures the readiness level of English teachers in Wuhan during the pandemic.

Online language teaching during the COVID-19 pandemic

The COVID-19 pandemic has prompted studies on emergency online language teaching. Gacs et al. [3] and Ross and DiSalvo [51] studied how higher education language programs swiftly moved from traditional to online teaching. Moorhouse [52] and Moser et al. [53] investigated how the form of language instruction changed during the COVID-19 pandemic. Chung and Choi [54] examined an English language program in South Korea to investigate how the sudden transition to online language teaching influenced language instructors' teaching and assessment practice. Maican and Cocoradă [55] analyzed university students' behaviors, emotions and perceptions associated with online foreign language learning during the pandemic, and their correlates. There have also been a few studies set in China. Lian et al [56] surveyed 529 Chinese university students on their perceptions of authentic language learning (AULL), self-directed learning (SDL), collaborative learning (CL) and their English self-efficacy (ESE) during the online learning period of the COVID-19 pandemic. Zou et al [57] explored three English writing teachers' engagement with online formative assessment during COVID-19 at three universities in China and identified three types of teacher engagement, namely disturbing, auxiliary and integral engagement. Gao and Zhang [58] analyzed in-depth interviews with three English teachers from a Chinese university and found that the teachers had clear cognitions about the features, advantages and constraints of online English teaching.

Studies on language learning during the pandemic keep emerging, but none has examined students' and teachers' readiness for the transition from traditional to fully online teaching, or the actual problems they met in the process. Therefore, based on the review above, this research used a combination of the TAM, the online learning readiness components and the online language teaching skills introduced by previous research as its theoretical framework and carried out a survey among non-English-major college students and college English teachers in Wuhan. The aim was to assess their readiness levels for shifting from classroom teaching to fully online teaching during the pandemic semester, to analyze the actual challenges they encountered, to understand their perceptions of future online English learning and teaching, and to draw implications for the future development of online college English education. It is hoped this research can offer insights for the field of online language education as well as for remote learning and teaching in emergency situations.

Methodology

Ethical statement

This research was investigation-oriented and did not reveal any specific personal information, so no ethical approval was needed. Participants were recruited on a voluntary basis, and the introduction to the survey contained the sentence: "The survey is anonymous, but if you finish it, the researchers will understand it as formal consent for using your responses in future publications of the research." Therefore, no extra formal consent was obtained from the participants; upon completion of the questionnaires or interviews, they automatically granted use of their responses.

Instrument design

Data in this research were collected using two questionnaires, one for students and another for teachers. Both consisted of a demographic information form, a readiness scale and two open-ended questions. To avoid possible misunderstanding, both questionnaires were presented in Chinese. Translation of the questionnaires was done by one of the researchers, who holds a master's degree in translation, and double-checked by a colleague who also holds a master's degree in translation.

The student readiness scale (SRS) was an adapted version of the Online Learning Readiness Scale (OLRS) developed and validated by Hung et al. in 2010 [40]. Both the English and Chinese versions of the scale were shared by the original authors, and permission was given to adapt the scale. The OLRS evaluates online learner readiness along five dimensions: computer/internet self-efficacy, self-directed learning, learner control, motivation for learning and online communication self-efficacy. However, stakeholders' technology access and infrastructure may greatly impact what is possible [3], especially during the emergency transition, when all students and teachers stayed at home without access to any school facilities. Therefore, a sixth dimension, technology access, with three items taken from a scale developed by Watkins et al. [39], was added. The wording of the items from the original scales was changed to refer more specifically to English learning. For example, "I carry out my own study plan" was changed into "I carry out my own English study plan". All items used a five-point Likert-type scale, ranging from "1 = strongly disagree" to "5 = strongly agree".

The teacher readiness scale (TRS) used the three dimensions included in Gay's [50] online instructor e-readiness scale, i.e. technical readiness, lifestyle readiness and pedagogical readiness. Eleven items were kept from Gay's scale, with their diction modified to refer specifically to English teaching. Another five items were added to cover the skills required of online language teachers discussed in previous studies [18, 19]. All items used a five-point Likert-type scale, ranging from "1 = strongly disagree" to "5 = strongly agree".

The two questionnaires were initially piloted to check the clarity of the language used and to ensure the reliability of the two scales in the local context. Both scales were statistically reliable, with Cronbach's alpha coefficients of 0.961 and 0.859 respectively. Improvements were made in light of comments from pilot respondents and two colleagues with expertise in questionnaire design, including removing one item from the TRS that overlapped with another and clarifying language ambiguities. For instance, the item "When my computer hardware or software has technical problems, there are people and/or resources to provide me with help" was modified into "When technical problems occur during online teaching, there are people (colleagues, family members, platform technical support) and/or resources (manuals, videos) to provide me with help". The finalized version of the SRS had 22 items in six dimensions and that of the TRS 15 items in three. As alpha coefficients of 0.7 or above are considered acceptable [59], both scales proved reliable, with overall Cronbach's alpha coefficients of 0.974 and 0.957 respectively for the main samples, and each dimension had an alpha coefficient greater than 0.8. The open-ended questions in the two questionnaires corresponded to each other so as to gather equivalent information from both cohorts. The questions were: "What challenges have you encountered in your online English learning/teaching during this semester?" and "Are you willing to learn/teach English either in a fully online or blended course in the future? Why?".
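As a side note for readers who wish to reproduce this kind of reliability check, Cronbach's alpha can be computed directly from an item-response matrix. The following Python sketch is a minimal illustration with fabricated responses, not the authors' analysis script (the study used SPSS); it simply implements the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores).

import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of Likert responses."""
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Fabricated example: 5 respondents rating 4 items on a 1-5 Likert scale
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 4, 5],
])
print(f"alpha = {cronbach_alpha(responses):.3f}")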

In addition to the two questionnaires, a semi-structured interview was designed to obtain an overall view of the situation before and during the pandemic semester, for the sake of a better-grounded descriptive analysis of the survey. The interview contained five questions focusing on five aspects, i.e. the situation of online English teaching before the pandemic, the teaching modes and the online teaching platforms used during the pandemic, the assessment criteria for the online English course, and possible plans to develop fully online or blended college English courses. The interview was conducted in Chinese, orally or in written form.

Data acquisition

Participants were recruited using purposive and convenience sampling. There are 46 HEIs in Wuhan offering degree programs, of which 8 are directly under the MOE, 15 under the Hubei Provincial Department of Education and 23 non-governmental. All of these HEIs offer college English courses to first- and second-year students. In order to involve universities at all levels, the director of the School of Foreign Languages at the researchers' university purposively sent invitations to her counterparts at 19 institutions (four directly under the MOE, ten under the Hubei Provincial Department of Education and five non-governmental) to recruit teacher and student respondents who teach or learn college English. All of them accepted the invitation to help, though none could promise a particular response rate, because participation was voluntary and responses were anonymous.

The questionnaires were disseminated online from July 24th to August 2nd, 2020, after the semester had ended at all universities. Both were posted on one of the most widely used online survey platforms in China, https://www.wjx.cn/, and only those invited by their college English teachers had access to participate. The survey was set to allow only one submission per respondent for the sake of data integrity.

The interviews were conducted during the same time period by the researchers with the personnel in charge of college English education from the sampled universities or colleges on a one-to-one basis through email or online chatting tools. Written answers were copied directly. Oral ones were transcribed automatically by chatting tools first and then checked by one of the researchers before further analysis.

Throughout the survey, personnel in charge of college English education from 16 universities participated in the interview. Among these universities, four were excluded because the researchers received no teacher response, or fewer than ten student responses, from them. Among the 2351 students who completed the student questionnaire, 15 were from universities in which no interview was conducted, and 26 completed the questionnaire in less than 60 seconds (the researchers tried to finish the questionnaire as fast as they could and determined that the minimum acceptable completion time was 60 seconds). These 41 responses were deleted before analysis. For the teacher questionnaire, 151 teachers completed it, and two of them did so in less than 45 seconds (the minimum determined by the same method), so these two responses were deleted as invalid. This left a sample of 2310 first/second-year students and 149 college English teachers from 12 HEIs (three directly under the MOE, six under the Hubei Provincial Department of Education and three non-governmental; diverse in disciplinary areas and student enrollment, and including both research-oriented and teaching-oriented institutions). The names of the institutions were de-identified during analysis for confidentiality reasons. The demographic information of the participants is presented in Tables 1 and 2.
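As a rough sketch of the exclusion rules described above, the following Python/pandas snippet applies the same two filters to hypothetical records; the column names and data are invented for illustration, and the paper does not state that the cleaning was scripted this way.

import pandas as pd

MIN_SECONDS_STUDENT = 60  # minimum plausible completion time for students
MIN_SECONDS_TEACHER = 45  # minimum plausible completion time for teachers

def clean_responses(df: pd.DataFrame, interviewed: set, min_seconds: int) -> pd.DataFrame:
    """Keep responses from interviewed institutions completed no faster than min_seconds."""
    kept = df[df["institution"].isin(interviewed)]
    return kept[kept["seconds_spent"] >= min_seconds]

# Toy illustration with hypothetical institutions and completion times
raw = pd.DataFrame({
    "institution":   ["U01", "U01", "U02", "U03"],
    "seconds_spent": [240, 45, 300, 180],
})
print(clean_responses(raw, interviewed={"U01", "U03"}, min_seconds=MIN_SECONDS_STUDENT))
# U02 is dropped (no interview) and the 45-second response is dropped (too fast)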

Table 1. Demographics for student respondents.

Variable Category n %
Type of institution Directly under MOE 278 12.03
Under provincial department of education 1076 46.58
Non-government HEIs 956 41.39
Disciplinary area Philosophy 5 0.22
Economics 156 6.75
Law 152 6.58
Education 149 6.45
Literature 129 5.58
History 10 0.43
Science 220 9.52
Engineering 606 26.23
Agriculture 24 1.04
Medicine 181 7.84
Military 2 0.09
Administration 398 17.23
Art 278 12.04
Grade First-year 1388 60.09
Second-year 922 39.91
Gender Male 865 37.45
Female 1445 62.55
Area City 1332 57.66
Countryside 978 42.34
Device for online learning Computer (desktop/ laptop) 1160 50.22
Tablet 81 3.51
Smartphone 1069 46.27
Type of internet connection Broadband 148 6.41
WIFI 1775 76.84
3G/ 4G/ 5G 387 16.75
Total 2310 100

Table 2. Demographics for teacher respondents.

Variable Category n %
Type of institution Directly under MOE 48 32.21
Under provincial department of education 75 50.33
Non-government HEIs 26 17.46
Gender Male 22 14.77
Female 127 85.23
Years in teaching 0–5 years 28 18.79
6–10 years 12 8.05
11–15 years 30 20.14
16–20 years 23 15.44
More than 20 years 56 37.58
Experience in online English teaching Almost none 83 55.7
Some 56 37.58
A lot 10 6.72
Mode of delivery during pandemic semester Synchronous 68 45.64
Asynchronous 13 8.72
Combination of the two 68 45.64
Total 149 100

Data analysis

Quantitative data derived from the SRS and TRS were analyzed using SPSS 23.0 and Microsoft Excel. On the one hand, numerical descriptions of the variables (means, standard deviations) were produced to capture the overall readiness of students and teachers for online college English learning/teaching during the online migration. On the other hand, comparisons of means with parametric and nonparametric tests were conducted to explore whether there were significant differences between demographic groups.
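The fractional degrees of freedom reported in the findings below (e.g., df = 2280.27) suggest that Welch-corrected t-tests were used for the parametric comparisons, though the paper does not say so explicitly. A minimal scipy sketch of the two kinds of comparison, run on simulated scores rather than the study data, might look like this:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated overall readiness scores using the group means and SDs reported below
city = rng.normal(3.78, 0.78, 1332)
countryside = rng.normal(3.56, 0.64, 978)

# Parametric comparison: Welch's t-test (no equal-variance assumption)
t, p = stats.ttest_ind(city, countryside, equal_var=False)
print(f"Welch t = {t:.2f}, p = {p:.4f}")

# Nonparametric comparison: Kruskal-Wallis test across three institution types
# (the 0.80 standard deviations here are placeholders, not reported values)
moe = rng.normal(3.95, 0.80, 278)
provincial = rng.normal(3.77, 0.80, 1076)
non_gov = rng.normal(3.55, 0.80, 956)
h, p = stats.kruskal(moe, provincial, non_gov)
print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.4f}")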

Qualitative data obtained from the open-ended questions were analyzed using topic and analytical coding [60]. Strict procedures were followed to ensure coding reliability. Firstly, the answers to the four questions were uploaded to the qualitative research program ATLAS.ti 8 as separate documents. Secondly, two researchers went through the documents to get a rough sense of the data and then coded independently, combining the open coding and auto-coding functions but checking the auto-coding results to weed out inappropriate codes. Categories kept emerging through the coding process. When this initial stage of independent coding finished, the two researchers compared their codes and categories to negotiate a final version. Lastly, the categories were further analyzed.

Findings

Overview of college English education before and during COVID-19

From the interviews, an overall view of college English education in the sampled universities before and during the pandemic was obtained. The information is not directly related to the research questions, but it helps to contextualize the statistical information in the subsequent sections.

Prior to the pandemic, five universities (three directly under the MOE, two under the provincial department of education) had implemented blended teaching with established online courses, four (under the provincial department of education) were trying to integrate blended teaching or students' online learning with classroom teaching, while the three non-governmental institutions still relied heavily on traditional F2F teaching.

During the pandemic, all 12 universities adopted a combination of synchronous and asynchronous activities with the help of existing online courses or newly recorded lessons. To keep teaching going against the various adversities brought about by the pandemic, different technical tools were used. The mainstream ones were the Chaoxing learning app and MOOC platform (40.16%), the Icourse163 MOOC platform (28.78%), QQ (27.58%), WeChat (26.36%), VooVmeeting (22.16%), Tencent Classroom (21.30%) and Dingtalk (18.10%) [61]. Teachers generally made their own choices of tools in negotiation with students. Statistics showed that a teacher used an average of 2.16 platforms or tools, and only a minority of teachers (17.65%) managed to stick with a single platform or tool in teaching [61]. The platforms and tools share some common functions but have different strengths. Teachers tended to choose what they were familiar with at first and then to add other tools recommended by colleagues or students, especially when problems occurred. This led to the phenomenon of using a mixture of different tools.

In common practice, the final assessment for college English courses combines formative and summative assessment, with the former accounting for 30–50% and focusing on assignments, online learning (where available), class participation and attendance, and the latter accounting for 50–70% and coming from a final achievement test. When teaching moved online for a whole semester during the pandemic, assessment became a big concern. Of the 12 universities interviewed, five maintained their former assessment criteria but moved achievement tests online, two delayed assessment according to institutional policy, and four adjusted their criteria, mainly by increasing the percentage of formative assessment and changing question types for online testing. One university directly under the MOE adopted a more complex approach based on an investigation and analysis of learner characteristics: the assessment for first-year students remained unchanged but was postponed, while that for second-year students was changed to essay writing and group projects.

When asked about possible plans for online college English education, all personnel in charge of college English education reported plans for further integration of online teaching with F2F teaching and for the development of English MOOCs, or the intention to try blended teaching in the post-pandemic period given sufficient institutional support.

Research question 1: To what extent were college students and teachers ready for online English education during the pandemic semester? Are there any individual differences?

Student readiness

Descriptive statistics from the SRS are presented in Table 3.

Table 3. Means (M) and Standard Deviations (SD) of the SRS (N = 2310).
Dimensions & items M SD
Technology access (TA) 4.03 0.95
TA1 I have access to a computer/tablet/mobile phone. 4.12 0.93
TA2 I have access to a computer/tablet/mobile phone with adequate software (Office software, software for online learning, QQ) for my online learning. 4.09 0.92
TA3 I have access to internet connection (broadband/Wifi/3G/4G). 4.13 0.96
TA4 I have access to stable internet connection (fast, few failures) for my online learning. 3.79 0.98
Computer/Internet self-efficacy (CIS) 3.69 0.93
CIS1 I feel confident in performing the basic functions of Office programs (Word, Excel, and PowerPoint). 3.58 0.95
CIS2 I feel confident in my knowledge and skills of how to manage software for online English learning. 3.66 0.94
CIS3 I feel confident in using the Internet to find or gather information for English learning online. 3.84 0.90
Self-directed learning (SDL) 3.48 0.91
SDL1 I manage time well in English learning. 3.38 0.92
SDL2 I seek assistance when facing English learning problems. 3.42 0.91
SDL3 I carry out my own English study plan. 3.47 0.92
SDL4 I set up goals for my English learning. 3.63 0.89
SDL5 I have higher expectations for my English learning performance. 3.49 0.90
Learner control (in an online context) (LC) 3.47 0.92
LC1 I can direct my own English learning progress online. 3.53 0.89
LC2 I am not distracted by other online activities (online games, instant messages, Internet surfing) or distractions in my learning environment (TV, siblings playing) when learning English online. 3.39 0.96
LC3 I repeated the online instructional materials on the basis of my needs. 3.48 0.90
Motivation for English learning (in an online context) (MFEL) 3.70 0.87
MFEL1 I am open to new ideas. 3.64 0.88
MFEL2 I have motivation to learn English online. 3.81 0.85
MFEL3 I improve from my mistakes. 3.74 0.86
MFEL4 I like to share my ideas with others online. 3.61 0.90
Online communication self-efficacy (OCS) 3.73 0.88
OCS1 I feel confident in using online tools (email, discussion) to effectively communicate with others in English. 3.79 0.88
OCS2 I feel confident in expressing myself (emotions and humor) through texts in English. 3.70 0.89
OCS3 I feel confident in posting questions in online English discussions. 3.71 0.88
Overall 3.68 0.94

All items were measured via a 5-point Likert scale: 1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree

The overall level of readiness for online college English learning among the student respondents was 3.68 out of 5. The highest scores were in technology access, mainly from the first three items. That is to say, the students had appropriate access to terminal devices with adequate hardware and software for studying online during the pandemic semester. Item TA4 got a lower score within this dimension, but still a high one compared with other items. This is due to the increasingly rapid development of information technology in China in recent years and the popularity of smartphones. As Table 1 shows, 46.27% of the respondents used smartphones as their devices for online learning, and more than three quarters of them used a WIFI connection. Admittedly, there were reports of students from extremely underprivileged areas or families who had great difficulty attending classes online, but such cases were very rare among college students.

The respondents reported the lowest readiness scores in the self-directed learning and learner control dimensions, 3.48 and 3.47 respectively. Further analysis revealed that the lowest scores came from item SDL1 ("I manage time well in English learning.") in the self-directed learning dimension, with an average score of 3.38, and item LC2 ("I am not distracted by other online activities (online games, instant messages, Internet surfing) or distractions in my learning environment (TV, siblings playing) when learning English online.") in the learner control dimension, with an average score of 3.39.

As for readiness in the remaining three dimensions, computer/internet self-efficacy, motivation for learning and online communication self-efficacy, scores were around 3.7. Judging from the total mean, it is fair to say that, on average, these student participants were moderately ready for online college English learning during the pandemic semester.

Individual differences in student readiness

Independent samples t-tests were conducted to explore differences in overall readiness between grades, genders and students from different areas. No significant differences were revealed between grades (t = -0.75, df = 1885.76, p = 0.45) or genders (t = 0.44, df = 1499.51, p = 0.66). The independent samples t-test on students from different areas was significant (t = 7.25, df = 2280.27, p < 0.01), as shown in Table 4. Students from cities rated themselves higher in all six dimensions of the SRS, especially in technology access and computer/internet self-efficacy, leading to a higher level of overall readiness (M = 3.78, SD = 0.78) than students from the countryside (M = 3.56, SD = 0.64).

Table 4. Independent samples t-test of area.
Dimensions Countryside (n = 978) Mean ± SD City (n = 1332) Mean ± SD t df p
Technology access 3.90±0.84 4.13±0.88 -6.43 2163.62 0.00**
Computer/Internet self-efficacy 3.54±0.76 3.80±0.89 -7.49 2254.30 0.00**
Self-directed learning 3.35±0.71 3.57±0.87 -6.61 2280.55 0.00**
Learner control 3.35±0.75 3.55±0.89 -5.82 2265.69 0.00**
Motivation for learning 3.60±0.70 3.78±0.84 -5.70 2268.26 0.00**
Online communication self-efficacy 3.62±0.77 3.82±0.87 -5.83 2230.32 0.00**
Overall 3.56±0.64 3.78±0.78 -7.24 2280.27 0.00**

**p<0.01.

A Kruskal-Wallis test showed statistically significant differences in readiness between students from different types of institutions (H(2) = 88.21, p = 0.00), with a mean rank of 1385.53 (median = 3.95) for students at universities directly under the MOE, 1223.10 (median = 3.77) for those at institutions under the provincial department of education and 1012.53 (median = 3.55) for those at non-governmental institutions. Post hoc Mann-Whitney U tests were performed for pairwise comparisons. Students from universities directly under the MOE (median = 3.95, mean rank = 754.44) scored higher than those from institutions under the provincial department of education (median = 3.77, mean rank = 657.62); the difference was statistically significant, U = 128174.00 (Z = -3.69), p < 0.01, with a small effect size (r = 0.1) [62]. Students from universities directly under the MOE (median = 3.95, mean rank = 770.58) also scored higher than those from non-governmental institutions (median = 3.55, mean rank = 572.98); the difference was statistically significant, U = 90327.00 (Z = -8.14), p < 0.01, with an effect size a little below moderate (r = 0.23) [62]. The last pair was students from institutions under the provincial department of education and from non-governmental institutions: the former group (median = 3.77, mean rank = 1103.98) scored higher than the latter (median = 3.55, mean rank = 918.04); the difference was statistically significant, U = 420201.50 (Z = -7.14), p < 0.01, with an effect size between small and moderate (r = 0.16) [62].
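The effect sizes reported above are consistent with the common convention for Mann-Whitney post hoc tests of r = |Z|/sqrt(N), where N is the combined size of the two groups being compared. The short Python check below reproduces the three reported values under that assumption (the paper itself does not spell the formula out):

import math

def mann_whitney_r(z: float, n_total: int) -> float:
    """Effect size r = |Z| / sqrt(N) for a Mann-Whitney U test."""
    return abs(z) / math.sqrt(n_total)

# Pairwise comparisons, using the Z values and group sizes reported above
print(round(mann_whitney_r(-3.69, 278 + 1076), 2))  # MOE vs provincial -> r = 0.10
print(round(mann_whitney_r(-8.14, 278 + 956), 2))   # MOE vs non-governmental -> r = 0.23
print(round(mann_whitney_r(-7.14, 1076 + 956), 2))  # provincial vs non-governmental -> r = 0.16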

A Kruskal-Wallis test also revealed differences between groups using different terminal devices (H(2) = 104.50, p = 0.00), with a mean rank of 1289.17 (median = 3.86) for students using computers, 1254.58 (median = 3.86) for those using tablets, and 1002.94 for those using smartphones. However, post hoc comparisons showed no statistical difference between students using computers and those using tablets (p = 1.00).

In addition, a Kruskal-Wallis test revealed differences between groups using different internet connections (H(2) = 74.01, p = 0.00), with a mean rank of 1266.29 (median = 3.86) for the broadband group, 1203.76 (median = 3.77) for the WIFI group and 891.79 (median = 3.45) for the 3G/4G/5G group. Post hoc comparisons showed no statistical difference between the broadband and WIFI groups (p = 0.82).

While existing studies have rarely explored whether students in different disciplinary areas have different levels of readiness for online learning, this research took that aspect into account. A Kruskal-Wallis test revealed differences between disciplinary areas (H(12) = 79.12, p = 0.00). However, some disciplinary areas had a very limited number of respondents, making the results for them less convincing. To give a rough picture, the overall readiness levels are presented in Fig 1. Excluding areas with fewer than 100 respondents, students majoring in art, economics and engineering appeared to be readier than those in medicine and education.

Fig 1. Overall readiness (M) of students from different disciplinary areas.


Teacher readiness

Descriptive statistics from the TRS are presented in Table 5.

Table 5. Means (M) and standard deviations (SD) of the TRS (N = 149).
Dimensions & items M SD
Technical readiness (TR) 3.73 0.90
TR1 My computer setup is sufficient for online English teaching. 3.85 0.96
TR2 My internet connection is stable for online English teaching. 3.67 0.90
TR3 I know how to use software and online teaching platforms to carry out and facilitate online English teaching. 3.68 0.85
Lifestyle readiness (LR) 3.60 0.98
LR1 I have a private place in my home or at work and that I can use for extended periods. 3.60 1.00
LR2 I have adequate time that will be uninterrupted in which I can work on my online courses. 3.52 0.98
LR3 I have people (family members, colleagues) and/or resources (manuals, videos) which will assist me with any technical problems I might have with my software applications as well as my computer hardware. 3.30 1.08
LR4 I value and/or need flexibility of online teaching. 3.97 0.86
Pedagogical readiness (PR) 3.75 0.92
PR1 When I am asked to use technologies that are new to me such as a learning management system or a piece of software, I am eager to try them. 3.85 0.87
PR2 I am a self-motivated, independent learner. 3.78 0.90
PR3 I can effectively design an online English class. 3.68 0.87
PR4 It is not necessary that I be in a traditional classroom environment in order to teach English. 3.92 0.88
PR5 I communicate with students effectively and comfortably online. 3.74 0.92
PR6 I feel confident in engaging students in online English teaching. 3.57 0.95
PR7 I feel comfortable checking students’ assignments and providing different kinds of feedback online. 3.72 0.99
PR8 I have a positive attitude toward teaching and learning English online. 3.72 0.94
Overall 3.70 0.94

All items were measured via a 5-point Likert scale: 1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree

The overall score of readiness for online college English teaching among the teacher respondents was 3.70 out of 5, based on the mean scores of the three dimensions: technical readiness (3.73), lifestyle readiness (3.60) and pedagogical readiness (3.75). The teachers seemed more ready technically and pedagogically than in lifestyle, as gaining access to technical tools, acquiring the skills to use them and learning to teach online were less challenging than changing one's lifestyle at short notice amid the emotional tensions of the pandemic. Though for item LR4, 79.86% of the respondents agreed or strongly agreed with the statement "I value and/or need flexibility of online teaching", around 50% of them rated themselves as not having a private place to teach online (item LR1), being unable to work uninterruptedly for an extended period of time (item LR2), or not having people (family members, colleagues) or resources (manuals, videos) to help them with technical problems (item LR3). Though the teachers reported a higher score in the pedagogical readiness dimension, a further probe exposed a low score on item PR6, "I feel confident in engaging students in online English teaching" (M = 3.57): only 56% of the respondents showed confidence in engaging students in online class. This echoed the finding of a previous study that 62.96% of online instructors had encountered poor student involvement in online classes during the pandemic [63].

Individual differences in teacher readiness

An independent samples t-test revealed no significant difference between male and female teachers in their readiness levels (p = 0.55).

ANOVA results were not statistically significant for type of institution (p = 0.11), level of online teaching experience (p = 0.95) or delivery mode (p = 0.02). According to a Kruskal-Wallis test, groups with different years of teaching experience did not report different levels of readiness either.

Research question 2: What challenges did teachers and students encounter during the pandemic semester?

Challenges from students’ perspective

For the open-ended question "What challenges have you encountered in your online English learning during this semester?", a total of 2310 answers were obtained, of which 1493 were meaningful and valid (answers consisting of only one word or meaningless symbols were deemed invalid and deleted, while those using phrases or sentences were kept for analysis). Through careful and repeated reading, analysis and coding, six discernible categories of challenges reported by students emerged (Table 6).

Table 6. Categories of challenges reported by students (codes and frequencies).
Technical challenges (427): Internet connection 224; No textbook 102; Devices for learning 44; Apps or software 37; Cost of mobile data 12; Power failure 8
Learning process (269): Communication 98; Interaction 67; Assignment 39; Examination 20; Speaking 16; Note-taking 15; Listening 14
Learning environment (146): Distractions 78; No learning atmosphere 34; Low motivation 28; Lack of teacher presence 6
Self-control (139): Lack of self-discipline 110; Time management 29
Efficiency and effectiveness (123): Low efficiency 72; Undesirable effectiveness 51
Health concern (14): Eye fatigue 14

The most frequently reported challenges concerned technical issues and various parts of the learning process. For technical challenges, 224 of the 1493 respondents mentioned poor internet connection and 102 thought learning without printed textbooks was difficult. Some students also had problems with their learning devices, apps or software, or unexpected power failures, or complained about the high cost of mobile data. In terms of the learning process, the biggest problems were difficult and unsmooth communication (Freq. = 98) and a lack of interaction with the teacher and peers (Freq. = 67). 39 respondents thought assignments for the online English course were heavier than before. Other challenges in this category included bad experiences in online examinations, inconvenience in taking notes during online classes and limited opportunities to develop speaking and listening skills. Take one response for example: "When learning English as a foreign language, we need to communicate with others orally and speak in front of the public as much as possible, but in online English classes we had neither opportunity." As for learning environment and self-control, 110 respondents admitted a lack of self-discipline because they were invisible behind the screen, and 78 were troubled by various distractions around them. This corresponded with their low readiness score in the learner control dimension analyzed above. One student said, "I was often called to do some housework and couldn't concentrate in class". Another complained, "My little brother kept interrupting me when I was in class". Participants also commented on the relatively low efficiency and undesirable effectiveness of online English classes compared with F2F ones. Fourteen students were worried about their eyesight worsening because of too much screen time, a problem not to be overlooked despite its low frequency.

Challenges from teachers’ perspective

For the open-ended question "What challenges have you encountered in your online English teaching during this semester?", all 149 answers from the teacher respondents were meaningful and kept for analysis. Three categories of challenges stood out after careful coding (Table 7).

Table 7. Categories of challenges reported by teachers (codes and frequencies).
Pedagogical challenges (128): Students' disengagement 58; Tracking students' learning 37; Students' lack of self-discipline 16; Workload 12; Students' assignments 5
Technical challenges (52): Internet connection 27; Apps or software 13; Devices for teaching 12
Lifestyle challenges (6): Teaching environment 4; Health concern 2

The teachers' main difficulties during the online semester pertained to pedagogical issues, with a total of 128 cases reported. The biggest concern was students' disengagement in online classes (Freq. = 58), followed by difficulty in tracking how well the students had learnt (Freq. = 37). Sixteen teachers mentioned that students' lack of self-discipline made teaching less effective; 12 thought shifting online had brought additional work; five struggled with reviewing students' assignments online. More than one third of the teacher respondents encountered technical challenges, such as poor internet connection (Freq. = 27), difficulty in managing teaching software or apps (Freq. = 13) and the low configuration of their devices for teaching (Freq. = 12). Beyond pedagogical and technical challenges, a few cases related to the teaching environment and health concerns. The following are some quotations from the answers:

“The success of an online class largely depends on the students’ self-discipline. Without teacher presence as in a F2F class, only those learners with self-discipline can learn well. Those who cannot control their own behaviors or keep themselves concentrated will drift away and lag behind.”

“The challenges involve unstable internet connection, platform’s not being user-friendly and lack of comprehensive training for online teaching. In most cases, we are learning everything by ourselves from scratch and it is extremely time-and-energy-consuming.”

“There are abundant teaching platforms to choose from, each having its own advantages, but this makes teaching and learning more complicated and demanding.”

“How to engage students in online classes is a big headache because it’s difficult to carry out class activities. Some students are too shy to turn on their microphone, especially in speaking tasks.”

“My laptop configuration is low and I have no private space at home for teaching. What’s worse, I have to take care of my child and help with his online learning on weekdays too!”

Research question 3: What were teachers’ and students’ perceptions towards future online English education?

The previous analysis found that neither the students nor the teachers were perfectly ready for online English education, and both met many problems in the process, some specific to the pandemic context and some not. Do low readiness levels and the problems encountered lead to avoidance of online English education? The open-ended question "Are you willing to learn/teach English either in a fully online or blended course in the future? Why?" yielded useful information for reflection.

Among the 2102 students who gave valid responses to this question, 1144 expressed their willingness to learn English in either a fully online or a blended course in the future, 72 took a neutral attitude, 563 opposed the idea, and the rest did not show a clear position. Those in favor of online or blended English courses commented on five categories of advantages: flexibility, abundant resources, access to recorded classes, high efficiency and effectiveness. Meanwhile, those opposed listed seven disadvantages: ineffectiveness, low efficiency, lack of self-discipline, lack of learning atmosphere, distractions, inconvenience, and harm to the eyes.

Of the 149 teachers, 102 reported their willingness and plans to implement online or blended English teaching in the future because they had seen the complementary advantages of online teaching over F2F classes and had experienced the flexibility and high efficiency it offered. Some mentioned that going online was an overall trend in education that they were willing to, and should, follow. Seventeen held situation-dependent attitudes: they would happily embrace online teaching in emergencies such as COVID-19 but still preferred F2F teaching otherwise. Only 16 claimed they were unwilling to continue online teaching, citing poor student engagement, difficult class management and ineffectiveness. The rest were ambiguous in their attitudes and gave few comments.

Discussion and implications

This section briefly summarizes and explains the research findings listed above and draws implications for all relevant stakeholders in online English education.

Level of readiness and challenges encountered

According to Holsapple and Lee-Post [64], a student or a teacher can be considered e-ready with a mean score of 4 on a five-point Likert-type scale. As the overall means of the student and teacher respondents were 3.68 and 3.70 respectively, both cohorts were moderately ready, or rather, slightly below the ready level. Contrary to what one might expect, these digital natives could not be considered fully digitalized, as shown by their lower score in computer/internet self-efficacy. This echoes Mehran et al [48], who found that the digital natives at Osaka University had low digital literacy and competence for educational purposes. Moreover, the students' unpreparedness was closely related to self-directed learning and learner control, which is consistent with previous studies [33, 65]. They displayed their unpreparedness not only through low scores but also by reporting challenges about distracting learning environments, lack of self-discipline and inability to manage time. As Vonderwell and Savery [36] put it, self-directed learners know how to learn, how they learn, how to reflect on their learning, how to initiate learning and how to use time management skills efficiently; mastery of these skills enables online learners to make efficient use of their time and of the resources available online. Agonács et al [66] showed that learners with higher self-directed learning readiness tend to have higher insight, self-efficacy and information navigation skills. Lian et al [56] suggested that students should develop both self-directed and collaborative learning skills in order to achieve meaningful learning in a technology-mediated authentic language learning environment, both during and beyond the pandemic. Therefore, it is essential that instructors attach due importance to cultivating students' metacognitive skills related to self-directed learning, self-control or self-discipline, and motivation, in addition to orienting students in technological matters.

The analysis found no statistically significant difference in overall level of readiness between male and female students, which is consistent with Hung et al. [40]. No difference was found between freshmen and sophomores either, which is inconsistent with Hung et al.'s [40] finding that students in different grades had different levels of readiness. By contrast, students from cities reported higher levels of readiness, not only in the overall scores but in the scores of every dimension. Despite China's overall development, it should not be neglected that gaps still exist between urban and suburban areas in infrastructure, information technology, and primary and secondary education. It is highly possible that these gaps led to the difference in readiness levels between students from different areas. Moreover, the type of terminal device and internet connection also appeared to be a variable impacting students' level of readiness. Students in better-equipped conditions, i.e. those who had a computer or a tablet with a broadband or WiFi connection, were the readiest, while those using smartphones on mobile data appeared to be less ready. The mean scores were significantly higher for the advantaged groups, especially in the technology access and computer/internet self-efficacy dimensions, supporting the claim that stakeholders' technology access and infrastructure may greatly impact what is possible [3]. The study did not consider previous experience with online education as a variable, but students from different types of institutions were found to have different levels of readiness. The reason might be that the three sampled non-governmental universities relied fully on traditional teaching for college English education, while the others had already implemented blended teaching or online learning prior to the pandemic. There were also statistically significant differences in overall readiness across disciplinary areas. The reason may lie in the fact that computer skills are more important in some disciplines, so students from these disciplines generally had higher computer/internet self-efficacy.
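The kind of group comparison reported in S4 Table can be illustrated with a short sketch; the scores below are simulated rather than the study's data, and the group sizes, means and spreads are assumptions made for the example:

    # Illustrative independent-samples t-test for the urban/suburban comparison.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    urban = rng.normal(loc=3.8, scale=0.6, size=1200)    # hypothetical urban cohort
    suburban = rng.normal(loc=3.5, scale=0.6, size=900)  # hypothetical suburban cohort

    t, p = stats.ttest_ind(urban, suburban, equal_var=False)  # Welch's t-test
    print(f"t = {t:.2f}, p = {p:.3g}")
    # Per the journal's statistical guidelines, very small p-values are reported
    # as p < .001 rather than p = .000.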

What drew our attention was students' opposing perceptions of the efficiency and effectiveness of online English courses. Some students claimed that online English learning had the advantages of high efficiency and effectiveness, while others emphasized its low efficiency and ineffectiveness. Comparing the respondents' readiness scores with their answers to the open-ended questions, we found that students with lower readiness scores in the self-directed learning, learner control and online communication self-efficacy dimensions generally considered online English courses less efficient and effective. These students usually lacked self-discipline and could not regulate their own learning without the teacher's presence; as a result, they benefited less from the online English courses. Students who reported poor internet connections or software and hardware failures also considered online learning less efficient: in large-scale online classes, problems may occur unexpectedly, and when they do, it takes the teacher and the students longer to address them. By contrast, students who were capable of self-control and used to communicating in English online, verbally or in written form, viewed high efficiency and effectiveness as advantages of online English learning, because they could learn at their own pace and had more chances to communicate with others. As for the teacher respondents, their relatively low readiness level also needs attention. Though they reported lower scores in technical readiness and lifestyle readiness, these problems are comparatively easy to solve: in normal situations, staff members have full access to all school facilities, and if institutions give due importance to infrastructure development, these problems can be addressed.

What also needs to be discussed is the combination of teachers' relatively high level of pedagogical readiness and the frequently reported pedagogical challenges. Teachers claimed to be positive towards online English teaching and willing to learn new technologies and skills, and they thought themselves capable of designing online English classes. However, during the pandemic semester, their primary concern was the outcome of teaching. In line with their low readiness score on students' engagement, the biggest challenge they met was how to effectively engage students in online classes. The problem was not only perceived by the teachers but also reported by the students as poor communication and interaction. Nor was it unique to the current context: Maican and Cocoradă's [55] research also showed that the degree of participation and interaction was the most extensive theme reported by online foreign language learners during the pandemic in Romania. Engagement, interaction and communication can determine the success of an online course and the learner's performance [67, 68]. For English learning in particular, engagement, participation, interaction or communication, whatever the label, is extremely important: Sun [20, 65] and Yang [68] emphasized that learner participation and interaction is central and of crucial importance to successful language learning, whether face-to-face, blended or fully online. Language learning is a skill-based process rather than a content-based one, and skill development, such as the acquisition of speaking and listening skills, requires constant synchronous interaction in the target language [69]. This is consistent with the students' responses referring to challenges in developing listening and speaking skills in online English classes. Solving the problem requires efforts from both cohorts. According to Moore's [70] interaction framework, there are three types of interaction in teaching and learning: learner-to-learner, learner-to-instructor and learner-to-content [67]. Teachers should therefore carefully choose teaching content and design activities that facilitate these types of interaction. Gacs et al. [3] also suggested using teaching platforms or tools that support a communicative environment and taking advantage of authentic materials and multilingual online communities. Wang and Chen [69] and Aelterman et al. [71] advocated real-time synchronous interaction in distance language learning, which was used by about 90% of the respondents in the survey. For their part, students need to be more motivated to participate in class activities and to interact with peers and the instructor in order to practice English. This goal requires teachers' planning, guidance and encouragement, but most importantly it can only be achieved through students' cooperation, as they are at the center of teaching. For other, less frequently reported challenges, such as assessment and evaluation, more factors should be taken into consideration in planning online or blended English courses.

Implications for educational settings

Contrary to previous studies that reported participants' unwillingness to take fully online or blended English courses [48, 72], only about 25% of the student respondents and 10% of the teacher respondents clearly opposed the idea of online or blended English courses, even though they encountered and complained about many challenges and rated themselves slightly below the ready level. It is possible that despite all the difficulties, students and teachers who passively experienced online education in this unexpected situation still appreciated its advantages. As some respondents commented, moving online is also a trend of the times, and keeping up with it is important for institutions, teachers and learners. It is necessary that all stakeholders in the educational setting work together to follow the trend and benefit from it.

Readiness is an important factor in online language learning [27, 48, 67]. To ensure the success of an online English course, the prerequisite is to ensure that the students are ready for online learning. Forcing those who are not ready may give them a negative view of online English education, just as the survey showed that 25% of the respondents who were forced to learn English online because of the pandemic clearly opposed the idea. Measures should be taken to assess students' readiness before designing and launching fully online courses or implementing blended teaching. Faculty members should provide students with sufficient training to help them improve the knowledge and skills relevant to the factors influencing readiness. Su and Zou [73] emphasized the influence of both task design and students' technology knowledge on the effectiveness of technology-enhanced collaborative language learning. It is imperative that language teachers pay attention not only to students' technology skills but, even more, to pedagogical concerns. Because language teaching differs considerably from teaching other subjects, language teachers moving online should be more cautious about their teaching approaches, designing course content and activities that engage students and develop different language skills. What is more, different technologies cater to the development of different language skills [73], so teachers should select appropriate technologies according to the focus of their courses and the target students. Experience is gained through practice: English teachers should keep reviewing and reflecting on their practice and remain committed to change [20]. It is hoped that more English teachers can reflect on this forced, crisis-prompted online teaching experience and seek further training and hands-on experience in developing fully online or blended courses.

Institutions should provide adequate infrastructure (including office areas, technological tools, an efficient course management system and sufficient budgets) and additional training and support for teachers who want to develop online courses. They should also rethink the workload and evaluation of teachers who teach online or blended courses, because designing and implementing an online course is usually more time- and energy-consuming than a conventional one.

IT companies also play a role in the development of online education. They need to make online learning and teaching tools more responsive to students' and teachers' needs and to enhance the interactivity of these tools. As using several different tools in a single course complicates the teaching and learning process, it is hoped that more comprehensive tools combining different functions will be developed. Companies should also provide clear instructions on how to integrate the tools into teaching and guarantee timely technical support for users.

Conclusion

This inquiry assessed student and teacher readiness for pandemic-prompted online college English education in Wuhan during the spring semester of 2020, explored the challenges they encountered, and examined their perceptions of future online English education.

For research question 1, both cohorts rated themselves slightly below the ready level for this emergency migration of English education online. Individual differences existed between students from different areas and types of institutions and between those using different terminal devices and internet connections. Students from different disciplinary areas also showed different levels of readiness. For teachers, no statistically significant difference was revealed across genders, types of institutions, experience in online teaching or years of teaching.

For research question 2, various categories of challenges were reported, some related to personal skills and individual differences and some more environment-bound. The students experienced more technical problems and challenges in the learning process, while the teachers were most frustrated by the problem of student engagement. Lack of self-discipline was also a prominent problem experienced by the students and perceived by the teachers.

For research question 3, having experienced a fully online college English course for a whole semester, a majority of the respondents showed a positive attitude toward online English education and expressed willingness to continue learning or teaching English in online or blended courses in the future. Implications for educational settings were provided based on the analysis of the research results.

Three limitations are associated with this research. First, despite the effort to cover a wide range of respondents, the samples were not evenly distributed and the teacher sample was not large. Second, the second question, "Are you willing to learn/teach English either in a fully online or blended course in the future? Why?", combined two concepts ("fully online course" and "blended course") and did not define "future", which elicited answers that were more complicated to analyze: some respondents commented on the two kinds of courses and expressed a preference for one of them, and some mentioned a willingness to attend classes online in an emergency but were unwilling to do so as a new normal. Third, the group comparisons were not well grounded theoretically, and the observed differences could be traced back to other, third variables that were overlooked in the research. Building on this research, follow-up studies can explore the correlation between students' readiness level and their language learning outcomes, and how to engage students and develop specific language skills in online English classes. How to assess and evaluate students' language learning outcomes online is also of great value. In addition, language teachers' professional development concerning online teaching is another possible research area.

Supporting information

S1 Table. Demographics for student respondents.

(TIF)

S2 Table. Demographics for teacher respondents.

(TIF)

S3 Table. Means (M) and Standard Deviations (SD) of the SRS (N = 2310).

(TIF)

S4 Table. Independent samples t-test of area.

(TIF)

S5 Table. Means (M) and Standard Deviations (SD) of the TRS (N = 149).

(TIF)

S6 Table. Categories of challenges reported by students (codes and frequencies).

(TIF)

S7 Table. Categories of challenges reported by teachers (codes and frequencies).

(TIF)

S1 Fig. Overall readiness (M) of students from different disciplinary areas.

(TIF)

S1 File. Questionnaire and interview results.

(ZIP)

Acknowledgments

We would like to thank all the teachers and students who participated in the survey and special gratitude goes to Professor Hung from National Chiao Tung University for sharing the Chinese and English versions of the OLRS.

Data Availability

All relevant data are within the manuscript and its Supporting Information files.

Funding Statement

This work was funded by Education Research Project of Department of Education of Hubei Province (Grant No. 2020688), Education Research Project of Wuhan Business University (Grant No. 2021N031) and Scientific Research Project of Wuhan Business University (Grant No. 2021KY007). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

References

1. Ministry of Education [Internet]. MOE postpones start of 2020 spring semester. 2020 Jan 29 [cited 2020 Sep 9]. Available from: http://en.moe.gov.cn/news/press_releases/202001/t20200130_417069.html
2. Ministry of Education [Internet]. MOE issues instructions for deployment of HEI online teaching. 2020 Feb 7 [cited 2020 Sep 9]. Available from: http://en.moe.gov.cn/news/press_releases/202002/t20200208_419136.html
3. Gacs A, Goertler S, Spasova S. Planned online language education versus crisis-prompted online language teaching: Lessons for the future. Foreign Language Annals. 2020;53(2):1–13.
4. Moore MG, Diehl WC, editors. Handbook of distance education. 4th ed. New York: Routledge; 2019.
5. Li C, Lalani F. The COVID-19 pandemic has changed education forever. This is how [Internet]. Geneva: World Economic Forum; 2020 [cited 2020 Sep 9]. Available from: https://www.weforum.org/agenda/2020/04/coronavirus-education-global-covid19-online-digital-learning/
6. Blake RJ. Current trends in online language learning. Annual Review of Applied Linguistics. 2011;31(1):19–35.
7. Blake RJ. The use of technology for second language distance learning. The Modern Language Journal. 2009;93:822–835.
8. Wang S, Vásquez C. Web 2.0 and second language learning: What does the research tell us? CALICO Journal. 2012;29(3):412–430.
9. Goertler S. Normalizing online learning: Adapting to a changing world of language teaching. In: Ducate L, Arnold N, editors. From theory and research to new directions in language teaching. England: Equinox; 2019. p. 51–92.
10. Means B, Toyama Y, Murphy R, Bakia M, Jones K. Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies [Internet]. 2009 [cited 2020 Sep 9]. Available from: https://core.ac.uk/download/pdf/5991.pdf
11. Plonsky L, Ziegler N. The CALL-SLA interface: Insights from a second-order synthesis. Language Learning & Technology. 2016;20(2):17–37.
12. Goertler S, Gacs A. Assessment in online German: Assessment methods and results. Die Unterrichtspraxis/Teaching German. 2018;51(2):156–174.
13. Lin CH, Warschauer M. Online foreign language education: What are the proficiency outcomes? Modern Language Journal. 2015;99(2):394–397.
14. Deusen-Scholl VN. Assessing outcomes in online foreign language education: What are key measures for success? Modern Language Journal. 2015;99(2):398–400.
15. Rubio F. Assessment of oral proficiency in online language courses: Beyond reinventing the wheel. Modern Language Journal. 2015;99(2):405–408.
16. Zhou M. Chinese university students' acceptance of MOOCs: A self-determination perspective. Computers & Education. 2016;92:194–203.
17. Zheng C, Liang JC, Yang YF, Tsai CC. The relationship between Chinese university students' conceptions of language learning and their online self-regulation. System. 2016;57:66–78.
18. Hampel R, Stickler U. New skills for new classrooms: Training tutors to teach languages online. Computer Assisted Language Learning. 2005;18(4):311–326.
19. Compton LK. Preparing language teachers to teach language online: A look at skills, roles, and responsibilities. Computer Assisted Language Learning. 2009;22(1):73–99.
20. Sun SYH. Online language teaching: The pedagogical challenges. Knowledge Management & E-Learning: An International Journal. 2011;3(3):428–447.
21. Hockly N. Developments in online language learning. ELT Journal. 2015;69(3):308–313.
22. Garrison DR, Kanuka H. Blended learning: Uncovering its transformative potential in higher education. The Internet and Higher Education. 2004;7(2):95–105.
23. Helms SA. Blended/hybrid courses: A review of the literature and recommendations for instructional designers and educators. Interactive Learning Environments. 2014;22(6):804–810.
24. Davis FD. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly. 1989;13(3):319–340.
25. Legris P, Ingham J, Collerette P. Why do people use information technology? A critical review of the technology acceptance model. Information & Management. 2003;40:191–204.
26. Wu I, Chen J. An extension of trust and TAM model with TPB in the initial adoption of on-line tax: An empirical study. International Journal of Human-Computer Studies. 2005;62(6):784–808.
27. Rafiee M, Abbasian-Naghneh S. E-learning: Development of a model to assess the acceptance and readiness of technology among language learners. Computer Assisted Language Learning. 2019:1–21.
28. Alfadda HA, Mahdi HS. Measuring students' use of Zoom application in language course based on the technology acceptance model (TAM). Journal of Psycholinguistic Research. 2021;50:883–900. doi: 10.1007/s10936-020-09752-1
29. Šumak B, Heričko M, Pušnik M. A meta-analysis of e-learning technology acceptance: The role of user types and e-learning technology types. Computers in Human Behavior. 2011;27:2067–2077.
30. Warner D, Christie G, Choy S. Readiness of VET clients for flexible delivery including on-line learning. Brisbane: Australian National Training Authority; 1998.
31. Moftakhari MM. Evaluating e-learning readiness of faculty of letters of Hacettepe. Ankara: Hacettepe University; 2013.
32. Piskurich GM. Preparing learners for e-learning. San Francisco: John Wiley & Sons; 2003.
33. Yilmaz R. Exploring the role of e-learning readiness on student satisfaction and motivation in flipped classroom. Computers in Human Behavior. 2017;70:251–260.
34. Adnan M. Professional development in the transition to online teaching: The voice of entrant online instructors. ReCALL. 2018;30(1):88–111.
35. Wei HC, Chou C. Online learning performance and satisfaction: Do perceptions and readiness matter? Distance Education. 2020. doi: 10.1080/01587919.2020.1724768
36. Vonderwell S, Savery J. Online learning: Student role and readiness. Turkish Online Journal of Educational Technology (TOJET). 2004;3(3):38–42.
37. Hashim H, Tasir Z. E-learning readiness: A literature review. In: Proceedings of the International Conference on Teaching and Learning in Computing and Engineering. IEEE Computer Society; 2014. p. 267–271.
38. Dray BJ, Lowenthal PR, Miszkiewicz MJ, Ruiz-Primo MA, Marczynski K. Developing an instrument to assess student readiness for online learning: A validation study. Distance Education. 2011;32(1):29–47.
39. Watkins R, Leigh D, Triner D. Assessing readiness for e-learning. Performance Improvement Quarterly. 2004;17(4):66–79.
40. Hung ML, Chou C, Chen CH, Own ZY. Learner readiness for online learning: Scale development and student perceptions. Computers & Education. 2010;55(3):1080–1090.
41. Ilgaz H, Gülbahar Y. A snapshot of online learners: e-readiness, e-satisfaction and expectations. International Review of Research in Open and Distributed Learning. 2015;16(2):171–187.
42. Knowles MS. Self-directed learning: A guide for learners and teachers. New York: Association Press; 1975.
43. Shyu HY, Brown SW. Learner control versus program control in interactive videodisc instruction: What are the effects in procedural learning? International Journal of Instructional Media. 1992;19(2):85–95.
44. Ryan RM, Deci EL. Intrinsic and extrinsic motivations: Classic definitions and new directions. Contemporary Educational Psychology. 2000;25(1):54–67. doi: 10.1006/ceps.1999.1020
45. Hong JC, Hwang MY, Tai KH, Lin PH. Intrinsic motivation of Chinese learning in predicting online learning self-efficacy and flow experience relevant to students' learning progress. Computer Assisted Language Learning. 2017;30(6):552–574.
46. Tylor B, David S. Gauge of Readiness for Internet-based Language Learning: An 800 pound GORILLa. JALT CALL Journal. 2013;9(2):197–217.
47. Rafiee M, Abbasian-Naghneh S. E-learning: Development of a model to assess the acceptance and readiness of technology among language learners. Computer Assisted Language Learning. 2019. doi: 10.1080/09588221.2019.1640255
48. Mehran P, Alizadeh M, Koguchi I, Takemura H. Are Japanese digital natives ready for learning English online? A preliminary case study at Osaka University. International Journal of Educational Technology in Higher Education. 2017;14(1):8–24.
49. Hoppe DW. Addressing faculty readiness for online teaching [Internet]. 2015 [cited 2020 Sep 15]. Available from: https://www.d2l.com/wpcontent/uploads/2015/02/Addressing-Faculty-Readiness_BestPracticesPaper_Final.pdf
50. Gay GH. An assessment of online instructor e-learning readiness before, during, and after course delivery. Journal of Computing in Higher Education. 2016;28(2):199–220.
51. Ross AF, DiSalvo ML. Negotiating displacement, regaining community: The Harvard Language Center's response to the COVID-19 crisis. Foreign Language Annals. 2020;53:317–379. doi: 10.1111/flan.12463
52. Moorhouse BL. Adaptations to a face-to-face initial teacher education course "forced" online due to the COVID-19 pandemic. Journal of Education for Teaching: International Research and Pedagogy. 2020;46:609–611.
53. Moser KM, Wei T, Brenner D. Remote teaching during COVID-19: Implications from a national survey of language educators. System. 2021;97:1–15.
54. Chung SJ, Choi LJ. The development of sustainable assessment during the COVID-19 pandemic: The case of the English language program in South Korea. Sustainability. 2021;13:4499–4511.
55. Maican MA, Cocoradă E. Online foreign language learning in higher education and its correlates during the COVID-19 pandemic. Sustainability. 2021;13(2):781–800.
56. Lian J, Chai CS, Zheng C, Liang JC. Modelling the relationship between Chinese university students' authentic language learning and their English self-efficacy during the COVID-19 pandemic. The Asia-Pacific Education Researcher. 2021;30(3):217–228.
57. Zou M, Kong D, Lee I. Teacher engagement with online formative assessment in EFL writing during the COVID-19 pandemic: The case of China. The Asia-Pacific Education Researcher. 2021 May 25. doi: 10.1007/s40299-021-00593-7
58. Gao LX, Zhang LJ. Teacher learning in difficult times: Examining foreign language teachers' cognitions about online teaching to tide over COVID-19. Frontiers in Psychology. 2020;11:549. doi: 10.3389/fpsyg.2020.00549
59. George D, Mallery P. SPSS for Windows step by step: A simple guide and reference, 11.0. Boston, MA: Allyn & Bacon; 2003.
60. Richards L. Handling qualitative data: A practical guide. Los Angeles: Sage; 2014.
61. Wu DG, Li W. Phased characteristics of large-scale online teaching in Chinese universities: An empirical study based on questionnaire surveys of students, teachers and academic staff. Journal of East China Normal University (Educational Sciences). 2020;38(7):1–30. Chinese.
62. Coolican H. Research methods and statistics in psychology. London: Hodder Education Group; 2009.
63. Zou C, Jin L, Li P. Face-to-face classes hijacked by COVID-19: What and how HEI instructors want to learn for online teaching. In: Proceedings of the 2020 International Conference on Information Science and Education (ICISE-IE); 2020 Dec 4–6; Sanya, China. IEEE; 2020. p. 590–594.
64. Holsapple CW, Lee-Post A. Defining, assessing, and promoting e-learning success: An information systems perspective. Decision Sciences Journal of Innovative Education. 2006;4(1):67–85.
65. Sun SYH. Learner perspectives on fully online language learning. Distance Education. 2014;35(1):18–42.
66. Agonács N, Matos JF, Bartalesi-Graf D, O'Steen DN. Are you ready? Self-determined learning readiness of language MOOC learners. Education and Information Technologies. 2019;25(2):1161–1179.
67. Martin F, Bolliger DU. Engagement matters: Student perceptions on the importance of engagement strategies in the online learning environment. Online Learning. 2018;22(1):205–222.
68. Yang YF. Engaging students in an online situated language learning environment. Computer Assisted Language Learning. 2011;24(2):181–198.
69. Wang Y, Chen NS. Criteria for evaluating synchronous learning management system: Arguments from the distance language classroom. Computer Assisted Language Learning. 2009;22(1):1–18.
70. Moore MJ. Three types of interaction. In: Harry K, John M, Keegan D, editors. Distance education theory. New York: Routledge; 1993.
71. Aelterman N, Vansteenkiste M, Haerens L, Soenens B, Fontaine JR, Reeve J. Toward an integrative and fine-grained insight in motivating and demotivating teaching styles: The merits of a circumplex approach. Journal of Educational Psychology. 2019;111(3):497–521.
72. Goertler S, Bollen M, Gaff J. Students' readiness for and attitudes toward hybrid FL instruction. CALICO Journal. 2012;29(2):297–320.
73. Su F, Zou D. Technology-enhanced collaborative language learning: Theoretical foundations, technologies, and implications. Computer Assisted Language Learning. 2020. doi: 10.1080/09588221.2020.1831545

Decision Letter 0

Di Zou

3 Jun 2021

PONE-D-21-14306

Online College English Education in China against the COVID-19 Pandemic: Student and Teacher Readiness, Challenges and Implications

PLOS ONE

Dear Dr. Jin,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Jul 18 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Di Zou

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Please improve statistical reporting and refer to p-values as "p<.001" instead of "p=.000". Our statistical reporting guidelines are available at https://journals.plos.org/plosone/s/submission-guidelines#loc-statistical-reporting

3. PLOS requires an ORCID iD for the corresponding author in Editorial Manager on papers submitted after December 6th, 2016. Please ensure that you have an ORCID iD and that it is validated in Editorial Manager. To do this, go to ‘Update my Information’ (in the upper left-hand corner of the main menu), and click on the Fetch/Validate link next to the ORCID field. This will take you to the ORCID site and allow you to create a new iD or authenticate a pre-existing iD in Editorial Manager. Please see the following video for instructions on linking an ORCID iD to your Editorial Manager account: https://www.youtube.com/watch?v=_xcclfuvtxQ


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: No

Reviewer #2: Yes

Reviewer #3: No

Reviewer #4: Partly

Reviewer #5: Partly

Reviewer #6: Yes

Reviewer #7: Partly

Reviewer #8: Partly

Reviewer #9: Partly

Reviewer #10: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: No

Reviewer #2: N/A

Reviewer #3: No

Reviewer #4: Yes

Reviewer #5: N/A

Reviewer #6: Yes

Reviewer #7: No

Reviewer #8: I Don't Know

Reviewer #9: Yes

Reviewer #10: N/A

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: No

Reviewer #4: Yes

Reviewer #5: No

Reviewer #6: Yes

Reviewer #7: No

Reviewer #8: Yes

Reviewer #9: Yes

Reviewer #10: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: No

Reviewer #2: No

Reviewer #3: No

Reviewer #4: Yes

Reviewer #5: Yes

Reviewer #6: Yes

Reviewer #7: No

Reviewer #8: No

Reviewer #9: Yes

Reviewer #10: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: This work describes students’ and teachers’ readiness towards online college English education. The authors have not published the results elsewhere. Although this study has presented the research into the welcome area concerned with online English education given the current circumstances, this study does not meet the required quality standards to be considered for publication.

The statistics and other analyses have mainly focused on students' and teachers' readiness towards online English education as an attitudinal factor. Although the investigation of readiness to attend online college English courses is certainly relevant given the growth of interest in this approach due to the COVID-19 pandemic, this study does not appear to have contributed sufficiently to the ongoing discussion and debate around it. A more important issue in this area, as one of the blended learning approaches, is its effect on English proficiency as a learning outcome. The authors could present literature and analyses concerning the impact of readiness on English proficiency in sufficient, well-substantiated detail.

When it comes to the discussions and conclusions, even if the conclusions the authors have drawn are supported by the current data, the authors could extend the topic to make the discussion more comprehensive. For example, they could discuss the correlation between readiness and online English learning in terms of motivation, engagement, satisfaction, or cognitive load.

Considering the formats, although this study has met the community standards for data availability and the applicable standards for the ethics of experimentation and research integrity, the authors can pay attention to the latest APA guidelines that can provide valid references to an intelligible and standard fashion.

Thank you very much for the opportunity to review and consider your work.

Reviewer #2: Data acquisition is ambiguous. Questionnaires should be filled out in a relatively strict and specific environment. "By only including universities from which the interview was conducted 231 successfully within time", please specify this "time" duration.

Proofreading should be done.

Reviewer #3: The current study investigated virtual foreign language teaching and learning during the COVID pandemic by surveying students’ and teachers’ readiness. Through the distribution of a large-scale survey, the study collected a sizeable dataset to probe into the topic. The topic itself is valuable because of the utility of online language teaching. However, the study had serious methodological and analytical problems which precluded it from further consideration. I detail my comments below:

1. The theoretical framework section was actually the review of literature. There are a number of related theoretical underpinnings behind virtual teaching (e.g. computer assisted language learning, self-efficacy, teacher education, student motivations, etc). However, the paper lacks an in-depth discussion of how the empirical study was driven by theories.

2. The description of the measurements was far from clear. Sample questions of each dimension should be delineated in more detail.

3. The most critical question lies in the analytical tools. The description of data analysis including coding, reliability/validity checking, CFA and subsequent inferential statistics was insufficient. The methodology and data analysis were short on robust information. For instance, what does N refer to in the Table 3? If N indicates the number of each question type, the authors need to justify how 3 questions are sufficient to generate the meaningful pattern or factor loadings. Also, the CFA results were not properly presented.

4. The mixed-methods approach should have augmented the strengths of arguments. However, from the presentation of results, it remains unclear about how qualitative data supplemented the quantitative findings. How were the qualitative data sorted out and coded?

5. The data interpretation and discussion leads to the “so-what” concern. The researchers disseminated a large-scale survey to teachers and students across the region, which was commendable. Nonetheless, instead of the exploration of the status quo, the study should address how research findings can be converted to real pedagogical practices; and how instruction and learning can benefit from the empirical evidence.

Reviewer #4: According to the title, the authors claimed that it was the online College English Education “in China” that was investigated. However, only higher education institutions in Wuhan were examined. Therefore, I believe that the authors need to consider whether data gained from the survey conducted in Wuhan were representative enough to represent the condition of online English education in the whole nation (which refers to China). This deficiency provides the rationale for my recommendation to be “major revision.”

The first sentence of the abstract “China was the first country to migrate almost all teaching, learning and even assessment online in education of all levels against COVID-19” (Line 12, 13). The authors failed to provide any proof of this claim, neither in the Abstract nor in the main body of this article. I believe that it is important to explain this statement.

Some unidiomatic word usage is noticed “it was not based on nowhere” (Line 37), “even unimagined” (Line 50), and what HEI (Line 40, where this abbreviation appears for the first time) refers to is not known. The author needs more grammatical checks of the entire article.

The second section should be renamed as “Literature Review” rather than “Theoretical Framework,” considering that most of the discussions in this section were related to the results of the previous research. This means that the authors may need to rewrite the whole section to demonstrate their theoretical foundation. Some introductions to online language education in section 2.1 should have been offered in the first section (Introduction).

Reviewer #5: 1. No full name of HEI and SPOC was given.

2. The number of “3.5 Survey dissemination” should be 3.4.

3. The format of the two headings 4.1& 4.2 is inconsistent.

Theoretical Framework:

4. L117 (Line117): Quote format error: names of more than three authors should be cited as “et al.”

5. L126: How is language teaching different from other subjects?

The research gap that “Language learning online is different from online learning of other subjects, but until now, few studies have focused on student readiness specifically for online language learning” is not fully discussed in terms of what are the difference of language online learning and other subjects online learning”

6. What is your research gap? Is “studies on student readiness on online learning” or “student readiness specifically on language online learning”?

7. Previously the author mentioned “Major dimensions included in these scales are student access to/use of technology, online skills and relationships, motivation and interest, self-directed learning, learner control, factors that affect success”, but on page 10, the dimension of your questionnaire are almost the same as previous researches.

8. L76: There are differences between pandemic and epidemic. It should be “COVID-19 pandemic”.

9. L85-89: grammatical mistakes. "It is flexible; can be adaptive; allows for enhanced, individualized, and authentic materials; can take advantage of communicative tasks and multilingual communities; can foster and take advantage of autonomous learning and learner corpora"

10. L103 "Many HELs are developing MOOCs and SPOCs for English education online, but are teachers and students ready for the development? Literature in this domain was relatively scare." Before addressing the research gap, the authors mentioned several literature on online education. However, there are literature related to MOOC for English learning, maybe the authors could add more literature related to online language learning.

Methodology:

11. Why there are only student respondents from first- and second-year grade?

12. The specific procedure in which the researchers used the tool, ATLAS.TI 8, for coding needs elaboration.

13. Please make sure that the sentence “Qualitative data derived from the SRS and TRS were analyzed... as well as establishing...” is grammatically correct.

14. L229: it is not clear that 18 refers to what here?

15. L293-295: there are no statistic supports for the mainstream platform like Tencent Classroom.

Findings:

16. L524-526: the authors mentioned that “perceptions for future online English education, some students think online or blended English courses have the advantage of high efficiency and effectiveness”. But in line479-481, “participants also commented on the relatively low efficiency and undesirable effectiveness of the online English classes compared with F2F ones”. Further analysis of what led to this result is needed.

17. In 4.1 the authors mentioned about college English education before, during and after COVID-19. But how to define “after COVID-19”? The overview of college English education before and during COVID-19 was supported by data, but overview of college English education after COVID-19 was not.

18. L434-436: Why echo with an unpublished paper? 4.3.2 and 4.3.3 are from two open-ended questions. What about the data of interview?

19. Concerning the finding in p22 “Teachers generally had their own choices with the tools by negotiating with students but a common phenomenon was a mixture of different tools”. The author could further discuss more about the phenomenon of mix use of different tools. Or if it was from the previous studies, the authors should give clear citations.

Discussion & implication:

20. L526-529: There is no discussion of why students' perceptions of online courses vary so widely. Some students view online learning as having the advantages of high efficiency and effectiveness, while other students consider it to have the disadvantages of ineffectiveness and low efficiency.

21. Line 578: why the authors said "with the exception of pedagogical readiness...", while in page 23, especially in item, pedagogical readiness got a low score. It would be better to discuss the low score of item13.

22. L599-602: The expression seems confusing and could be polished.

Reviewer #6: Overall, the study is well designed and well researched. The author chose online college English education in China during COVID-19 pandemic as the research topic to investigate students’ and teachers’ readiness and challenges through different dimensions. The samples in this study were mainly collected from one province, which might be one of the limitations of the study. As for the methodology adopted in this study, although the author used factor analysis to extract several different dimensions (factors) behind items in questionnaire, the author did not make further explanations about those different factors and most of the statistics were conducted to make comparisons between different variables. I think the author should add some explanations to illustrate those different factors before making further comparisons between different variables.

The paper is generally written in a quality academic style, but there are several language issues worthy of revision. Please note that page numbers reflect the page number as shown in my pdf reader and do not reflect the actual page number of the manuscript. I list several issues below (though the list is not intended to be exhaustive):

P3 HEI ---Higher Education Institutions (HEIs)

Mooc---Massive Online Open Course(Mooc)

P5 utilizes---utilize

“It is flexible; can be adaptive; allows for enhanced, individualized, and authentic materials; can take advantage of communicative tasks and multilingual communities; can foster and take advantage of autonomous learning and learner corpora….”

The author used semicolon after each clause. However, most of the clauses lack subjects. If the author insists on using such semicolon structure, I suggest the author to add subjects to those clauses. Perhaps another choice is to rearrange the structure of the sentence to make it more integrated.

P7 student access ----student’s access

was----is?

“...teachers need different skills than those normally employed by tutors trained to teach languages in a F2F classroom…..”

This sounds confusing. I think the author might consider add “and” between “tutors” and “trained”.

P8 lower level skills as --- lower level skills such as ?

higher level skills as—higher level skills such as ?

P13 “The survey lasted for ten days.” I suggest the author to combine this sentence with the previous paragraph instead of leaving it as a separate paragraph.

P14 “As the SRS and TRS were…” I suggest the author to change “as” into “Since”

P15 “a combine of” ----a combination of

P19 “suburb area”---suburban area

P20 “test were”---test was

P36 “ ....teachers who have taught online and are planning to teach online or blended English courses.” --- The sentence is ungrammatical.

“.... businesses which invest in education technology and want to draw more profits.” --- The sentence is also ungrammatical.

Some other points:

1.Page 4: The author mentioned that online English course is different from online courses in other disciplines. Perhaps the author should provide some explanations to illustrate the reason why online English teaching is different and much more difficult to implement.

2. P6 “Literature in this domain was relatively scarce.” This sounds a little incomplete when the author put it at the end of the paragraph.

3.“Section 4.1 overview of college English education before, during and after COVID-19”

In this part, the author listed the different dimensions derived from the questionnaires. However, it seems that the author didn’t make an explicit explanation about those dimensions listed in table 3.

Reviewer #7: The paper reported a survey to evaluate the readiness for online English education among 2310 non-English majors and 149 teachers in China. The results revealed a relatively low level of readiness among students and teachers. The study also identified the challenges participants faced in online language education. I appreciate the time and efforts the authors have put into this work, but I do have a number of major concerns which are detailed below.

1. Although the title of the manuscript is “Online College English Education in China against the COVID-19 Pandemic”, the data were collected from one single city in China. It’s hard to say that they can represent the entire country.

2. Three research questions were listed in the paper, but their relationship is rather unclear. For example, if participants report a lower level of readiness at certain dimension, it is likely for them to report greater challenges at that dimension. So the first two questions are to a certain extent overlapping. I suggest the authors reconsider the research questions to avoid overlapping, and if there is any change in the research questions, the entire paper needs to be updated accordingly. It is also necessary to explain clearly why the third research question needs to be explored.

3. Although the authors indicate that “All relevant data are within the manuscript and its Supporting Information files” in the Data Availability Statement, I cannot find the data in the manuscript, and it seems no relevant supporting files have been provided. PLOS ONE requires the authors to share their data publicly, but right now I can only find the statistical results which are not the same as the data. Therefore, I strongly recommend the authors to upload their source data.

4. The abstract fails to summarize the key findings from the research. For instance, what are the challenges encountered by the students and teachers?

5. P2 line22-23: “some challenges coherent with their low scores in certain dimensions from the readiness scales”

Since the authors did not mention what the challenges entail, there is no way for readers to understand the meaning of the sentence.

6. P2 line 24-27: “Qualitative data also showed prospects of growing development of online college English education as the majority of respondents reported their willingness and intention to continue learning/teaching English in online or blended courses in the post-pandemic period.”

This conclusion is not convincing. It is oversimplified to conclude that online college English education has potential to grow only on the basis of the fact that the respondents reported their willingness and intention.

7. If the participants were anonymously engaged in the survey, and they were not compensated in any way, how can the authors ensure they were sufficiently motivated to provide genuine answers to the questions? In designing the scales, did the authors include any item to detect lies? Have the quantitative data been filtered in any way before the analysis?

8. P6, line 105: “Literature in this domain was relatively scarce.”

Does this indicate there is no literature on this topic at all, or there are only a small number of them? Please make it clear.

9. P7 line 126: “few studies have focused on student readiness specifically for online language learning.”

But there are indeed a number of studies on student readiness for online language learning (e.g., Tylor &David 2013; Luu & Lian, 2020; Mehran, et al., 2017). I suggest the authors review the prior studies in a more comprehensive way. The relevant studies should be cited. More importantly, the authors should specify what new contribution this study has made to this field, given all the previous studies that have already been conducted. In reviewing the prior studies concerning teacher readiness, the authors also need to point out the inadequacy of these studies had or the issues they failed to address, which make it necessary for the authors to carry out their own study.

10. The following methodological concerns need to be addressed in the study:

What is the purpose of combining qualitative and quantitative approaches in this study?

What methodological framework did the study follow?

11. A semi-structured interview seems to be conducted in this study, but the findings from the interview do not seem to be relevant to the research questions the authors intended to address. The authors need to reconsider the purpose of having a semi-structured interview? If it is necessary to have the interview, the relevant details need to be reported: How was it designed? What procedures were followed? Did the participants answer any question? How were the answers analyzed?

12. The data analysis section is not helpful for readers to understand how the data analysis was made. The authors only introduced the tools for statistical analysis, which is far from enough. They should also report the statistical approaches used in the analysis, how data trimming was performed, the procedures of data analysis, how the interview was transcribed, how they ensured the accuracy of transcription, etc. The statements about research instrument should be moved to the Instrument section.

13. The results section was not written in a logical way. I suggest the authors reorganize it to respond to the three research questions. Section 4.1 does not seem to be relevant to any research question. Also the relation between 4.2.1 and 4.2.2 is not clear.

14. The statistical results were not reported in a standard way. For all the results from t test or ANOVA, degree of freedom and effect size should be reported. For any insignificant results (e.g., line440-441), the authors should report the statistical results with the exact p values. All statistical symbols (sample statistics) that are not Greek letters should be italicized.

15. The quantitative study shows that students were most ready in terms of technology access, which seems to contradict the finding from the qualitative study which indicates that the greatest challenge for students was technical. Does this divergence reflect an inadequacy in the design of the student readiness questionnaire? More explanation is needed.

16. P27 line 489-491: Despite the teachers’ higher scores of pedagogical readiness in the TRS, their main difficulties during the online semester pertained to pedagogical issues…

Please account for the discrepancy.

17. In the Results section, the authors have reported the results concerning the individual differences in students or teachers, but very few of them were summarized and discussed in the Discussion section. The discussion is rather inadequate.

18. The language of the manuscript is below the standard for publication. It contains too many grammatical and other errors, and some sentences are very difficult to understand. It needs to be proofread thoroughly and carefully by a native speaker of English. Below I’ve listed some of the problems, but such problems are almost everywhere:

P5 86-89: “It is flexible; can be adaptive; allows for enhanced, individualized, and authentic materials; can take advantage of communicative tasks and multilingual communities; can foster and take advantage of autonomous learning and learner corpora”

The sentences are fragmented.

P6 line98: in infancy level > at its infancy

P6 line113: either fully or hybrid > either fully or hybridly

P7 line129: Concerning instructors, teaching online learning requires a reconstruction of their…

What does “teaching online learning” mean?

Line550: understanding how was the situation>understanding the situation

Line 613: there are 3 types interaction

Reviewer #8: PONE-D-21-14306 comments

The study investigated student and teacher readiness for online teaching and learning in the Chinese EFL context. Due to the following methodological issues, I do not believe the article is ready for publication in the Journal in its present form.

1. pp.10-11 "The two questionnaires were initially piloted to check clarity of the language used and to ensure the reliability and validity of the two scales in the local context. Improvements were made in light of the comments from pilot respondents and two experts in research methodology. Both scales were statistically reliable and valid with the pilot tests."

-- Could you please describe in more detail what "improvements" or modifications you made to the questionnaires? In addition, what are the results of the statistical analyses used to evaluate the reliability and validity of the questionnaires in the pilot tests?

2. p.11 "a semi-structured interview was conducted between the researchers and the personnel in charge of college English education from the sampled universities or colleges to get an overall view of the situation before, during and after the pandemic semester for the sake of better understanding of the statistical results of the survey."

-- Could you describe in detail how the interview was administered? In particular, how did you choose the participants of the interview? What was the procedure of the interview?

3. p.14 "Cronbach’s alpha coefficients were calculated to ensure internal consistency and confirmatory factor analysis was carried out to provide evidence for convergent validity."

-- Readers may also be curious about results of the CFA other than those reported in Table 3. In particular, what is the overall fit/significance of the model?
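For illustration, Cronbach's alpha for a single questionnaire dimension can be computed as follows; this is a generic Python sketch with made-up 5-point scores, not the authors' analysis script or data.

import numpy as np

def cronbach_alpha(items):
    # items: (n_respondents, n_items) matrix of Likert scores for one dimension
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses from five students to a three-item dimension
scores = np.array([[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2], [4, 4, 5]])
print(f"alpha = {cronbach_alpha(scores):.2f}")  # ~0.7 or above is usually taken as acceptable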

Reviewer #9: This study investigated the readiness of students and teachers for online college English education in the spring semester of 2020 in Wuhan during Covid-19, and examined the challenges they faced and their perceptions of the future of online English education.

I have some general comments which hopefully the authors can consider when revising the manuscript.

Introduction and theoretical framework:

1. On page 4, the authors mention that “…lessons can be and should be drawn for future development of online college English education”. It is necessary for the authors to clarify why and how the examination of emergency remote learning during Covid-19 can hold implications for online education in non-pandemic periods. Remote learning during a pandemic has characteristics that are not present in non-pandemic periods.

2. In the section titled theoretical framework, the authors do not clearly elucidate, or indeed elucidate at all, the theories behind this study. The authors need to consider introducing a theory (or theories) supporting this study.

3. The empirical studies reviewed are not ones conducted during the Covid-19 period. The authors should consider focusing the literature review on studies from the pandemic period, or at least ensuring that most of the reviewed studies come from that period.

4. On page 6, the authors mention “…but are teachers and students ready for this development (developing MOOCs and SPOCs)?” I think there are some studies exploring student and teacher readiness for MOOCs and SPOCs. Also, there is a difference between readiness for MOOCs and readiness for emergency remote learning. The authors should focus on reviewing studies that examined remote learning during emergency situations.

Methodology:

1. The authors need to clarify how the questionnaires were translated into Chinese.

2. Authors need to explain how the data were filtered and how invalid data were removed.

3. What is the purpose of conducting confirmatory factor analysis on both pilot and final sample? Typically, we use the pilot sample for exploratory factor analysis and the main sample for confirmatory factor analysis.

Findings:

1. Section 4.1 could be written more concisely, and it could be moved to the methodology section as research background.

2. The results would be more informative if the demographic characteristics were entered into a regression model as predictor variables to assess their predictive power on each indicator of readiness (a minimal sketch of such a model follows this list). It seems that prior experience with online education should be an important factor influencing readiness. Did the authors consider this factor?

3. The authors examined whether there were significant differences in overall readiness by grade and gender, but what about each indicator of readiness?

4. The authors examined differences in readiness for online education across disciplines. However, as I understand it, the authors wanted to examine the readiness of students for online English education. So why is it important to examine readiness for English online classes across disciplinary backgrounds?

5. The authors examined both student and teacher readiness. What is the connection between them? Why is it important to discuss both the teachers’ and the students’ readiness in one article if the connection between them was not explicitly examined?

6. The authors used interviews to examine the challenges students faced. I wondered why the purpose of the interviews was not to examine the factors that influence success and failure in readiness for Covid-19 emergency remote learning. This would make the article more focused.

7. The research question addressed in section 4.3.3 was “Are you willing to learn/teach English …in the future?” How would the authors define “the future”? A future with a pandemic or emergency situations, or a future without any emergency situations?
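To make point 2 above concrete, the sketch referenced there could look like the following in Python with statsmodels; the variable names (readiness, gender, grade, prior_online) and the toy data are hypothetical, not drawn from the study.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per student, overall readiness plus demographics
df = pd.DataFrame({
    "readiness":    [3.7, 3.2, 4.1, 3.9, 2.8, 3.5],
    "gender":       ["F", "M", "F", "M", "F", "M"],
    "grade":        [1, 2, 1, 2, 1, 2],
    "prior_online": [1, 0, 1, 1, 0, 0],  # prior experience with online courses
})

# OLS with demographic predictors; the same formula could be refit with
# each readiness dimension as the outcome in turn
model = smf.ols("readiness ~ C(gender) + C(grade) + prior_online", data=df).fit()
print(model.summary())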

Discussion

1. The discussion needs to be deeply integrated with the relevant literature conducted during Covid-19.

2. The pedagogical implications should be stated more specifically and concretely.

Overall, the authors have a large sample size and the findings show the preparedness of students and teachers during the Covid-19 outbreak. But the entire article (literature review, findings, discussion) needs to be more focused, and closely aligned with the context of the study during the pandemic.

Reviewer #10: Recommendation: Revisions Required

The paper investigated 2310 non-English-major students and 149 English teachers from three types of twelve higher education institutions in Wuhan to evaluate their readiness for online English education, to identify the challenges they encountered, and to draw implications for future online college English education. It has considerable potential, and I recommend that it be considered for publication, but only after some major revisions. My suggestions are the following:

1. The authors failed to evaluate the context of the nation-wide online teaching in a justifiable way. They claimed that “This online teaching in the face of COVID-19, or exactly, online triage (Gacs et al. 2020), was carried out without need analysis or readiness evaluation from both learning and teaching sides and was different from well-prepared and planned online teaching.” But actually this was not the case. Before the epidemic, online teaching had already served as a positive supplement to offline teaching, and it was available and accessible to students in most parts of China, in both rural and urban areas. There are many university websites and apps offering free online courses, which quite a number of students use regularly in their spare time. Great efforts have been made by education authorities, especially the Ministry of Education, to analyze and evaluate the situation, to invest considerably in providing online teaching resources, and to mobilize education-related IT companies to lend a helping hand. Schools and teachers did their best to make preparations before the launch of the online courses. In short, this nationwide online teaching experience was not unprepared and unplanned at all. The authors should therefore adjust their wording throughout the paper so that the basis of this research does not appear groundless.

2. Get the paper checked by expert speakers. The sentences are extremely long, which makes reading the text challenging, and occasionally there is inappropriate wording or even a grammatical mistake. All of this may reduce the readability of the paper.

3. The teacher sample is relatively small, which makes the analysis and results less convincing.

4. The analysis would be more reasonable and sound if the teacher and student subjects were divided into two groups, rural and urban.

5. It would be better to make the literature review more pertinent to online English teaching. In its present form, the paper focuses too generally on online teaching as a whole. It is therefore advisable for the authors to review general online teaching only briefly and to concentrate more on English teaching, just as the authors themselves claim in the paper that online teaching is carried out differently to cater for the needs of different subjects or disciplines.

**********

6. PLOS authors have the option to publish the peer review history of their article. If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

Reviewer #3: Yes: Haomin Zhang

Reviewer #4: No

Reviewer #5: Yes: Xiaobin Liu

Reviewer #6: No

Reviewer #7: No

Reviewer #8: No

Reviewer #9: No

Reviewer #10: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Attachment

Submitted filename: review results.docx

PLoS One. 2021 Oct 1;16(10):e0258137. doi: 10.1371/journal.pone.0258137.r002

Author response to Decision Letter 0


30 Jul 2021

Response to Reviewer #1

Thank you very much for your careful review and valuable suggestions. Our responses and improvements are as follows.

1. The statistics and other analyses have mainly focused on students’ and teachers’ readiness for online English education as an attitudinal factor. Although the investigation of readiness to attend online college English courses is certainly relevant given the growth of interest in this approach due to the COVID-19 pandemic, this study does not appear to have contributed sufficiently to the ongoing discussion and debate around it. The more important argument in this area, as one of the blended learning approaches, concerns its effect on English proficiency as the learning outcome. The authors could present the literature and analyses concerned with the impact of readiness on English proficiency in sufficient and well-substantiated detail.

We were aware of the limitations of the research, but we also believed in its contribution to fully online college English teaching in emergencies and to blended teaching as common practice, especially in China, where online English teaching has become commonplace but online English learning readiness is not well researched. Thank you for giving us inspiration for our future research in this area. Building on this study, we are designing a study on the correlation between online English learning readiness and students’ satisfaction and English proficiency.

2. When it comes to the discussions and conclusions, even if the conclusions the authors have drawn are supported by the current data, the authors could extend the topic to make the discussion more comprehensive. For example, the authors could discuss the correlation between readiness and online English learning in terms of motivation, engagement, satisfaction, or cognitive load.

Thank you for your suggestions. The discussion part has been improved; as for extending the topic, we will certainly take your suggestions into consideration when designing new studies in the future.

3. Considering the formatting, although this study has met the community standards for data availability and the applicable standards for the ethics of experimentation and research integrity, the authors should pay attention to the latest APA guidelines, which provide a valid reference for an intelligible and standard format.

The format has been modified to meet the guidelines of PLOS ONE.

Response to Reviewer #2

Thank you very much for your review and valuable suggestions. We have revised the manuscript based on your comments.

1. Data acquisition is ambiguous. Questionnaires should be filled out in a relatively strict and specific environment. Regarding “By only including universities from which the interview was conducted successfully within time”, please specify this “time” duration.

The data acquisition process has been clarified. We combined the original “3.3 Participant recruitment” and “3.4 Survey dissemination”, retitled the section as “3.3 Data acquisition”, and changed some wording (e.g. “time” duration mentioned in your comments) to make it concise, specific and understandable.

We appreciate your advice about providing a strict and specific environment for filling out the questionnaires. However, universities in Wuhan had not reopened when the survey was carried out from July 24th to August 2nd. Moreover, we wanted to carry out the survey immediately after the online semester ended. Therefore, we had no choice but to conduct the survey through online platforms instead of using paper questionnaires in the classroom in a teacher’s presence. We will pay special attention to data acquisition in future research.

2. Proofreading should be done.

Proofreading has been done to polish the language and modify the format.

Response to Reviewer #3

Thank you very much for your review and for pointing out the problems concerning methodology and data analysis. We have revised the manuscript based on your comments.

1. The theoretical framework section was actually the review of literature. There are a number of related theoretical underpinnings behind virtual teaching (e.g. computer assisted language learning, self-efficacy, teacher education, student motivations, etc). However, the paper lacks an in-depth discussion of how the empirical study was driven by theories.

Thank you for the reminder. The section was retitled as a literature review, and some discussion of the Technology Acceptance Model (TAM) and the major components of a readiness scale was added to provide theoretical background for the study.

2. The description of the measurements was far from clear. Sample questions of each dimension should be delineated in more detail.

The instrument section was improved, and in the findings section we have provided the detailed items of the whole scale.

3. The most critical question lies in the analytical tools. The description of the data analysis, including coding, reliability/validity checking, CFA and the subsequent inferential statistics, was insufficient; the methodology and data analysis were short on robust information. For instance, what does N refer to in Table 3? If N indicates the number of questions of each type, the authors need to justify how 3 questions are sufficient to generate a meaningful pattern of factor loadings. Also, the CFA results were not properly presented.

Thank you for pointing out even the smallest inadequacies in our description. We have double-checked our analysis and improved the description.

After careful consideration, we decided to exclude the details of the CFA. We used scales validated in previous studies with pre-determined dimensions, and the focus of the study was not to validate the scales, so we did not carry out EFA with the pilot sample. We had intended to report details of the CFA with the main sample, and the KMO values indicated that the samples were adequate for CFA, but the final results were problematic. That is why we only included information on AVE and CR in the previous Table 3. However, we still wanted to present the results of the survey, so we adjusted the analysis slightly and excluded the CFA details. We are sorry for this limitation, and we will be more conscientious in future studies.
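For reference, since AVE and CR are mentioned here without definitions: both are conventionally computed from the standardized loadings \lambda_i of the k items on a factor (Fornell & Larcker, 1981):

\mathrm{AVE} = \frac{1}{k}\sum_{i=1}^{k}\lambda_i^{2},
\qquad
\mathrm{CR} = \frac{\left(\sum_{i=1}^{k}\lambda_i\right)^{2}}{\left(\sum_{i=1}^{k}\lambda_i\right)^{2}+\sum_{i=1}^{k}\left(1-\lambda_i^{2}\right)}

Convergent validity is usually considered adequate when AVE exceeds 0.5 and CR exceeds 0.7.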

4. The mixed-methods approach should have augmented the strengths of arguments. However, from the presentation of results, it remains unclear about how qualitative data supplemented the quantitative findings. How were the qualitative data sorted out and coded?

The qualitative data help to better explain the participants’ readiness scores. Information about how the qualitative data were sorted and coded was added in the new sections 3.3 Data acquisition and 3.4 Data analysis.

5. The data interpretation and discussion lead to a “so-what” concern. The researchers disseminated a large-scale survey to teachers and students across the region, which is commendable. Nonetheless, instead of exploring the status quo, the study should address how the research findings can be converted into real pedagogical practices, and how instruction and learning can benefit from the empirical evidence.

We intended to reveal the status quo through the survey, but we failed to detail the pedagogical implications. We have improved the discussion part to explain the implications for educational settings.

Response to Reviewer #4

Thank you very much for your review and constructive suggestions to improve our manuscript. We have revised the manuscript based on your comments.

1. According to the title, the authors claimed that it was the online College English Education “in China” that was investigated. However, only higher education institutions in Wuhan were examined. Therefore, I believe that the authors need to consider whether data gained from the survey conducted in Wuhan were representative enough to represent the condition of online English education in the whole nation (which refers to China).

Initially, we believed the data gained from the survey conducted in Wuhan were representative enough to represent the condition of online English education in China for the following reasons. Firstly, according to the statistics released by the Ministry of Education (MOE), Wuhan (the capital city of Hubei Province) has the largest student population and the second biggest number of HEIs in China. Secondly, college English education in China follows similar patterns guided by the College English Curriculum Requirements released by the MOE.

However, after careful consideration based on your comments, we decided to change “in China” into “in Wuhan”. Our reasons are: 1) Though college students in Wuhan come from all over the country, the majority are from Hubei Province because of the college student recruiting policy; 2) There are differences concerning technological development between different cities.

2. The first sentence of the abstract reads: “China was the first country to migrate almost all teaching, learning and even assessment online in education of all levels against COVID-19” (Lines 12-13). The authors fail to provide any proof of this claim, either in the Abstract or in the main body of the article. I believe it is important to substantiate this statement.

According to the prevention measures against the spread of COVID-19 on campuses issued by the MOE, all schools were closed. HEIs were asked to organize online teaching, and local education authorities were asked to mobilize resources to provide online courses to secondary and primary school students. The first mass outbreak of COVID-19 was in China, which is why the sentence was written in this way. Thank you for the reminder; we have rewritten the sentence as “China migrated all teaching and learning online in education of all levels against the spread of COVID-19 on campuses (MOE document, 2020)”.

3. Some unidiomatic word usage was noticed: “it was not based on nowhere” (Line 37), “even unimagined” (Line 50); and what HEI refers to (Line 40, where this abbreviation appears for the first time) is not explained. The article needs more thorough grammatical checking throughout.

Thank you for your careful checks. The mistakes mentioned have been corrected and the language of the whole manuscript has been polished.

4. The second section should be renamed as “Literature Review” rather than “Theoretical Framework,” considering that most of the discussions in this section were related to the results of the previous research. This means that the authors may need to rewrite the whole section to demonstrate their theoretical foundation. Some introductions to online language education in section 2.1 should have been offered in the first section (Introduction).

The second section was renamed “Literature Review”. The previous 2.1 about online language education in general was moved to the introduction, and the whole section was rewritten, with some relevant information added as theoretical foundation (such as the TAM and definitions of the components mentioned in the readiness scales).

Response to Reviewer #5

Thank you very much for your careful review and valuable suggestions to improve our manuscript. We have revised the manuscript based on your comments.

1. No full name of HEI and SPOC was given.

The full names of HEI and SPOC were added where they first appear in the manuscript.

2. The number of “3.5 Survey dissemination” should be 3.4.

The numbers of the sections have been rearranged.

3. The format of the two headings 4.1& 4.2 is inconsistent.

The format of the manuscript has been revised to be consistent with the journal.

Theoretical Framework:

4. L117 (Line117): Quote format error: names of more than three authors should be cited as “et al.”

The quote format of the manuscript has been revised to be consistent with the journal.

5. L126: How is language teaching different from other subjects?

The research gap that “Language learning online is different from online learning of other subjects, but until now, few studies have focused on student readiness specifically for online language learning” is not fully discussed in terms of the differences between online language learning and online learning of other subjects.

Further explanation was added: Online language learning is different from online learning of other subjects. Unlike other subjects, language is both the means and the end of online learning. In the learning process, the learners are supposed to listen, speak, read and write in the language they are learning. Therefore, whether the online learning environment can provide opportunities for the learners to use the language, and whether the learners feel free to use it online, determines the success of the language course. While studies on readiness abound for general online learning, there seem to be only a few focusing on online language learning.

6. What is your research gap? Is it “studies on student readiness for online learning” or “student readiness specifically for online language learning”?

We have revised the literature review part, further reviewed several studies on learner readiness for online language learning, and summarized our research gap: To the best of the researchers’ knowledge, no study has been conducted to assess Chinese college students’ readiness for online English learning. Fully online teaching of college English was implemented because of the pandemic, and the researchers took this opportunity to carry out the study in order to get a rough idea of college students’ readiness for online English learning.

7. Previously the authors mentioned “Major dimensions included in these scales are student access to/use of technology, online skills and relationships, motivation and interest, self-directed learning, learner control, factors that affect success”, but on page 10 the dimensions of your questionnaire are almost the same as those in previous research.

Yes, we adapted scales widely used in other studies. The dimensions were kept; only some items were changed to measure readiness for online English learning/teaching. The focus of this research is to find out how prepared the students/teachers are for learning/teaching English online, so we did not design a scale from scratch. The novelty lies in the research subjects and the special context: there are few studies measuring Chinese college students’ and English teachers’ readiness for learning and teaching English online, and the context of COVID-19 was new. However, designing a better scale for measuring readiness for online language learning could be a topic for our future research.

8. L76: There are differences between pandemic and epidemic. It should be “COVID-19 pandemic”.

The wording has been changed.

9. L85-89: grammatical mistakes. “It is flexible; can be adaptive; allows for enhanced, individualized, and authentic materials; can take advantage of communicative tasks and multilingual communities; can foster and take advantage of autonomous learning and learner corpora”.

The sentence has been rewritten as “First of all, it is flexible, adaptive and allows for enhanced, individualized, and authentic materials. Secondly, it can take advantage of communicative tasks and multilingual communities. Lastly, it can also foster and take advantage of autonomous learning and learner corpora.”

10. L103 “Many HEIs are developing MOOCs and SPOCs for English education online, but are teachers and students ready for the development? Literature in this domain was relatively scarce.” Before addressing the research gap, the authors mentioned several studies on online education. However, there is literature related to MOOCs for English learning; perhaps the authors could add more literature related to online language learning.

Thank you for your advice and we have rewritten the literature review part to focus specifically on online language learning readiness.

Methodology:

11. Why are there only student respondents from the first and second years?

Because in China, College English is a compulsory course for first- and second-year students. This has been mentioned in the introduction. Following your reminder, we also added related information in “3.3 Data acquisition” to avoid misunderstanding.

12. The specific procedure in which the researchers used the tool, ATLAS.TI 8, for coding needs elaboration.

The coding procedures were added. “Qualitative data obtained from open-ended questions were analyzed using topic and analytical coding (Richards, 2014). Strict procedures were followed to ensure coding reliability. Firstly, the answers to the four questions were uploaded to the qualitative research program ATLAS.ti 8 as separate documents. Secondly, two researchers went through the documents to get a general sense of the data and started coding independently. The open coding and auto-coding functions were combined, but the auto-coding results were checked to avoid inappropriate codes. Categories kept emerging through the process of coding. When this initial stage of independent coding was finished, the two researchers compared their codes and categories to negotiate a final version. Lastly, the categories were further analyzed.”

13. Please make sure that the sentence “Qualitative data derived from the SRS and TRS were analyzed... as well as establishing...” is grammatically correct.

The long sentence has been simplified as “Quantitative data derived from the SRS and TRS were analyzed using SPSS 20.0 and Microsoft Excel. On the one hand, the overall readiness of students and teachers for online college English learning/teaching during the online migration was calculated. On the other hand, the data were checked to see whether there were significant differences between different demographic groups.”

14. L229: it is not clear what 18 refers to here.

The sentence has been deleted as the description was improved.

15. L293-295: there is no statistical support for claims about mainstream platforms like Tencent Classroom.

The statistics were added to make the statement convincing: The mainstream ones were Chaoxing learning app and MOOC platform (40.16%), Icourse163 MOOC platform (28.78%), QQ (27.58%), WeChat (26.36%), VooVmeeting (22.16%), Tencent Classroom (21.30%), Dingtalk (18.10%). Teachers generally had their own choices with the tools by negotiating with students but a common phenomenon was a mixture of different tools. Only a minority of teachers (17.65%) managed to stick with only one platform or tool [60].

Findings:

16. L524-526: the authors mentioned that “perceptions for future online English education, some students think online or blended English courses have the advantage of high efficiency and effectiveness”. But in line479-481, “participants also commented on the relatively low efficiency and undesirable effectiveness of the online English classes compared with F2F ones”. Further analysis of what led to this result is needed.

Further analysis was added: “What drew our attention was students’ opposing perceptions of the efficiency and effectiveness of online English courses. Some students claimed that online English learning had the advantages of high efficiency and effectiveness, while others emphasized its low efficiency and ineffectiveness. By looking at the respondents’ readiness scores and their answers to the open-ended questions, it was found that students with lower readiness scores in the self-directed learning, learner control and online communication self-efficacy dimensions generally considered online English courses less efficient and effective. These students usually lack self-discipline and cannot regulate their own learning without the teacher’s presence; as a result, they found they were benefiting less from the online English courses. Moreover, students who reported poor internet connections or software and hardware failures also considered online learning less efficient. In contrast, students who were capable of self-control and were used to communicating in English online, verbally or in written form, viewed high efficiency and effectiveness as advantages of online English learning, because they could learn at their own pace and had more chances to communicate with others.”

17. In 4.1 the authors mentioned about college English education before, during and after COVID-19. But how to define “after COVID-19”? The overview of college English education before and during COVID-19 was supported by data, but overview of college English education after COVID-19 was not.

The wording was changed to “before and during” to avoid misunderstanding, because we did not carry out follow-up interviews to check the real situation in these universities after the pandemic semester was over.

18. L434-436: Why echo an unpublished paper? 4.3.2 and 4.3.3 are based on two open-ended questions. What about the interview data?

The reason for echoing that paper was to show teachers’ concern about student engagement in online classes. The paper has since been published and is cited in the revised version.

The interview data were analyzed in 4.1 to give an overview of college English education before and during COVID-19. An explanation was added at the beginning of section 4.1: From the interviews, an overall view of college English education in the sampled universities before and during the pandemic was gathered. The information is not directly related to the research questions, but it can help readers better understand the statistical information in the subsequent sections.

19. Concerning the finding on p. 22, “Teachers generally had their own choices with the tools by negotiating with students but a common phenomenon was a mixture of different tools”: the authors could further discuss the phenomenon of the mixed use of different tools. If it comes from previous studies, the authors should give clear citations.

Citations were given for the statistics quoted, and further explanation of the phenomenon of the mixed use of different tools was also added.

Discussion & implication:

20. L526-529: there is no discussion of why students’ perceptions of online courses vary so widely. Some students view high efficiency and effectiveness as advantages of online learning, while other students consider ineffectiveness and low efficiency to be its disadvantages.

Discussion was added: What drew our attention was students’ opposing perceptions of the efficiency and effectiveness of online English courses. Some students claimed that online English learning had the advantages of high efficiency and effectiveness, while others emphasized its low efficiency and ineffectiveness. By looking at the respondents’ readiness scores and their answers to the open-ended questions, it was found that students with lower readiness scores in the self-directed learning, learner control and online communication self-efficacy dimensions generally considered online English courses less efficient and effective. These students usually lack self-discipline and cannot regulate their own learning without the teacher’s presence; as a result, they found they were benefiting less from the online English courses. Moreover, students who reported poor internet connections or software and hardware failures also considered online learning less efficient. In contrast, students who were capable of self-control and were used to communicating in English online, verbally or in written form, viewed high efficiency and effectiveness as advantages of online English learning, because they could learn at their own pace and had more chances to communicate with others.

21. Line 578: why do the authors say “with the exception of pedagogical readiness...”, while on page 23, pedagogical readiness, especially one item, got a low score? It would be better to discuss the low score of item 13.

Discussion was added to explain the low score of item 13 (now item PR6): Teachers had difficulty engaging students online, which was also borne out by the challenges they reported (discussed in the following section). In order to make classes run smoothly, teachers would generally require students to turn on the camera and microphone only when answering questions. Without seeing each other, the teacher cannot use body language and eye contact to encourage students to participate in activities, and the students cannot feel involved either. Also, students tend to get distracted without the teacher’s presence, and thus participate less in class.

22. L599-602 The expression seems confusing and could be polished.

The sentence was polished: Engagement, interaction, and communication can determine the success of an online course and the learner’s learning performance.

Response to Reviewer #6

Thank you for your positive comments on our manuscripts and valuable suggestions to improve it. We have revised the manuscript based on your comments.

1. The samples in this study were mainly collected from one province, which might be one of the limitations of the study.

Initially, we believed the data gained from the survey conducted in Wuhan were representative enough to represent the condition of online English education in China for the following reasons. Firstly, according to the statistics released by the Ministry of Education (MOE), Wuhan (the capital city of Hubei Province) has the largest student population and the second biggest number of HEIs in China. Secondly, college English education in China follows similar patterns guided by the College English Curriculum Requirements released by the MOE.

However, after more careful consideration based on comments from you and other reviewers, we decided to change “in China” into “in Wuhan”. Our reasons are: 1) Though college students in Wuhan come from all over the country, the majority are from Hubei Province because of the college student recruiting policy; 2) There are differences concerning technological development between cities.

2. As for the methodology adopted in this study, although the authors used factor analysis to extract several different dimensions (factors) behind the questionnaire items, they did not further explain those factors, and most of the statistics were used to make comparisons between different variables. I think the authors should add some explanation of those factors before making further comparisons between variables.

Thank you for pointing out this important limitation of the study. After careful thought, we decided to delete the factor analysis part, because the focus of the research is to find out how prepared the students/teachers are for learning/teaching English online and what problems they encountered in this special context. The scales used were adapted from previous research; the factors were kept, and only the wording of the items was changed. However, after conducting this research, we found that designing a better scale for measuring readiness for online language learning would be a valuable topic for our future research.

3. The paper is generally written in quality academic style but there are several language issues worthy of revision. I list several issues below (though it is not intended to be exhaustive):

Thank you very much for your careful checks. The mistakes listed were corrected and the language of the manuscript was polished.

P3 HEI ---Higher Education Institutions (HEIs) √

Mooc---Massive Online Open Course (Mooc) √

P5 utilizes---utilize √

“It is flexible; can be adaptive; allows for enhanced, individualized, and authentic materials; can take advantage of communicative tasks and multilingual communities; can foster and take advantage of autonomous learning and learner corpora….”

The author used a semicolon after each clause. However, most of the clauses lack subjects. If the author insists on using this semicolon structure, I suggest adding subjects to those clauses. Another option is to rearrange the structure of the sentence to make it more integrated.

The sentence was restructured: First of all, it is flexible, adaptive and allows for enhanced, individualized, and authentic materials. Secondly, it can take advantage of communicative tasks and multilingual communities. Lastly, it can also foster and take advantage of autonomous learning and learner corpora.

P7 student access ----student’s access √

was----is? √

“...teachers need different skills than those normally employed by tutors trained to teach languages in a F2F classroom…..”

This sounds confusing. I think the author might consider adding “and” between “tutors” and “trained”.

The sentence was modified: …teachers need different skills than those who are trained to teach languages in a F2F classroom…

P8 lower level skills as --- lower level skills such as ? √

higher level skills as—higher level skills such as ? √

P13 “The survey lasted for ten days.” I suggest the author combine this sentence with the previous paragraph instead of leaving it as a separate paragraph.

The previous “3.3 Participant recruitment” and “3.4 Survey dissemination” were rewritten as a single section “3.3 Data acquisition”, so the sentence was deleted.

P14 “As the SRS and TRS were…” I suggest the author to change “as” into “Since” √

P15 “a combine of” ----a combination of √

P19 “suburb area”---suburban area √

P20 “test were”---test was √

P36 “ ....teachers who have taught online and are planning to teach online or blended English courses.” --- The sentence is ungrammatical.

The sentence was revised: The second group is teachers who have taught or are planning to teach online or blended English courses.

“.... businesses which invest in education technology and want to draw more profits.” --- The sentence is also ungrammatical.

The sentence was revised: The last group is businesses which invest in education technology and want to draw more profits.

4. Page 4: The author mentioned that online English course is different from online courses in other disciplines. Perhaps the author should provide some explanations to illustrate the reason why online English teaching is different and much more difficult to implement.

Thank you for the reminder; explanations were added: Online language learning is different from online learning of other subjects. Unlike other subjects, language is both the medium of instruction and the subject matter of online learning. In the learning process, the learners are supposed to listen, speak, read and write in the language they are learning. Therefore, whether the online learning environment can provide opportunities for the learners to use the language, and whether the learners feel free to use it online, determines the success of the language course.

5. P6 “Literature in this domain was relatively scarce.” This sounds a little incomplete when the author put it at the end of the paragraph.

The sentence was deleted as the whole literature review section was rewritten to be more focused and provide some theoretical foundation.

6. Section “4.1 Overview of college English education before, during and after COVID-19”

In this part, the authors listed the different dimensions derived from the questionnaires. However, the authors did not explicitly explain the dimensions listed in Table 3.

The data analysis part has been rewritten to make the paper more focused. The previous Table 3 was deleted, and the dimensions are explained in the literature review.

Response to Reviewer #7

Thank you very much for your review and valuable suggestions to improve our manuscript. We have revised the manuscript based on your comments.

1. Although the title of the manuscript is “Online College English Education in China against the COVID-19 Pandemic”, the data were collected from one single city in China. It’s hard to say that they can represent the entire country.

Initially, we believed the data gained from the survey conducted in Wuhan were representative enough to represent the condition of online English education in China for the following reasons. Firstly, according to the statistics released by the Ministry of Education (MOE), Wuhan (the capital city of Hubei Province) has the largest student population and the second biggest number of HEIs in China. Secondly, college English education in China follows similar patterns guided by the College English Curriculum Requirements released by the MOE.

However, after more careful consideration based on comments from you and other reviewers, we decided to change “in China” into “in Wuhan”. Our reasons are: 1) Though college students in Wuhan come from all over the country, the majority are from Hubei Province because of the college student recruiting policy; 2) There are differences concerning technological development between cities.

2. Three research questions were listed in the paper, but their relationship is rather unclear. For example, if participants report a lower level of readiness in a certain dimension, they are likely to report greater challenges in that dimension, so the first two questions overlap to a certain extent. I suggest the authors reconsider the research questions to avoid overlap, and if there is any change in the research questions, the entire paper needs to be updated accordingly. It is also necessary to explain clearly why the third research question needs to be explored.

Thank you for giving us insights into the research questions. There is indeed some overlap in the first two questions, but we intended to get a rough idea of students’ and teachers’ readiness levels through the scales and to find out the specific problems they met in this special context. If the problems are specific to the context of COVID-19, students and teachers need not worry; if not, students, teachers and other related personnel should work together to solve them in order to facilitate online English learning/teaching. By analyzing the data, we also found inconsistencies between the reported levels of readiness and the challenges. As for the third question, we thought it necessary because the readiness levels and the challenges did not necessarily determine the participants’ willingness to continue learning English online.

The research question part has been revised to make it clearer for readers, but we are sorry that we could not change the research questions themselves. If we changed them, we would need to carry out the whole study again, and the context has already changed. We will be more careful about research questions in our future research. Thank you again for your suggestions.

3. Although the authors indicate that “All relevant data are within the manuscript and its Supporting Information files” in the Data Availability Statement, I cannot find the data in the manuscript, and no relevant supporting files seem to have been provided. PLOS ONE requires authors to share their data publicly, but right now I can only find the statistical results, which are not the same as the data. Therefore, I strongly recommend that the authors upload their source data.

Our source data files have been uploaded as required.

4. The abstract fails to summarize the key findings from the research. For instance, what are the challenges encountered by the students and teachers?

Thank you for your advice, and the abstract has been rewritten:

A survey of 2310 non-English-major college students and 149 English teachers from three types of twelve higher education institutions in Wuhan was conducted to evaluate their readiness for online English education during the COVID-19 pandemic, to figure out challenges they encountered and to draw implications for future online college English education. Quantitative statistics gathered using two readiness scales adapted from previous studies showed that both cohorts were slightly below the ready level for the unexpected online transition of college English education. The overall level of readiness for students was 3.68 out of a score of 5, and that for teachers was 3.70. Individual differences were explored and reported. An analysis of qualitative results summarized six categories of challenges encountered by the students, i.e. technical challenges, challenges concerning learning process, learning environment, self-control, efficiency and effectiveness, and health concern. Though the students reported the highest level of readiness in technology access, they were most troubled by technical problems during online study. For teachers, among three types of challenges, they were most frustrated by pedagogical ones, especially students’ disengagement in online class. Qualitative data also brought insights for online college English education development. Institutions should take the initiative and continue promoting the development of online college English education, because the majority of respondents reported their willingness and intention to continue learning/teaching English in online or blended courses in the post-pandemic period. Technical barriers should be removed, readiness evaluation and instructor training are also necessary. Language teachers are suggested to pay special attention to students’ engagement and communication in online courses.

5. P2 line22-23: “some challenges coherent with their low scores in certain dimensions from the readiness scales”

Since the authors did not mention what the challenges entail, there is no way for readers to understand the meaning of the sentence.

The abstract was rewritten, and this sentence was deleted.

6. P2 line 24-27: “Qualitative data also showed prospects of growing development of online college English education as the majority of respondents reported their willingness and intention to continue learning/teaching English in online or blended courses in the post-pandemic period.”

This conclusion is not convincing. It is oversimplified to conclude that online college English education has potential to grow only on the basis of the fact that the respondents reported their willingness and intention.

The sentence has been improved: Qualitative data also brought insights for online college English education development. Institutions should take the initiative and continue promoting the development of online college English education, because the majority of respondents reported their willingness and intention to continue learning/teaching English in online or blended courses in the post-pandemic period. Technical barriers should be removed, readiness evaluation and instructor training are also necessary. Language teachers are suggested to pay special attention to students’ engagement and communication in online courses.

7. If the participants were anonymously engaged in the survey, and they were not compensated in any way, how can the authors ensure they were sufficiently motivated to provide genuine answers to the questions? In designing the scales, did the authors include any item to detect lies? Have the quantitative data been filtered in any way before the analysis?

The participants were not compensated because we did not have sufficient funding. We tried to ensure the genuineness of the participants’ answers through two measures. Firstly, the invitation was sent by English teachers to their students. Secondly, when sending the invitation, the teachers also sent the following information: the survey was anonymous and voluntary; it is fine if you do not want to participate, but if you are willing to help, please give honest answers to the questions.

Regrettably, we did not include an item to detect lies. We will be more conscientious in future surveys.

The quantitative data were filtered before analysis. We have added the process to the data acquisition part: Among the 2351 students who completed the student questionnaire, 15 were from universities at which no interview was conducted, and 26 completed the questionnaire in less than 60 seconds (the researchers tried to complete the questionnaire as fast as they could and determined that the minimum acceptable completion time was 60 seconds). These 41 responses were deleted before analysis. For the teacher questionnaire, 151 teachers completed it, 2 of them in less than 45 seconds (determined by the same method), and these were deleted as invalid. This left a sample of 2310 first/second-year students and 149 college English teachers from 12 HEIs (3 directly under the MOE, 6 under the Hubei Provincial Department of Education and 3 non-governmental; diverse in disciplinary areas and student enrollment numbers, and including both research and teaching institutions).
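The trimming rule described above is mechanical enough to sketch; the following Python/pandas fragment uses hypothetical column names and university IDs, with only the 60-second threshold taken from the response.

import pandas as pd

MIN_SECONDS = 60                     # minimum plausible completion time (students)
interviewed = {"U01", "U02", "U03"}  # hypothetical IDs for the interviewed HEIs

raw = pd.DataFrame({                 # stand-in for the survey-platform export
    "university": ["U01", "U02", "U99", "U03"],
    "completion_seconds": [152, 41, 300, 95],
})

# Keep respondents from interviewed universities who took at least 60 seconds
valid = raw[raw["university"].isin(interviewed)
            & (raw["completion_seconds"] >= MIN_SECONDS)]
print(f"kept {len(valid)} of {len(raw)} responses")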

8. P6, line 105: “Literature in this domain was relatively scarce.”

Does this indicate that there is no literature on this topic at all, or that there is only a small number of studies? Please make this clear.

This sentence was deleted, and we have reviewed more studies related to online language learning readiness in the literature review in order to state the research gap clearly.

9. P7 line 126: “few studies have focused on student readiness specifically for online language learning.”

But there are indeed a number of studies on student readiness for online language learning (e.g., Tylor & David, 2013; Luu & Lian, 2020; Mehran et al., 2017). I suggest the authors review the prior studies in a more comprehensive way; the relevant studies should be cited. More importantly, the authors should specify what new contribution this study makes to the field, given the previous studies that have already been conducted. In reviewing the prior studies concerning teacher readiness, the authors also need to point out the inadequacies of these studies, or the issues they failed to address, which made it necessary for the authors to carry out their own study.

Thanks a lot for your recommendation. We have reviewed these studies and some more on online language learning readiness. The two sections “Student readiness for online language learning” and “Teacher readiness for online language teaching” were rewritten according to your suggestions.

10. The following methodological concerns need to be addressed in the study:

What is the purpose of combining qualitative and quantitative approaches in this study?

What methodological framework did the study follow?

The quantitative approach is used to obtain the overall level of readiness among students and teachers, while the qualitative approach probes into the problems behind the low readiness scores. The problems reported by the participants help to better explain their self-reported readiness.

Information about theoretical foundation (such as TAM, definitions of the components mentioned in the readiness scales) was added in section 2 Literature review.

11. A semi-structured interview seems to have been conducted in this study, but the findings from the interview do not seem to be relevant to the research questions the authors intended to address. The authors need to reconsider the purpose of the semi-structured interview. If the interview is necessary, the relevant details need to be reported: How was it designed? What procedures were followed? What questions did the participants answer? How were the answers analyzed?

The semi-structured interview was not directly related to the research questions. It was designed to provide extra information to better explain the results of the questionnaires. Relevant details were added in sections 3.2 Instrument design and 3.3 Data acquisition.

12. The data analysis section does not help readers understand how the data analysis was carried out. The authors only introduced the tools for statistical analysis, which is far from enough. They should also report the statistical approaches used in the analysis, how data trimming was performed, the procedures of data analysis, how the interview was transcribed, how the accuracy of transcription was ensured, etc. The statements about the research instruments should be moved to the Instrument section.

The instrument section, data acquisition section and data analysis section were rewritten according to your suggestions.

13. The results section is not organized logically. I suggest the authors reorganize it to respond to the three research questions. Section 4.1 does not seem to be relevant to any research question. Also, the relationship between 4.2.1 and 4.2.2 is not clear.

Section 4.1 does not respond to any research question. It was written to summarize the information from the interviews, some of which provides extra explanation for the statistical findings.

The previous 4.2.1 focuses on the overall level of readiness of the students, and 4.2.2 compares the differences between different groups of students, for example, students from urban and rural areas.

14. The statistical results were not reported in a standard way. For all results from t tests or ANOVA, degrees of freedom and effect sizes should be reported. For any insignificant results (e.g., lines 440-441), the authors should report the statistical results with exact p values. All statistical symbols (sample statistics) that are not Greek letters should be italicized.

The Findings part was greatly improved according to your advice, and the format was brought in line with common practice. Thank you for kindly pointing out every mistake that we had neglected.
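For illustration, a minimal Python sketch of the reporting convention the reviewer asks for, using simulated readiness scores for two hypothetical groups (the group labels, sizes, means and spreads are all assumptions, not the study's data):

```python
# Minimal sketch: independent-samples t test reported with degrees of
# freedom, the exact p value and Cohen's d; the data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.normal(3.7, 0.6, 1500)   # hypothetical urban-student scores
group_b = rng.normal(3.6, 0.6, 810)    # hypothetical rural-student scores

t, p = stats.ttest_ind(group_a, group_b, equal_var=True)
df = len(group_a) + len(group_b) - 2   # df for a pooled-variance t test

# Cohen's d: mean difference divided by the pooled standard deviation
pooled_sd = np.sqrt(((len(group_a) - 1) * group_a.var(ddof=1)
                     + (len(group_b) - 1) * group_b.var(ddof=1)) / df)
d = (group_a.mean() - group_b.mean()) / pooled_sd

# In the manuscript, the non-Greek symbols t, p and d would be italicized.
print(f"t({df}) = {t:.2f}, p = {p:.4f}, d = {d:.2f}")
```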

15. The quantitative study shows that students were most ready in terms of technology access, which seems to contradict the finding from the qualitative study which indicates that the greatest challenge for students was technical. Does this divergence reflect an inadequacy in the design of the student readiness questionnaire? More explanation is needed.

An explanation of the discrepancy was added in Section 5.1: The students who participated in this survey belong to the so-called generation of digital natives. In accordance with this, they rated themselves ready in the technology access dimension; the majority of them own or have access to computers, smartphones and the internet. Nevertheless, the stability of the internet connection differs from area to area, especially when all students were learning online. That could explain the discrepancy between the students' self-reported readiness level in technology access and the frequently reported problems with internet connection.

16. P27 line 489-491: Despite the teachers’ higher scores of pedagogical readiness in the TRS, their main difficulties during the online semester pertained to pedagogical issues…

Please account for the discrepancy.

This sentence was deleted in the improved manuscript, and the discrepancy was explained in Section 5.1: What needs to be discussed is the contrast between teachers' relatively high level of pedagogical readiness and the frequently reported pedagogical challenges. Teachers claimed to be positive towards online English teaching and were willing to learn new technologies and skills. They also considered themselves capable of designing online English classes. However, during the pandemic semester, their primary concern was the outcome of teaching. In line with their low readiness score in student engagement, the biggest challenge they met was how to effectively engage students in online classes. The problem was not only perceived by the teachers, but also reported by the students as poor communication and interaction….

17. In the Results section, the authors have reported the results concerning the individual differences in students or teachers, but very few of them were summarized and discussed in the Discussion section. The discussion is rather inadequate.

The Discussion section was rewritten to summarize and explain all the findings. Thank you for reminding us.

18. The language of the manuscript is below the standard for publication. It contains too many grammatical and other errors, and some sentences are very difficult to understand. It needs to be proofread thoroughly and carefully by a native speaker of English. Below I’ve listed some of the problems, but such problems are almost everywhere:

Thank you very much for pointing out the language errors. The manuscript was proofread and the language polished. Language problems, including those listed, were corrected.

P5 86-89: “It is flexible; can be adaptive; allows for enhanced, individualized, and authentic materials; can take advantage of communicative tasks and multilingual communities; can foster and take advantage of autonomous learning and learner corpora”

The sentences are fragmented.

The sentence was rewritten as: “First of all, it is flexible, adaptive and allows for enhanced, individualized, and authentic materials. Secondly, it can take advantage of communicative tasks and multilingual communities. Lastly, it can also foster and take advantage of autonomous learning and learner corpora.”

P6 line98: in infancy level > at its infancy √

P6 line113: either fully or hybrid > either fully or hybridly √

P7 line129: Concerning instructors, teaching online learning requires a reconstruction of their…

What does “teaching online learning” mean?

The word “learning” was deleted.

Line550: understanding how was the situation>understanding the situation √

Line 613: there are 3 types interaction

> there are 3 types of interaction

Response to Reviewer #8

Thank you very much for your careful review and valuable suggestions. We have revised the manuscript based on your comments.

1. pp.10-11 “The two questionnaires were initially piloted to check clarity of the language used and to ensure the reliability and validity of the two scales in the local context. Improvements were made in light of the comments from pilot respondents and two experts in research methodology. Both scales were statistically reliable and valid with the pilot tests.”

-- Could you please describe in more detail what “improvements” or modifications you did on the questionnaires? In addition, what are the results of the statistical analyses for the evaluation of the reliability and validity of the questionnaires in the pilot tests?

This part was rewritten as: The two questionnaires were initially piloted to check the clarity of the language used and to ensure the reliability of the two scales in the local context. Both scales were statistically reliable, with Cronbach’s alpha coefficients of 0.961 and 0.859 respectively. Improvements were made in light of the comments from pilot respondents and two colleagues with expertise in questionnaire design. The improvements included removing one item from the TRS that overlapped with another and clarifying language ambiguities. For instance, the item “When my computer hardware or software has technical problems, someone and/or some resources can help me” was modified to “When technical problems occur during online teaching, someone (colleagues, family members, platform technical support) and/or some resources (manuals, videos) can help me.”
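As a small illustration of the reliability check described above, a Python sketch computing Cronbach's alpha from an item-response matrix; the sample size, item count and simulated responses are assumptions, not the pilot data:

```python
# Cronbach's alpha from an item-response matrix (rows = respondents,
# columns = Likert items); the data below are simulated for illustration.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var_sum / total_var)

rng = np.random.default_rng(1)
latent = rng.normal(size=(104, 1))                           # pilot-sized sample
responses = latent + rng.normal(scale=0.5, size=(104, 20))   # 20 correlated items
print(f"Cronbach's alpha = {cronbach_alpha(responses):.3f}")
```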

2. p.11 “a semi-structured interview was conducted between the researchers and the personnel in charge of college English education from the sampled universities or colleges to get an overall view of the situation before, during and after the pandemic semester for the sake of better understanding of the statistical results of the survey.”

-- Could you describe in detail how the interview was administered? In particular, how did you choose the participants of the interview? What was the procedure of the interview?

The participants in the interview were invited by the director of our department. These participants are in charge of college English education at their universities, so they know the details of the online English teaching during the pandemic semester. Thanks to your advice, the procedure of the interview was added to the manuscript; the relevant information follows:

In order to involve universities at all levels, the director of the School of Foreign Languages at the researchers’ university purposively sent invitations to her counterparts at 19 institutions (4 directly under the MOE, 10 under the Hubei Provincial Department of Education and 5 non-governmental) to recruit teacher and student respondents who teach or learn college English courses. All of them replied and agreed to help, though without promising a satisfactory response, because participation was voluntary and responses were anonymous.

The questionnaires were disseminated online from July 24th to August 2nd, 2020, after the semester had ended at all the universities. Both were posted on one of the most widely used online survey platforms in China, powered by https://www.wjx.cn/, and only those invited by their college English teachers had access to participate. The survey was set to allow only one submission per respondent for the sake of data integrity.

The interviews were conducted during the same time period by the researchers with the personnel in charge of college English education from the sampled universities or colleges on a one-to-one basis through email or online chatting tools. Written answers were copied directly. Oral ones were transcribed automatically by chatting tools first and then checked by one of the researchers before further analysis.

3. p.14 “Cronbach’s alpha coefficients were calculated to ensure internal consistency and confirmatory factor analysis was carried out to provide evidence for convergent validity.”

-- The readers may also be curious about other results of the CFA than those you reported in Table 3. In particular, what is the overall result/significance of the model?

The corresponding part was rewritten to exclude details about the CFA. We used scales validated in previous studies with pre-determined dimensions, and since the focus of the study was not to validate the scales, we did not carry out EFA with the pilot sample. We intended to present the details of the CFA with the main sample, and the KMO values indicated that the samples were adequate for CFA, but the final results were problematic. That is why we only included the AVE and CR values in the previous Table 3. However, we still wanted to present the results of the survey, so we adjusted the analysis slightly and excluded the CFA details. We are sorry for this limitation and will be more conscientious in future studies.
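For context, the AVE and CR values mentioned here follow standard formulas over standardized factor loadings; the sketch below uses hypothetical loadings for a single five-item dimension, not the study's estimates:

```python
# Average variance extracted (AVE) and composite reliability (CR) from
# standardized factor loadings; the loadings below are hypothetical.
import numpy as np

loadings = np.array([0.72, 0.68, 0.81, 0.75, 0.70])

ave = np.mean(loadings ** 2)   # AVE = mean of squared loadings
cr = loadings.sum() ** 2 / (loadings.sum() ** 2 + np.sum(1 - loadings ** 2))

print(f"AVE = {ave:.3f}  (commonly judged adequate above 0.50)")
print(f"CR  = {cr:.3f}  (commonly judged adequate above 0.70)")
```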

Response to Reviewer #9

Thank you very much for your careful review and valuable suggestions. We have revised the manuscript based on your comments.

Introduction and theoretical framework:

1. On page 4. The author mentioned that “…lessons can be and should be drawn for future development of online college English education”. It is necessary for the authors to clarify why and how the examination of emergency remote learning that occurred during Covid-19 can hold implications for that in the non-pandemic periods. Remote learning during a pandemic has characteristics that are not present during non-pandemic periods.

Thank you for pointing out the problem. The part has been rewritten: It was different from well-prepared and planned online teaching. Generally speaking, for college English courses in China, online learning currently acts as a complement to classroom teaching. Learning platforms that accompany the textbooks and self-developed MOOCs or Small Private Online Courses (SPOCs) are the mainstream tools for English teachers to implement online or blended teaching. Problems may occur, but at a low frequency, and they do not cause much anxiety because classroom teaching is available. During the pandemic, however, online English learning was the only compulsory means rather than a complementary one. Problems occurred frequently, especially at the beginning, and caused anxiety on both the teaching and learning sides. Some problems might be specific to the pandemic context, while others might be common even in non-pandemic periods. Therefore, now that the semester has ended smoothly and successfully, lessons can and should be drawn for the future development of online college English education. This research aims to draw implications for the development of online college English education by measuring the readiness levels of students and teachers for the online transition and probing into the problems they met in this particular context.

2. In the section titled theoretical framework, it seems that the authors did not elucidate clearly or even did not elucidate any theories behind this study. The authors need to consider introducing a theory (theories) supporting this study.

The section was renamed Literature review, and we have added relevant information (such as the TAM and definitions of the major components of a readiness scale) as the theoretical foundation.

3. The empirical studies reviewed are not those conducted during the period of Covid-19. The authors need to consider focusing the literature review on or most of the reviewed should be from the pandemic period.

We have added a few studies conducted during the pandemic in Section 2.4, Online language teaching during the COVID-19 pandemic.

4. On page 6. The author mentioned “…but are teachers and students ready for this development (developing MOOCs and SPOCs)?” I think there are some studies exploring student and teacher readiness for MOOCs and SPOCs. Also, there is a difference between readiness for MOOCs and that for emergency remote learning. Authors should focus on reviewing studies that examined remote learning during emergency situations.

We have rewritten the whole literature review to make it more focused and to provide some theoretical foundation. Studies on online language learning during the pandemic were also reviewed in Section 2.4, Online language teaching during the COVID-19 pandemic.

Methodology:

1. The authors need to clarify how the questionnaires were translated into Chinese.

Clarification was added in the instrument design part: The questionnaires were translated by one of the researchers, who holds a master’s degree in translation, and double-checked by a colleague with a master’s degree in translation.

2. Authors need to explain how the data were filtered and how invalid data were removed.

A clearer explanation of how the data were filtered was added: Throughout the survey, personnel in charge of college English education from 16 universities participated in the interviews. Among them, 4 were excluded because the researchers received no teacher response or fewer than 10 student responses from their universities. Among the 2351 students who completed the student questionnaire, 15 were from universities where no interview was conducted, and 26 completed the questionnaire in less than 60 seconds (the researchers completed the questionnaire themselves as fast as they could and determined that the minimum acceptable completion time was 60 seconds). These 41 responses were deleted before analysis. For the teacher questionnaire, 151 teachers completed it, and 2 of them did so in less than 45 seconds (determined by the same method mentioned above); these were deleted as invalid. Therefore, a sample of 2310 first/second-year students and 149 college English teachers from 12 HEIs (3 directly under the MOE, 6 under the Hubei Provincial Department of Education and 3 non-governmental; diverse in disciplinary areas and student enrollment numbers, and including both research and teaching institutions) was generated.
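The screening described above amounts to two simple filters; a minimal pandas sketch, with hypothetical column names, university labels and completion times standing in for the real export fields, illustrates the logic:

```python
# Hypothetical raw export: one row per submission, with the respondent's
# university and the platform-recorded completion time in seconds.
import pandas as pd

students = pd.DataFrame({
    "university": ["A", "B", "C", "A", "D"],
    "seconds":    [185, 52, 240, 310, 95],
})
interviewed = {"A", "C", "D"}   # universities with a completed interview

valid = students[
    students["university"].isin(interviewed)   # keep schools with an interview
    & (students["seconds"] >= 60)              # drop sub-60-second submissions
]
print(f"{len(students) - len(valid)} responses removed, {len(valid)} retained")
```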

3. What is the purpose of conducting confirmatory factor analysis on both pilot and final sample? Typically, we use the pilot sample for exploratory factor analysis and the main sample for confirmatory factor analysis.

The corresponding part was rewritten to exclude details about the CFA. We used scales validated in previous studies with pre-determined dimensions, and since the focus of the study was not to validate the scales, we did not carry out EFA with the pilot sample. We intended to present the details of the CFA with the main sample, and the KMO values indicated that the samples were adequate for CFA, but the final results were problematic. That is why we only included the AVE and CR values in Table 3. However, we still wanted to present the results of the survey, so we adjusted the analysis slightly and excluded the CFA details. We are sorry for this limitation and will be more conscientious in future studies.

Findings:

1. Section 4.1 could be written more concisely. And it can be considered in the methodology section as research background.

Putting Section 4.1 in the methodology section as research background is a nice suggestion. We tried, but it did not work: Section 4.1 is a summary of the interviews, so it actually belongs to the results. We also considered deleting the interview summary, but we were afraid that readers would then question how we obtained the information. Therefore, we kept the section in place but removed some information.

2. The results would be more informative if demographic characteristics were in a regression model as predictor variables to assess their predictive power on each indicator of readiness. It seems that prior experiences with online education should be an important factor influencing readiness. Did the authors consider this factor?

We appreciate your advice about the regression model, but we prefer not to complicate the research in this paper any further. Thank you also for the reminder about prior experience as a factor. Upon your suggestion, we added relevant information in the discussion: The study failed to consider previous experience with online education as a variable, but students from different types of institutions were found to have different levels of readiness. The reason might be that the three sampled non-governmental universities relied fully on traditional teaching for college English education, while the others had already implemented blended teaching or online learning prior to the pandemic.
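Should a future study take up the reviewer's suggestion, a sketch of such a regression in Python with statsmodels; all variable names (readiness, gender, area, prior_online) and values are hypothetical, simulated for illustration only:

```python
# OLS regression of overall readiness on demographic predictors; every
# variable name and value here is hypothetical, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "readiness": rng.normal(3.7, 0.5, n),
    "gender": rng.choice(["F", "M"], n),
    "area": rng.choice(["urban", "rural"], n),
    "prior_online": rng.choice([0, 1], n),   # prior online-learning experience
})

model = smf.ols("readiness ~ C(gender) + C(area) + prior_online", data=df).fit()
print(model.summary())   # coefficients give each predictor's estimated effect
```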

3. The authors examined whether there were significant differences in overall readiness by grade and gender, but what about each indicator of readiness?

We have checked, and there was no statistically significant difference by grade or gender for any individual indicator of readiness. We did not report these results to avoid overcomplicating the paper; had there been significant differences, we would have reported them.

4. The authors examined differences in readiness for online education across disciplines. However, as I understand it, the authors wanted to examine the readiness of students for online English education. So why is it important to examine readiness for English online classes across disciplinary backgrounds?

In China, all college students, except those majoring in English, have to study college English. According to the researchers’ teaching experience, students with different disciplinary backgrounds have different levels of English language efficacy and different attitudes towards English learning. That is why individual differences were examined across disciplinary backgrounds.

5. The authors examined both student and teacher readiness. And what is the connection between them? Why is it important to discuss both the teachers’ and the students’ readiness in one article if the connection between was not explicitly examined?

Thank you for raising this question. To be honest, we somewhat regret the decision to discuss both, which made the research much more complicated than we had expected. However, having come this far, we did not want to give up. Our justification is that the teacher and the students acting together determine the success of an online course, so readiness levels from both cohorts are important. Exploring the connection between teachers’ and students’ readiness is a possible topic for future research.

6. The authors used interviews to examine the challenges students faced. I wondered why the purpose of the interviews was not to examine the factors that influence success and failure in readiness for Covid-19 emergency remote learning. This would make the article more focused.

We thought that success and failure would be better determined by learning outcomes, such as test results, so we considered it more objective to ask respondents to report the challenges they encountered. However, we will consider this direction for possible future studies.

7. The research question addressed in section 4.3.3 was “Are you willing to learn/teach English …in the future?” How would authors define the future? The future with a pandemic or emergency situations? Or a future without any emergency situations?

This is our fault. We did not specify “the future”, and we did not realize the problem until we got the final results, when a number of respondents mentioned a willingness to have online classes in emergencies while still preferring traditional classes in normal situations. The pilot tests, with a sample of only 37 teachers and 104 students, did not reveal the problem. Thank you for pointing it out; we have added this information as a limitation of the study and will be more cautious in future instrument design.

Discussion

1. The discussion needs to be deeply integrated with the relevant literature conducted during Covid-19.

Thank you for the suggestion; we have integrated several studies conducted during the pandemic into the discussion.

2. The pedagogical implications should be stated more specifically and concretely.

The pedagogical implications were stated more specifically in the new section 5.2 Implications for educational settings.

Response to Reviewer #10

Thank you very much for your careful review and valuable suggestions. We have revised the manuscript based on your comments.

1. The authors failed to evaluate the context of the nation-wide online teaching in a justifiable way. They claimed that “This online teaching in the face of COVID-19, or exactly, online triage (Gacs et al. 2020), was carried out without need analysis or readiness evaluation from both learning and teaching sides and was different from well-prepared and planned online teaching.” But actually this was not the case: before the epidemic, online teaching had already served as a positive supplement to offline teaching, and it was available and accessible to students in most parts of China, in both rural and urban areas; many university websites and apps offer free online courses, which quite a number of students make use of regularly in their spare time; great efforts have been made by education authorities, especially the Ministry of Education, to analyze and evaluate the situation, to invest considerably in providing online teaching resources, and to mobilize education-related IT companies to give a helping hand; and schools and teachers had tried their best to make preparations before the launch of the online courses. In short, this nationwide online teaching experience was not unprepared and unplanned at all. So, the authors should adjust their wording throughout the whole paper so that the basis of this research does not seem groundless.

Thank you for pointing out the problem. The advancements in online education in China and the efforts made by the MOE and the schools were mentioned in the introduction, but a careful check of the whole paper still revealed a lot of inappropriate diction, as you said. The wording has been adjusted to make the paper more objective.

2. Get the paper checked by expert speakers. The sentences are extremely long, which makes reading the text challenging. And occasionally, there are some inappropriate wording or even grammatical mistakes. All these may reduce the readability of this paper to some extent.

The language of the whole paper was polished.

3.The teacher sample is relatively small, which makes the analysis and results less convincing.

Yes, the teacher sample was small because participation was voluntary and we wanted to include participants from as many institutions as possible. We asked the directors of the College English Departments to send the invitation only once, instead of urging the teachers to participate. For future research, we will offer compensation to recruit more participants.

4.The analysis would be more reasonable and sound, if the subjects of teachers and students were divided into two groups, rural and urban.

The teacher subjects were not divided into rural and urban groups because all the sampled institutions are located in cities. In Table 4, we analyzed the difference in readiness levels between students from rural and urban areas.

5. It would be better to make the review of literature more pertinent to online English teaching. In its present form, the paper focuses too generally on online teaching as a whole. Thus it is advisable for the authors to give a very brief review of general online teaching but to concentrate more on English teaching, just as the authors claim in the paper that online teaching is carried out differently to cater for the needs of different subjects or disciplines.

Thank you for your advice. The literature review section was rewritten to make it more pertinent to online language education, especially readiness for online language learning and teaching.

Attachment

Submitted filename: Response to Reviewers.docx

Decision Letter 1

Di Zou

10 Sep 2021

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: (No Response)

Reviewer #5: (No Response)

Reviewer #6: All comments have been addressed

Reviewer #8: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #5: Partly

Reviewer #6: Yes

Reviewer #8: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #5: N/A

Reviewer #6: Yes

Reviewer #8: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #5: Yes

Reviewer #6: Yes

Reviewer #8: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #5: Yes

Reviewer #6: Yes

Reviewer #8: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Thank you for allowing me to review your manuscript again. This time the authors have made great efforts to enhance their manuscript. Indeed, the manuscript in this version is better than the previous version. Since the manuscript has undergone major revisions in terms of the overall arrangement, the authors can make further improvements to render the manuscript clearer and more persuasive.

First, the authors can further emphasize the central part of “readiness”. Since this concept is the keyword, the authors can highlight the significance of “readiness” in online EFL.

Besides, the authors can further stress their contribution to technological development and pedagogical practices to demonstrate the practicality of this manuscript. The potential readers can also grasp the persuasive information concerned with the target field.

Finally, to guarantee a more explicit manuscript structure, the authors can answer the research questions or hypotheses in the Conclusion section. Thus, the manuscript can tell the readers what has been done to investigate particular questions.

Based on the previous evaluation, the authors can provide a minor revision to make a more excellent and persuasive manuscript.

Reviewer #5: The authors did improve the quality and readability; however, I am still not confident recommending this manuscript for publication in PLOS ONE. The most important reason is that it is no longer the right time to publish articles related to people's readiness and challenges during COVID-19, since it is already the post-pandemic era.

Reviewer #6: In the abstract, the sentence "Technical barriers should be removed, readiness evaluation and instructor training are also necessary." seems ungrammatical. It needs to be further polished. In addition, the abstract starts with the method part, which is OK. But I think it is better to give some background introduction at the very beginning of the abstract to illustrate the research domain.

Reviewer #8: PONE-D-21-14306_R1

The authors have well addressed the concerns of the reviewers, and I very much appreciate what the authors have done in the revision.

With that said, I would suggest the authors closely read the manuscript again, although the language has very much improved compared with that of the last version. One example lies in Line 55, p.4, where "at a short notice" seemingly should be "at short notice" (see https://www.collinsdictionary.com/dictionary/english/at-short-notice and https://www.ldoceonline.com/dictionary/at-short-notice).

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Yu Zhonggen

Reviewer #5: No

Reviewer #6: No

Reviewer #8: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2021 Oct 1;16(10):e0258137. doi: 10.1371/journal.pone.0258137.r004

Author response to Decision Letter 1


13 Sep 2021

Response to Reviewer #1

Thank you for allowing me to review your manuscript again. This time the authors have made great efforts to enhance their manuscript. Indeed, the manuscript in this version is better than the previous version. Since the manuscript has undergone major revisions in terms of the overall arrangement, the authors can make further improvements to render the manuscript clearer and more persuasive.

Thank you very much for reviewing our manuscript again and for recognizing our efforts to improve it. The improvements could not have been made without your suggestions and those of the other reviewers. We are grateful for your further suggestions, and the remaining points are addressed as follows.

First, the authors can further emphasize the central part of “readiness”. Since this concept is the keyword, the authors can highlight the significance of “readiness” in online EFL.

Relevant studies were summarized to highlight the significance of “readiness” in online EFL in Section 2.2:

Several studies have highlighted the significance of e-readiness in online learning from different perspectives. Moftakhari [32] claimed that the success of online learning relies entirely on learners’ and teachers’ readiness levels, which seems an absolute claim. Piskurich [33] believed a low readiness level was the main reason for failure in online learning. Students’ e-learning readiness was statistically proven to be a significant predictor of their satisfaction with online instruction [34-36]. Therefore, assessing student readiness for online learning is highly relevant prior to delivering a course either fully online or in hybrid form, and promoting student readiness is essential for successful online learning experiences [37].

Besides, the authors can further stress their contribution to technological development and pedagogical practices to demonstrate the practicality of this manuscript. The potential readers can also grasp the persuasive information concerned with the target field.

Implications for teaching staff, institutions and IT companies were listed in Section 5.2. Upon your advice, we checked whether the implications corresponded with the findings and made some improvements.

Finally, to guarantee a more explicit manuscript structure, the authors can answer the research questions or hypotheses in the Conclusion section. Thus, the manuscript can tell the readers what has been done to investigate particular questions.

The conclusion was rewritten to answer the research questions briefly in order to achieve a more explicit manuscript structure. Since detailed explanations are included in Sections 4.2-4.4 and Section 5.1, the conclusion was kept concise.

Response to Reviewer #5

The authors did improve the quality and readability; however, I am still not confident recommending this manuscript for publication in PLOS ONE. The most important reason is that it is no longer the right time to publish articles related to people's readiness and challenges during COVID-19, since it is already the post-pandemic era.

Thank you for your favorable comments on our revised version. We were aware of the timeliness of the submission, but it took us much longer than expected to analyze the data and write the draft. Unfortunately, the pandemic still persists in many parts of the world, even as people try to live with the virus in the post-pandemic era. Here in China, classes were occasionally moved online in several cities when local cases were confirmed, and many Chinese students enrolled in international programs were taking remote classes due to restrictions on international travel. In addition, though the research was conducted during the pandemic and the analysis focused on people’s readiness and challenges during that period, the results can provide some implications for online English teaching in general, as indicated in the manuscript. Therefore, we sincerely hope to have the opportunity to publish this research. Moreover, inspired by your comment, we may explore more issues in online language teaching in the post-pandemic era.

Response to Reviewer #6

1. In the abstract, the sentence "Technical barriers should be removed, readiness evaluation and instructor training are also necessary." seems ungrammatical. It needs to be further polished.

Thank you for carefully reviewing our manuscript again. The sentence was ambiguous and it was rewritten as: They are supposed to remove technical barriers for teachers and students, and assess the readiness levels of both cohorts before launching English courses online. Institutions should also arrange proper training for instructors involved, especially about pedagogical issues.

The language of the manuscript was polished again upon suggestions from you and another reviewer.

2. In addition, the abstract starts with the method part, which is OK. But I think it is better to give some background introduction at the very beginning of the abstract to illustrate the research domain.

Thank you for your advice; we have taken it. The following sentences were added at the beginning of the abstract:

Online education, including college English education, has been developing rapidly in the recent decade in China. Such aspects as e-readiness, benefits and challenges of online education were well-researched under normal situations, but fully online language teaching on a large-scale in emergencies may tell a different story.

Response to Reviewer #8

The authors have well addressed the concerns of the reviewers, and I very much appreciate what the authors have done in the revision.

With that said, I would suggest the authors closely read the manuscript again, although the language has very much improved compared with that of the last version. One example lies in Line 55, p.4, where "at a short notice" seemingly should be "at short notice" (see https://www.collinsdictionary.com/dictionary/english/at-short-notice and https://www.ldoceonline.com/dictionary/at-short-notice).

Thank you for reviewing our manuscript again and giving us positive comments on our last revision. We are so grateful because the manuscript couldn’t have been improved without constructive suggestions from you and other reviewers.

We apologize for careless mistakes like the one you pointed out. The language was polished again and has hopefully improved. We are truly grateful for and inspired by your meticulousness.

Response to Journal Requirements

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

The reference list was checked again and no retracted paper was cited. However, while revising the manuscript, two new references were added and the order of several references changed. The reference list in this revised version is complete and correct.

Attachment

Submitted filename: Response to Reviewers.docx

Decision Letter 2

Di Zou

20 Sep 2021

Online College English Education in Wuhan against the COVID-19 Pandemic: Student and Teacher Readiness, Challenges and Implications

PONE-D-21-14306R2

Dear Dr. Li,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Di Zou

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Acceptance letter

Di Zou

24 Sep 2021

PONE-D-21-14306R2

Online College English Education in Wuhan against the COVID-19 Pandemic: Student and Teacher Readiness, Challenges and Implications

Dear Dr. Jin:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Di Zou

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 Table. Demographics for student respondents.

    (TIF)

    S2 Table. Demographics for teacher respondents.

    (TIF)

    S3 Table. Means (M) and Standard Deviations (SD) of the SRS (N = 2310).

    (TIF)

    S4 Table. Independent samples t-test of area.

    (TIF)

    S5 Table. Means (M) and Standard Deviations (SD) of the TRS (N = 149).

    (TIF)

    S6 Table. Categories of challenges reported by students (codes and frequencies).

    (TIF)

    S7 Table. Categories of challenges reported by teachers (codes and frequencies).

    (TIF)

    S1 Fig. Overall readiness (M) of students from different disciplinary areas.

    (TIF)

    S1 File. Questionnaire and interview results.

    (ZIP)

    Attachment

    Submitted filename: review results.docx

    Attachment

    Submitted filename: Response to Reviewers.docx

    Attachment

    Submitted filename: Response to Reviewers.docx

    Data Availability Statement

    All relevant data are within the manuscript and its Supporting Information files.

