Abstract
The purpose of this study was to investigate the effectiveness of web-based biology learning environment in improving academic performance via a meta-analysis. To locate studies on web-based biology learning environment, several keyword patterns drawn from the abstracts (e.g., Pattern 1: web-based learning and biology education) were searched in well-known databases (e.g., ERIC, EBSCO, Springer Link). Finally, 22 papers examining the effect of web-based biology learning environment on academic performance were eligible for the current meta-analysis. All statistical data from the studies were initially entered into an Excel sheet and then imported into Comprehensive Meta-Analysis (CMA) software to calculate Hedges’ g values. The overall effect-size of web-based biology learning environment pointed to a medium effect. Also, it was found that educational level and type of experimental design, as moderator variables, did not positively affect the students’ academic performance along with web-based biology learning environment. In light of the findings, it can be concluded that web-based biology learning environment is somewhat effective at improving students’ academic performance. The current study recommends that further studies should be undertaken to deepen the implementation processes of the studies with extreme values and explore what makes them unique.
Keywords: Academic performance, Biology education, Educational level, Meta-analysis, Web-based learning environment
Introduction
Web-based learning environment, as a hypermedia-centred instructional programme, enables researchers to build a meaningful learning environment by means of the attributes and resources of the World Wide Web (e.g., immediately updating, disseminating, and sharing data) (Khan, 1997; Rosenberg, 2001). Hence, it not only proposes to enhance and support student learning but also offers an alternative pedagogy for science learning (Khan, 1997; Morente et al., 2014). Further, it gives an opportunity for students to control information, access other relevant resources, communicate with peers, and upload any related information to the database if necessary (Castro & Tumibay, 2021). Moreover, it establishes synchronous and/or asynchronous communication (e.g., e-mail, file transfer, discussion and news groups) between students and teachers through interactive web pages and internet services such as chat rooms (Tüysüs & Aydın, 2007; Bell & Federman, 2013). Also, it helps teachers record and save data about students’ learning processes and outcomes. Because web-based learning, online learning, virtual laboratories, and e-learning generally use the internet or webpages/sites (Mioduser et al., 2000; Santos & Prudente, 2022), the current paper prefers calling them web-based learning environment (WBLE) that facilitates student learning.
The WBLE also presents several challenges that need to be well planned for and handled before its use. In view of Torum (2003), three main elements (infrastructure or logistics, educational, and economic) directly influence the effectiveness of the WBLE or its degree of achievement. That is, it requires a solid infrastructure, such as computer hardware and an internet connection. Also, since it requires students to have certain skills (e.g., the use of computers and the internet) and affective features (e.g., self-regulated learning, self-confidence, and responsibility), its efficiency may be limited at some grades or educational levels. Further, because it needs such technological tools as computers, internet connections, software, and hardware, its technological cost increases the economic burden on education. For this reason, its pros and cons should be carefully considered to cultivate its fruitfulness. In other words, a good SWOT (strengths, weaknesses, opportunities, and threats) analysis is needed prior to integrating the WBLE into science learning. Indeed, a meta-analysis of the WBLE may serve to illuminate such a SWOT analysis.
The Use of the WBLE in Biology
Because biology contains abstract and complex concepts or relationships (Bayrak et al., 2021; Bennett, 2003; Kumandaş et al., 2019; Schmid & Telaro, 1990), students find it difficult to learn. Indeed, such learning difficulties (generally called misconceptions or alternative conceptions) undermine their advanced learning of biology and interconnected concepts (e.g., sustainable development, biochemical cycles, and green pesticides) as well as their interest in biology or science (Bahar et al., 1999; Baram-Tsabari et al., 2010; Byukusenge et al., 2022; Candaş & Çalık, 2022; Kumandaş et al., 2019). Furthermore, some factors (e.g., a lack of laboratory or experimental facilities, and biology teachers’ low self-confidence and competencies regarding the biology laboratory and related activities) mostly direct teachers to use traditional instruction (e.g., chalk and talk), which gives students limited opportunities to experience biology and engage in biology learning. Given the foregoing issues, the WBLE is a promising alternative way to achieve better biology learning. For example, it enables students to visualize the structures and functions of biological systems and the interactions between them. Thus, it bridges students’ understanding of biology to daily-life issues. Moreover, since it enriches the learning environment using technological tools such as audio, video, graphics, animations, and simulations, it not only increases students’ conceptual understanding and academic achievement but also makes the biology course fun and enjoyable (Conejo et al., 2016; Incantalupo et al., 2014; Wesolowski, 2008). For example, some biology educators have used the WBLE to facilitate students’ biology learning (e.g., Dean, 2004; Gibbins et al., 2003; Hill, 2013; Kopec, 2002; Musawi et al., 2021) and enhance their academic performance (prioritizing achievement and understanding) (e.g., Ertan, 2019; Gardner & Belland, 2017; Gelbart & Yarden, 2006; Nugraini et al., 2013; Son et al., 2016).
Previous studies have also emphasized that the WBLE should be well designed to intertwine web-based technologies with biology learning or biology classes. Of course, this raises an important question: ‘How effective is this integration?’ Therefore, this question calls for a comprehensive meta-analysis to portray the effectiveness of the WBLE on academic performance. Fortunately, some researchers have undertaken the task of reviewing and synthesizing the WBLE studies. The subsequent section illuminates these earlier efforts.
Previous Meta-analysis Studies on the WBLE
The related literature includes several meta-analysis studies focusing on the effectiveness of the WBLE on academic performance. For example, Orhan and Durak-Men (2018) examined the effects of web-based teaching on students’ achievement and attitudes towards the science course by handling 32 Turkish studies published from 2007 to 2017. They found a relatively large effect for achievement and a moderate effect for attitudes towards the science course. Santos and Prudente (2022), who included 15 studies (7 studies for chemistry, 5 studies for physics, 2 studies for biology, and one study for earth science) from 2015 to 2020, reported that the use of virtual laboratory activities had a medium effect. Sitzmann et al. (2006), who included 96 papers concentrating on college students or employees, stated that web-based instruction was more effective than classroom instruction for teaching declarative knowledge in some circumstances (e.g., providing control in long-term courses, practicing the training material, and receiving feedback during training). Ulum (2022) analysed 27 studies from different disciplines (art, biology, English, math, nature, science, and social studies), which were conducted with middle school students (grades 4–8). He reported that the overall effect-size of the online learning environment fell into a medium effect for academic achievement and that the moderator variables (country, school subject, grade, and online education approach) were non-significant. Similarly, Ergül-Sönmez and Çakır (2021), who handled 19 papers from various disciplines (English, natural sciences, statistics, basketball, history, biostatistics, and computer) to examine the effect of Web 2.0 technologies on academic performance, found a moderate effect.
Moreover, there have been three meta-analysis studies in the health sciences examining the effectiveness of online (e-learning) or web-based learning environments on different learning outcomes. For instance, Richmond et al. (2017), who included 14 studies from 2000 to 2015, investigated the effectiveness of online and alternative clinical intervention methods on knowledge acquisition, practical skills, clinical behaviour, self-efficacy, and satisfaction. They found no difference between online training and an interactive workshop for clinical behaviour, and no difference between online methods and the workshop or lecture for knowledge. Likewise, Kang and Seomun (2018), who handled 11 studies from 2000 to 2016, examined whether the use of web-based nursing educational programmes increases the participants’ knowledge and clinical performance. They reported statistically significant differences for the overall effect, blended programmes, and short-term interventions (2 weeks or 4 weeks). Pei and Wu (2019), who included 16 studies from 2000 to 2017, found a statistically significant difference between online and offline learning for knowledge and skill outcomes.
To examine how the WBLE influences different learning outcomes, the foregoing meta-analysis studies have involved such disciplines as health/medical education (Richmond et al., 2017; Kang & Seomun, 2018; Pei & Wu, 2019; Regmi & Jones, 2021), work-integrated education (Sitzmann et al., 2006), science education (Orhan & Durak-Men, 2018; Santos & Prudente, 2022), and English, natural sciences, and so forth (Ergül-Sönmez & Çakır, 2021; Ulum, 2022). Even though some of the meta-analysis studies included a few papers on biology education (e.g., Santos & Prudente, 2022; Ulum, 2022), none of them has explicitly focused on how web-based biology learning environment affects students’ academic performance. Such an unexplored area calls for the present meta-analysis.
Rationale and Significance of the Study
The Covid-19 pandemic has forced educators, researchers, and policy makers to rethink alternative ways of learning, e.g., web-based learning, online learning, e-learning, and virtual laboratories (Adi et al., 2021; Bojović et al., 2020; Ray & Srivastava, 2020; Reeves & Crippen, 2021; Rodríguez et al., 2021). Indeed, any decision about these alternatives needs plausible and up-to-date evidence. For instance, a meta-analysis including recently published studies may provide invaluable evidence and insights for them. Interestingly, none of the previous meta-analysis studies has concentrated on how the use of web-based biology learning environment affects students’ academic performance. For this reason, the current study intends to fill an important gap in the relevant literature. Further, since previous meta-analysis studies reported inconsistent results about the effectiveness of the WBLE in improving students’ academic performance, further meta-analysis studies are needed to overcome this inconsistency. Therefore, the current study aims to examine whether a specialized meta-analysis (exploring the effect of web-based biology learning environment on academic performance in terms of the overall effect and the moderator variables of educational level and type of experimental design) supports or refutes the results of previous ones.
Through such a meta-analysis, the current study could provide a robust, comprehensive picture of web-based biology learning environment and its effectiveness. Given the recency criterion for the related literature from 2002 to 2022 (e.g., Çalık & Wiyarsi, 2021), it also provides a broader sense of the intervention studies on web-based biology learning environment and challenges the limitations of previous meta-analysis studies. For example, Orhan and Durak-Men (2018) only included Turkish papers focusing on web-based teaching from 2007 to 2017. Moreover, given the extreme values of some intervention studies on web-based biology learning environment, researchers may elaborate on them in their future research. Finally, the present study may shed more light on future decisions and discussions regarding the effectiveness of web-based biology learning environment.
The Purpose and Research Questions of the Study
The purpose of the study was to investigate the effectiveness of web-based biology learning environment in improving academic performance via a meta-analysis. The following research questions guided the current study:
How does web-based biology learning environment influence students’ academic performance?
Do the moderator variables (e.g., educational level and type of experimental design) positively affect students’ academic performance regarding web-based biology learning environment?
Methodology
Through a meta-analysis, this study examined the effect of web-based biology learning environment on students’ academic performance. Hence, it gathered the findings of the studies on web-based biology learning environment and compared them via a statistical analysis (e.g., Karadağ, 2020; Üstün & Eryılmaz, 2014). Also, the current study tried to explore the practical significance of web-based biology learning environment by calculating effect-sizes of the experimental studies (e.g., Borenstein et al., 2009; Ellis, 2010; Üstün & Eryılmaz, 2014). Overall, it used the meta-analysis to handle related studies in an organized and systematic way to draw out any effect or relationship between the dependent (e.g., academic performance) and independent (e.g., web-based biology learning environment) variables (Atasoy, 2021; Lipsey & Wilson, 2001; Üstün & Eryılmaz, 2014).
Sample and Selection Criteria
In looking for the studies on web-based biology learning environment, the authors searched several keyword patterns drawn from the abstracts (e.g., Pattern 1: web-based learning and biology education; Pattern 2: web supported learning and biology education; Pattern 3: online learning and biology education; Pattern 4: computer supported learning and biology education) in well-known databases (e.g., ERIC, EBSCO, Springer Link, Taylor & Francis, Wiley Online Library Full Collection, Science Direct, ProQuest Dissertations and Theses Global, Sage Journals, Google Scholar, Scopus, and the Higher Education Council (HEC) Dissertations and Theses in Türkiye). Also, the authors carried out a manual search of the related journals and dissertations to make the meta-analysis more comprehensive. Further, they took care to avoid duplication, since some papers were indexed in more than one database and some dissertations were also published in journals as research papers. Therefore, the authors excluded 13 studies because of duplications across databases and research papers derived from the dissertations. Then, they read all studies to apply the inclusion criteria (e.g., handling a biology topic within web-based learning environment, focusing on understanding and achievement as learning outcomes, and publication language—Turkish and English). Hence, the authors excluded 88 studies with different dependent variables (e.g., attitude, perception, awareness, motivation, and skills—i.e., collaborative skills, science process skills, and social skills) (for example, Dyrberg et al., 2017; Ilma et al., 2022; Rao & Saha, 2019; Whittle & Bickerdike, 2015). Later, they looked over the statistical data of each study to calculate any effect-size(s). Hence, they excluded three studies, which lacked sufficient data (e.g., mean, standard deviation, sample size, paired p-value, or paired t-value) for the meta-analysis or only presented limited descriptive statistical results (e.g., frequency and percentage). For example, they excluded DeChenne-Peters et al.’s (2022) study, which reported only normalized gain, ANOVA, and ANCOVA findings that included insufficient data for running the meta-analysis. Finally, the authors identified 22 papers focusing on K-12 and university (named the K-20 education system) and investigating the effect of web-based biology learning environment on academic performance. Figure 1 outlines the selection process.
Fig. 1.
Flow chart of the selection process
This study included Turkish and English papers and excluded other languages. Because Hedges’ g is an objective criterion to compare the studies with each other, the current study included both pre-experimental and quasi-experimental research designs to examine how the ‘type of experimental design’ moderator variable influences the students’ academic performance via web-based biology learning environment. Moreover, some of the studies had several experimental groups (e.g., problem-based learning, just-in-time teaching, evening class and morning class), different dependent variables (e.g., achievement, biological understanding and scientific understanding), or different biology topics (e.g., ecology and evolution) (Gardner & Belland, 2017; Hill, 2013; Kırılmazkaya, 2014; McDaniel et al., 2007; Musawi et al., 2021; Nugraini et al., 2013; Son et al., 2016; Spernjak & Sorgo, 2018; Soubra et al., 2022). Given these issues, the authors inserted their statistical values (e.g., mean, standard deviation, and sample size for each of the experimental and control groups) into Comprehensive Meta-Analysis (CMA) software and deployed the ‘subgroups within the study’ and ‘use the study as the unit of analysis’ options to calculate combined effect-sizes. Hence, the study preferred the use of a combined effect-size to an individual effect-size for each of the experimental groups, different dependent variables, or different biology topics. In brief, it attempted to minimize their impacts on the overall effect-size. Table 1 reveals the characteristics of the papers included in the meta-analysis.
Table 1.
Characteristics of the papers included in the meta-analysis
| Characteristics | Criteria | Papers | f |
|---|---|---|---|
| Publication year | 2002–2007 | Dean (2004), Gelbart and Yarden (2006), Gibbins et al. (2003), Hill (2013), Kopec (2002), Kumar and Sherwood (2007), McDaniel et al. (2007) | 7 |
| | 2008–2013 | Nugraini et al. (2013), Sezen-Vekli (2012), Wesolowski (2008), Yu et al. (2010) | 4 |
| | 2014–2019 | Ertan (2019), Gardner and Belland (2017), Jacquemart et al. (2016), Kırılmazkaya (2014), Marsteller (2017), Son et al. (2016), Spernjak and Sorgo (2018), Haro et al. (2019) | 8 |
| | 2020+ | Musawi et al. (2021), Nischal et al. (2022), Soubra et al. (2022) | 3 |
| Publication type | Article | Gardner and Belland (2017), Gelbart and Yarden (2006), Gibbins et al. (2003), Haro et al. (2019), Jacquemart et al. (2016), Kumar and Sherwood (2007), McDaniel et al. (2007), Nischal et al. (2022), Nugraini et al. (2013), Son et al. (2016), Spernjak and Sorgo (2018), Soubra et al. (2022), Yu et al. (2010) | 13 |
| | Dissertation | Dean (2004), Ertan (2019), Hill (2013), Kırılmazkaya (2014), Kopec (2002), Marsteller (2017), Sezen-Vekli (2012), Wesolowski (2008) | 8 |
| | Proceedings | Musawi et al. (2021) | 1 |
| Sample size | 1–30 | Gelbart and Yarden (2006) | 1 |
| | 31–100 | Dean (2004), Ertan (2019), Gardner and Belland (2017), Gibbins et al. (2003), Haro et al. (2019), Kırılmazkaya (2014), Kumar and Sherwood (2007), Marsteller (2017), Musawi et al. (2021), Sezen-Vekli (2012), Wesolowski (2008) | 11 |
| | 101–300 | Hill (2013), Jacquemart et al. (2016), Kopec (2002), McDaniel et al. (2007), Nischal et al. (2022), Nugraini et al. (2013), Son et al. (2016), Spernjak and Sorgo (2018), Soubra et al. (2022), Yu et al. (2010) | 10 |
| Grade | Middle school | Spernjak and Sorgo (2018), Yu et al. (2010) | 2 |
| | High school | Dean (2004), Ertan (2019), Gelbart and Yarden (2006), Kopec (2002), Marsteller (2017), Nugraini et al. (2013), Sezen-Vekli (2012) | 7 |
| | Undergraduate | Gardner and Belland (2017), Gibbins et al. (2003), Haro et al. (2019), Hill (2013), Jacquemart et al. (2016), Kırılmazkaya (2014), Kumar and Sherwood (2007), McDaniel et al. (2007), Musawi et al. (2021), Nischal et al. (2022), Son et al. (2016), Soubra et al. (2022), Wesolowski (2008) | 13 |
| Academic performance | Achievement | Dean (2004), Ertan (2019), Gibbins et al. (2003), Haro et al. (2019), Hill (2013), Jacquemart et al. (2016), Kopec (2002), McDaniel et al. (2007), Nischal et al. (2022), Nugraini et al. (2013), Wesolowski (2008), Spernjak and Sorgo (2018), Yu et al. (2010) | 13 |
| | Understanding | Gardner and Belland (2017), Gelbart and Yarden (2006), Kırılmazkaya (2014), Kumar and Sherwood (2007), Marsteller (2017), Musawi et al. (2021), Sezen-Vekli (2012), Son et al. (2016), Soubra et al. (2022) | 9 |
Coding Procedure
Because coding makes the data extraction process clear and transparent (Karadağ, 2020), the authors created a coding form (including the paper’s reference, sample size, grade, dependent and independent variables, and quantitative values—mean, standard deviation, t, and p) and applied it to the studies under investigation. Meanwhile, they separately coded the studies and calculated the inter-rater consistency coefficient, which was found to be 0.84. Any disagreement was resolved through negotiation.
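The paper does not state which agreement index underlies the reported coefficient of 0.84. As an illustration only, the following minimal Python sketch, assuming Cohen’s kappa over hypothetical category codes assigned by two coders, shows one common way such an inter-rater coefficient can be computed.

```python
import numpy as np

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical codes to the same items."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    categories = np.union1d(a, b)
    po = np.mean(a == b)                           # observed agreement
    pe = sum(np.mean(a == c) * np.mean(b == c)     # agreement expected by chance
             for c in categories)
    return (po - pe) / (1 - pe)

# Hypothetical design codes assigned by the two authors to ten studies
rater1 = ["quasi", "pre", "quasi", "quasi", "pre", "quasi", "pre", "quasi", "quasi", "pre"]
rater2 = ["quasi", "pre", "quasi", "pre",   "pre", "quasi", "pre", "quasi", "quasi", "pre"]
print(round(cohens_kappa(rater1, rater2), 2))      # 0.8 for this toy example
```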
Calculating Effect-Sizes
Effect-size is a standard measurement value used to determine the strength and direction of the relationship between two variables or the degree of practical effect (Borenstein et al., 2009). Cohen’s d and Hedges’ g are generally used to calculate effect-sizes in a meta-analysis. However, if a sample size is relatively small, Hedges’ g is more accurate and less biased than Cohen’s d (Borenstein et al., 2009; Güler et al., 2022; Kansızoğlu, 2017). Hence, the authors preferred the Hedges’ g calculation for the meta-analysis. They first entered all statistical data for the studies into an Excel sheet and then imported them into Comprehensive Meta-Analysis (CMA) software. They used various data from the studies to calculate Hedges’ g values:
16 studies included mean scores, standard deviations, and sample size for the experimental and control groups (quasi-experimental design).
4 studies provided pre- and post-mean scores, paired groups t-value, and sample size for the experimental group (pre-experimental design).
2 studies contained pre- and post-mean scores, paired p-value, and sample size for the experimental group (pre-experimental design).
The following range suggested by Güler et al. (2022) was used to interpret the effect-sizes: 0.14 and below (negligible); 0.15–0.39 (low); 0.40–0.74 (medium); 0.75–1.09 (large); 1.10–1.44 (very large); and 1.45 and above (perfectly huge).
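For readers who wish to reproduce the independent-groups computation outside CMA, the following minimal Python sketch illustrates Hedges’ g (Cohen’s d with the small-sample correction) together with the interpretation bands of Güler et al. (2022) quoted above. The group statistics are hypothetical, not values taken from the included studies, and the sketch covers only the two-group (quasi-experimental) case.

```python
import numpy as np

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference with Hedges' small-sample correction."""
    # Pooled standard deviation across the two groups
    sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                       # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)      # correction factor J
    return j * d

def interpret(g):
    """Interpretation bands quoted from Güler et al. (2022)."""
    bands = [(0.14, "negligible"), (0.39, "low"), (0.74, "medium"),
             (1.09, "large"), (1.44, "very large")]
    for upper, label in bands:
        if g <= upper:
            return label
    return "perfectly huge"

# Hypothetical experimental vs. control group summary statistics
g = hedges_g(m1=78.2, sd1=9.5, n1=28, m2=71.4, sd2=10.1, n2=30)
print(round(g, 3), interpret(g))             # e.g. 0.683 medium
```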
Meta-analysis Model
Meta-analysis incorporates two main models: the fixed-effects model and the random-effects model (Karadağ, 2020). Before deciding on a meta-analysis model, the characteristics of the studies in the meta-analysis must meet some prerequisites (e.g., Borenstein et al., 2009; Karadağ, 2020). The fixed-effects model assumes that only one true effect-size underlies all studies in the meta-analysis and that all differences in the observed effects result solely from sampling error (Hedges & Vevea, 1998; Üstün & Eryılmaz, 2014). In view of Karadağ (2020), the fixed-effects model handles studies with the same functionality to estimate the effect-size for only one population. Nevertheless, the random-effects model assumes that the true effect-size may differ across studies due to some moderator variables (e.g., age, grade, and sample size) (e.g., Borenstein et al., 2009; Üstün & Eryılmaz, 2014). Phrased differently, the random-effects model can be used to estimate and generalize the effect-size for larger populations if the studies do not share the same functionality (Karadağ, 2020).
Given the foregoing explanations, the authors looked for the presence of heterogeneity by calculating the Q-value and the I² statistic. As seen from Table 2, the Q-value was calculated to be 423.858, which is higher than the critical value of 32.671 (with 21 degrees of freedom at the 95% confidence level). Further, the p-value was less than 0.05. This means that the current meta-analysis initially met the heterogeneity condition to employ the random-effects model. Moreover, the authors examined the I² value before finally deciding on the use of the random-effects model (Borenstein et al., 2009). Given the heterogeneity criteria (low heterogeneity for approximately 25%, medium heterogeneity for 50%, and high heterogeneity for 75% of the I² value) suggested by Higgins et al. (2003), the I² value was calculated to be 95.046. This means that the present meta-analysis met the heterogeneity criterion to run the random-effects model. Overall, the authors applied the random-effects model in the meta-analysis to calculate the effect-sizes via Comprehensive Meta-Analysis (CMA V2) software. Moreover, educational level and type of experimental design were defined as moderator variables to address the second research question. That is, the ‘educational level’ variable covers middle school (grades 5–8), high school (grades 9–12), and university (undergraduate or bachelor), while the ‘type of experimental design’ variable embraces quasi-experimental and pre-experimental designs.
Table 2.
The findings of heterogeneity test
| Model | Number of studies | Point estimate | Standard error | Variance | Lower limit (95% CI) | Upper limit (95% CI) | Z-value (2-tail) | p-value | Q-value | df (Q) | p-value | I-squared | Tau-squared | Standard error (Tau²) | Variance (Tau²) | Tau |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Fixed | 22 | 0.287 | 0.028 | 0.001 | 0.232 | 0.343 | 10.155 | 0.000 | 423.858 | 21 | 0.000 | 95.046 | 0.362 | 0.170 | 0.029 | 0.601 |
| Random | 22 | 0.668 | 0.135 | 0.018 | 0.403 | 0.932 | 4.948 | 0.000 | | | | | | | | |
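As an illustration of how the quantities in Table 2 relate to one another, the sketch below computes Cochran’s Q, I², the DerSimonian–Laird between-study variance (tau²), and a random-effects pooled estimate from a set of per-study effect-sizes and variances. The inputs are hypothetical placeholders, not the 22 study values, and CMA’s exact output may differ slightly from this hand computation.

```python
import numpy as np

def random_effects_summary(g, v):
    """Q, I-squared, DerSimonian-Laird tau-squared, and pooled random-effects estimate."""
    g, v = np.asarray(g, float), np.asarray(v, float)
    w = 1 / v                                    # fixed-effect (inverse-variance) weights
    g_fixed = np.sum(w * g) / np.sum(w)          # fixed-effect pooled estimate
    q = np.sum(w * (g - g_fixed) ** 2)           # Cochran's Q
    df = len(g) - 1
    i2 = max(0.0, (q - df) / q) * 100            # Higgins I-squared (%)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                # DerSimonian-Laird between-study variance
    w_star = 1 / (v + tau2)                      # random-effects weights
    g_random = np.sum(w_star * g) / np.sum(w_star)
    se_random = np.sqrt(1 / np.sum(w_star))
    return q, i2, tau2, g_random, se_random

# Hypothetical per-study Hedges' g values and variances
q, i2, tau2, g_re, se_re = random_effects_summary(
    g=[0.20, 0.45, 1.30, -0.10, 0.75], v=[0.04, 0.06, 0.09, 0.05, 0.07])
print(f"Q={q:.2f}, I2={i2:.1f}%, tau2={tau2:.3f}, pooled g={g_re:.3f} (SE {se_re:.3f})")
```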
Publication Bias
To identify and avoid any possible publication bias in the meta-analysis, an issue criticized by some researchers (e.g., Üstün & Eryılmaz, 2014), the authors used visual inspection of a funnel plot, the trim-and-fill method, the Classic fail-safe N, and Orwin’s fail-safe N. Thus, they provided the findings of publication bias (e.g., the number of missing studies needed to bring Hedges’ g under 0.1) and objectively interpreted them.
Findings
The Findings of Publication Bias
As seen from Fig. 2, the funnel plot pointed to an asymmetrical structure (e.g., Karadağ, 2020). That is, the studies with larger standard errors (smaller sample sizes) showed larger effects. Therefore, this could be viewed as evidence of publication bias (Rahman & Lewis, 2020). Given the evidence of an asymmetric distribution in the funnel plot, the current meta-analysis also applied Duval and Tweedie’s trim-and-fill test to examine the publication bias for the random-effects model. As can be seen from Table 3, there was a difference between the observed and adjusted values. Figure 3 (black dots in the funnel plot) represents counterpart data imputed to compensate for this asymmetry or to remedy the impact of the observed asymmetry on the overall results. Further, the difference between the observed and adjusted estimates was found to be 21.68%, which fell into the ‘moderate’ cut-off range (20% to 40%) suggested by Chang et al. (2022). This means that the publication bias could be viewed as acceptable for running the current meta-analysis on the effectiveness of web-based biology learning environment. Further, the current meta-analysis went over the Classic fail-safe N and Orwin’s fail-safe N values for publication bias. As seen from Table 4, the Classic fail-safe N value was 1333. This means that 1333 additional studies with non-significant findings would be necessary to nullify the effect of web-based biology learning environment on academic performance. That is, considering the formula [N/(5k + 10)] (where k means the total number of the studies in the meta-analysis), this ratio for the current meta-analysis was found to be 11.11, which is higher than the cut-off point (1.00) offered by Mullen et al. (2001). Given this cut-off point, the data selection process of the current meta-analysis was sound and did not indicate any publication bias. Likewise, Orwin’s fail-safe N value was 42 (Table 5). This means that 42 additional papers with an effect-size of 0.00000 would be necessary to render the mean effect of this meta-analysis trivial (Üstün & Eryılmaz, 2014). To sum up, given the aforementioned values, it can be inferred that the current meta-analysis had no evidence of publication bias for the studies under investigation.
Fig. 2.
Funnel plot of standard error by effect-size
Table 3.
The finding of Duval and Tweedie’s trim and fill test for the random effects model
| | Studies trimmed | Point estimate | Lower limit (95% CI) | Upper limit (95% CI) | Q-value |
|---|---|---|---|---|---|
| Observed values | | 0.66758 | 0.40315 | 0.93202 | 423.85844 |
| Adjusted values | 4 | 0.85235 | 0.50585 | 1.19886 | 1337.15828 |
Fig. 3.
Funnel plot of standard error by effect-size after Duval and Tweedie’s trim-and-fill test (black dots represent imputed counterpart data to offset the asymmetry or mitigate the impact of the observed asymmetry on the overall results)
Table 4.
The findings of Classic fail-safe N test
| Z-value for observed studies | 15.37934 |
| P-value for observed studies | 0.00000 |
| Alpha | 0.05000 |
| Tails | 2.00000 |
| Z for alpha | 1.95996 |
| Number of observed studies | 22 |
| Number of missing studies that would bring p value to > alpha | 1333 |
Table 5.
Findings of Orwin’s fail-safe N test
| Hedges’ g in observed studies | 0.28728 |
| Criterion for a ‘trivial’ Hedges’ g | 0.10000 |
| Mean Hedges’ g in missing studies | 0.00000 |
| Number missing studies needed to bring Hedges’ g under 0.1 | 42 |
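The fail-safe figures above can be checked with simple arithmetic. The sketch below reproduces the Mullen et al. (2001) sufficiency ratio from the Classic fail-safe N in Table 4 and applies the standard Orwin formula to the values in Table 5, assuming missing studies with a mean effect of zero; it is a hand check of the reported figures, not the CMA procedure itself.

```python
import math

k = 22                       # number of studies in the meta-analysis
classic_fsn = 1333           # Classic fail-safe N reported in Table 4

# Mullen et al. (2001) sufficiency ratio: N / (5k + 10) should exceed 1.00
ratio = classic_fsn / (5 * k + 10)
print(round(ratio, 2))       # 11.11, well above the 1.00 cut-off

# Orwin's fail-safe N: studies with mean g = 0 needed to pull the observed
# fixed-effect g (0.28728) below the 'trivial' criterion of 0.10
g_obs, g_trivial, g_missing = 0.28728, 0.10, 0.0
orwin_fsn = math.ceil(k * (g_obs - g_trivial) / (g_trivial - g_missing))
print(orwin_fsn)             # 42, matching Table 5
```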
The Overall Impact of Web-Based Biology Learning Environment on Academic Performance
Since visual representations of the effect-sizes clearly illustrate the findings of a meta-analysis (Lipsey & Wilson, 2001), this research presents a stem-and-leaf plot (Table 6) and a forest plot (Fig. 4) to summarize them. As seen from Table 6, six of the studies had effect-sizes higher than 1.00. Also, the range between 0.5 and 1 included five studies, whilst 11 studies possessed effect-sizes lower than 0.5. As seen from Fig. 4, 13 of the effect-sizes were significant (p < 0.05), while 9 of them were non-significant (p > 0.05). Given Güler et al.’s (2022) effect-size classification, 7 of the effect-sizes in the meta-analysis fell into the negligible effect. Four of them were classified under the low effect, whilst 5 of them were labelled under the medium effect. Further, the studies categorized under the very large and perfectly huge effects had the same frequency value (f = 3). Also, the overall effect-size for the random-effects model was found to be 0.668, which means that web-based biology learning environment has a medium effect on improving academic performance.
Table 6.
Stem and leaf plot of the effect-sizes
| Frequency | Stem | Leaf |
|---|---|---|
| 3 | −0 | 0 0 2 |
| 8 | 0 | 0 0 1 1 2 3 3 3 |
| 5 | 0 | 5 5 5 7 7 |
| 4 | 1 | 1 3 4 4 |
| 1 | 2 | 4 |
| 1 | 3 | 0 |
Fig. 4.
Forest plot of the studies included in the meta-analysis
The Findings of Moderator Variables (Educational Level and Type of Experimental Design)
As can be seen from Table 7, the effect-size difference between the educational levels was found to be non-significant (Q-value = 2.084; df = 2; p > 0.05). This means that educational level, as a moderator variable, did not affect the students’ academic performance through web-based biology learning environment. In fact, the effect of web-based biology learning environment on academic performance was significant in high school and university (p < 0.05), whereas it was non-significant in middle school (p > 0.05). Moreover, the mean effect-size for high school (Hedges’ g = 0.744) was slightly higher than that for university (Hedges’ g = 0.737). Also, both of these values fell into the medium effect according to the classification suggested by Güler et al. (2022).
Table 7.
The findings of moderator analysis for educational level
| Educational level | N | Point estimate | Standard error | Lower limit (95% CI) | Upper limit (95% CI) | Z-value | p-value | Q-value | df (Q) | p-value |
|---|---|---|---|---|---|---|---|---|---|---|
| Middle school | 2 | 0.027 | 0.469 | −0.892 | 0.946 | 0.058 | 0.954 | | | |
| High school | 7 | 0.744 | 0.263 | 0.229 | 1.260 | 2.831 | 0.005 | | | |
| University | 13 | 0.737 | 0.188 | 0.368 | 1.105 | 3.920 | 0.000 | | | |
| Total between | | | | | | | | 2.084 | 2 | 0.353 |
As can be seen from Table 8, the effect-size difference between the types of experimental design was found to be non-significant (Q-value = 2.784; df = 1; p > 0.05). This means that type of experimental design, as a moderator variable, did not influence the students’ academic performance through web-based biology learning environment. Indeed, the values of the pre-experimental and quasi-experimental designs were significant (p < 0.05). Further, the mean effect-size for the pre-experimental design (Hedges’ g = 1.032) was higher than that for the quasi-experimental one (Hedges’ g = 0.530). Also, these values were labelled large and medium effects, respectively, according to the classification suggested by Güler et al. (2022).
Table 8.
The findings of moderator analysis for type of experimental design
| Type of experimental design | N | Point estimate | Standard error | Lower limit (95% CI) | Upper limit (95% CI) | Z-value | p-value | Q-value | df (Q) | p-value |
|---|---|---|---|---|---|---|---|---|---|---|
| Pre-experimental design | 6 | 1.032 | 0.257 | 0.529 | 1.535 | 4.022 | 0.000 | | | |
| Quasi-experimental design | 16 | 0.530 | 0.158 | 0.220 | 0.839 | 3.357 | 0.001 | | | |
| Total between | | | | | | | | 2.784 | 1 | 0.095 |
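The between-group Q statistics reported in Tables 7 and 8 can be approximated from the subgroup point estimates and standard errors alone. The sketch below illustrates this computation; because it works from the rounded values printed in the tables, its output only approximates the Q-values reported by CMA.

```python
import numpy as np

def q_between(subgroup_estimates, subgroup_ses):
    """Between-subgroup heterogeneity Q for a moderator (subgroup) analysis."""
    g = np.asarray(subgroup_estimates, float)
    w = 1 / np.asarray(subgroup_ses, float) ** 2   # weight each subgroup by 1/SE^2
    g_pooled = np.sum(w * g) / np.sum(w)           # weighted mean across subgroups
    return np.sum(w * (g - g_pooled) ** 2)         # df = number of subgroups - 1

# Educational level (Table 7): middle school, high school, university
print(round(q_between([0.027, 0.744, 0.737], [0.469, 0.263, 0.188]), 3))  # ≈ 2.09 vs 2.084

# Type of experimental design (Table 8): pre-experimental, quasi-experimental
print(round(q_between([1.032, 0.530], [0.257, 0.158]), 3))                # ≈ 2.77 vs 2.784
```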
Discussion
The results of the meta-analysis showed that the effect-sizes of the studies ranged from −0.242 to 3.030 (see Fig. 3). Such a large range may stem from the characteristics of the interventions, e.g., the duration of the intervention and the match or mismatch between web-based biology learning environment and the nature of the biology topics under investigation. Also, the overall effect-size of web-based biology learning environment was classified under the medium effect. This means that web-based biology learning environment is somewhat effective in improving students’ academic performance. This result is consistent with the findings of previous meta-analysis studies (e.g., Ergül-Sönmez & Çakır, 2021; Ulum, 2022). On the other hand, it differs categorically from the findings of Orhan and Durak-Men (2018), who found a relatively large effect-size for the effect of web-based teaching on students’ achievement. This may result from the scope of their meta-analysis study, which only focused on the Turkish context. Indeed, the medium effect in the present meta-analysis may come from the number of the studies labelled under the negligible, low, and medium effects. In other words, this may stem from the limited number of the studies classified under the very large and perfectly huge effects. The negligible and low effects may result from the content and context of web-based biology learning environment. For example, if web-based biology learning environment has limited student–teacher interaction and lacks peer collaboration and structured online discussion components, students may not find it meaningful and attractive (Castro & Tumibay, 2021). This may have resulted in low academic performance even after the implementation of web-based biology learning environment (Dean, 2004; Gardner & Belland, 2017; Kopec, 2002; Son et al., 2016). Furthermore, a short-term implementation may have reduced the effectiveness of web-based biology learning environment (Gardner & Belland, 2017; Kopec, 2002; Spernjak & Sorgo, 2018). The fact that some studies were categorized under the very large and perfectly huge effects may come from well-designed and enriched pedagogical approaches, for instance, inquiry-based learning, problem-based learning, and situated learning (Kırılmazkaya, 2014; Kumar & Sherwood, 2007; Marsteller, 2017; Musawi et al., 2021; Sezen-Vekli, 2012).
The Effect of Educational Level on Academic Performance
As seen from Table 7, educational level, as a moderator variable, did not affect the students’ academic performance along with web-based biology learning environment (Q-value = 2.084; df = 2; p > 0.05). This indicates that educational level does not act as a significant moderator in estimating the students’ academic performance via web-based biology learning environment. Furthermore, as compared with the high school and university levels, the overall effect-size for middle school fell into the negligible effect and was non-significant for the effect of web-based biology learning environment on academic performance. This may result from the prerequisites of the WBLE. For example, students need to control information, access other relevant resources, communicate with peers, and upload any related information to the database if necessary (Castro & Tumibay, 2021). Thus, middle school students’ inability or limited skills to use computers and web-based learning tools may have prevented web-based biology learning environment from improving their academic performance. Also, such a negligible effect may come from the small number of studies with middle school students. Further, it may come from the nature of the integrated science course in middle school, which covers biology, chemistry, physics, and earth science. The fact that the high school and university levels had a medium effect for academic performance along with web-based biology learning environment supports the aforementioned argument on the significance of students’ preparedness. As a matter of fact, Castro and Tumibay (2021) emphasize that students need to have a high level of digital literacy and self-efficacy to successfully interact with peers, teachers, and the content of the WBLE. On the other hand, this result somewhat supports the established expectation that an increase in educational level is supposed to evolve and improve students’ cognitive (e.g., understanding and achievement—called academic performance) and affective learning (e.g., self-efficacy and self-confidence).
The Effect of Type of Experimental Design on Academic Performance
The non-significant difference between the types of experimental design (Q-value = 2.784; df = 1; p > 0.05) (see Table 8) means that type of experimental design, as a moderator variable, had no effect on the students’ academic performance through web-based biology learning environment. The fact that the mean effect-size for the pre-experimental design (Hedges’ g = 1.032) was higher than that for the quasi-experimental one (Hedges’ g = 0.530) may come from the nature of the pre-experimental design, which only has one experimental group without a comparison or control group (Kiryak & Çalik, 2018). In other words, different gains in the pre-experimental and quasi-experimental designs (e.g., a result of treatment in the pre-experimental design versus that of a type of treatment in the quasi-experimental one) may have resulted in different effects—large and medium, respectively (Çalik et al., 2015; Fraenkel et al., 2012). Moreover, this may result from the small number of the studies with a pre-experimental design.
Conclusion and Implications
Given the mean effect-size of this meta-analysis, it can be concluded that web-based biology learning environment is somewhat effective at improving students’ academic performance. This means that some of the studies have difficulty integrating the WBLE into biology learning or classes. For this reason, further studies are needed to illustrate how to effectively intertwine the WBLE with biology learning. Moreover, this study claims that middle school is an improper educational level for the use of web-based biology learning environment. However, the limited number of studies with middle school students hinders further interpretation of this claim. Furthermore, given the non-significant difference between the types of experimental design, it can be deduced that type of experimental design does not act as a good moderator in estimating the students’ academic performance through web-based biology learning environment. The fact that the current study only concentrated on two moderator variables may be seen as the first limitation of the study. For this reason, future research may use different variables, e.g., implementation duration. Similarly, it also included studies handling only biology topics within the integrated science course in middle school. This may be viewed as the second limitation of the study. Therefore, future research may extend the scope of the meta-analysis and handle type of course (biology, chemistry, physics, and earth science) as a moderator variable. Since the current study incorporates some extreme values (e.g., Musawi et al., 2021; Sezen-Vekli, 2012), further studies are needed to deepen their implementation processes and explore what makes them unique.
Author Contribution
The authors made an equal contribution to this paper.
Data Availability
The datasets analysed during the current study are available from the corresponding author on reasonable request.
Declarations
Ethical Statement
All procedures performed in this study followed the ethical standards of the Department of Health Standards on Human Research (DOH/QD/SD/HSR/0.9) and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.
Consent Statement
Not applicable.
Competing Interests
The authors declare no competing interests.
Footnotes
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Contributor Information
Gülşah Sezen Vekli, Email: gulsahsezen28@gmail.com.
Muammer Çalik, Email: muammer38@hotmail.com.
References
- Adi, W. C., Saefi, M., Setiawan, M. E., & Sholehah, N. (2021). The impact of covid-19 to biology teacher education: Emergency distance learning at Islamic universities in Indonesia. Journal of Turkish Science Education, 18, Covid-19 Special Issue, 60–75. 10.36681/tused.2021.72
- Atasoy, A. (2021). The relationship between writing self-efficacy and writing skill: A meta-analysis study. Education and Science, 46(208), 213–236. 10.15390/EB.2021.10024
- Bahar M, Johnstone AH, Hansell MH. Revisiting learning difficulties in biology. Journal of Biological Education. 1999;33(2):84–86. doi: 10.1080/00219266.
- Baram-Tsabari, A., Sethi, R. J., Bry, L., & Yarden, A. (2010). Identifying students’ interests in biology using a decade of self-generated questions. Eurasia Journal of Mathematics, Science and Technology Education, 6(1), 63–75. 10.12973/ejmste/75228
- Bayrak, N., Çalık, M., & Doğan, S. (2021). The effects of smart formative assessment system on academic achievement and course process. Hacettepe University Journal of Education, 36(2), 333–349. 10.16986/HUJE.2019056742
- Bell BS, Federman JE. E-learning in postsecondary education. The Future of Children. 2013;23(1):165–185. doi: 10.1353/foc.2013.0007.
- Bennett J. Teaching and learning science: A guide to recent research and its application. Continuum; 2003.
- Bojović Ž, Bojović PD, Vujošević D, Šuh J. Education in times of crisis: Rapid transition to distance learning. Computer Applications in Engineering Education. 2020;28(6):1467–1489. doi: 10.1002/cae.22318.
- Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2009). Introduction to meta-analysis. West Sussex, UK: John Wiley & Sons, Ltd.
- Byukusenge C, Nsanganwimana F, Tarmo AP. Difficult topics in the revised biology curriculum for advanced level secondary schools in Rwanda: Teachers’ perceptions of causes and remedies. Journal of Biological Education. 2022 doi: 10.1080/00219266.2021.2012225.
- Çalık, M., & Wiyarsi, A. (2021). A systematic review of the research papers on chemistry-focused socio-scientific issues. Journal of Baltic Science Education, 20(3), 360–372. 10.33225/jbse/21.20.360
- Çalik, M., Ebenezer, J., Özsevgeç, T., Küçük, Z. & Artun, H. (2015). Improving science student teachers’ self-perceptions of fluency with innovative technologies and scientific inquiry abilities. Journal of Science Education and Technology, 24, 448–460. 10.1007/s10956-014-9529-1
- Candaş, B. & Çalık, M. (2022). The effect of CKCM-oriented instruction on grade 8 students’ conceptual understanding of sustainable development. Journal of Biological Education. 10.1080/00219266.2021.2006748
- Castro MDB, Tumibay GM. A literature review: Efficacy of online learning courses for higher education institution using meta-analysis. Education and Information Technologies. 2021;26:1367–1385. doi: 10.1007/s10639-019-10027-z.
- Chang, H.Y., Binali, T., Liang, J.C., Chiou, G.L., Cheng, K.H., Lee, S.W.Y. & Tsai, C.C. (2022). Ten years of augmented reality in education: A meta-analysis of (quasi-) experimental studies to investigate the impact. Computers & Education, 191, 104641. 10.1016/j.compedu.2022.104641
- Conejo R, Garcia-Viñas JI, Gastón A, Barros B. Technology-enhanced formative assessment of plant identification. Journal of Science Education and Technology. 2016;25(2):203–221. doi: 10.1007/s10956-015-9586-0.
- *Dean, D. M. (2004). An evaluation of the use of web-enhanced homework assignments in high school biology classes. [Unpublished doctoral dissertation]. The University of Alabama.
- DeChenne-Peters, S. E, Sargent, E., Mateer, S. C., Machingura, M., Zettler, J., Ness, T., DeMars, G., Cannon, S., & Broftt Bailey, J. (2022). Comparison of student outcomes in a course-based undergraduate research experience: Face-to-face, hybrid, and online delivery of a biology laboratory. International Journal for the Scholarship of Teaching & Learning, 16(1), 1–17. 10.20429/ijsotl.2022.160105
- Dyrberg NR, Treusch AH, Wiegand C. Virtual laboratories in science education: Students’ motivation and experiences in two tertiary biology courses. Journal of Biological Education. 2017;51(4):358–374. doi: 10.1080/00219266.2016.1257498.
- Ellis PD. The essential guide to effect sizes: Statistical power, meta-analysis, and the interpretation of research results. Cambridge University Press; 2010.
- Ergül Sönmez, E., & Çakır, H. (2021). Effect of Web 2.0 technologies on academic performance: A meta-analysis study. International Journal of Technology in Education and Science (IJTES), 5(1), 108–127. 10.46328/ijtes.161
- *Ertan, S. (2019). The effect of teaching mitosis and meiosis with computer assisted teaching material on academic success. [Unpublished master thesis], Gazi University.
- Fraenkel JR, Wallen NE, Hyun HH. How to design and evaluate research in education. 8. New York; 2012.
- Gardner J, Belland BR. Problem-centered supplemental instruction in biology: Influence on content recall, content understanding, and problem solving ability. Journal of Science Education and Technology. 2017;26(4):383–393. doi: 10.1007/s10956-017-9686-0.
- Gelbart H, Yarden A. Learning genetics through an authentic research simulation in bioinformatics. Journal of Biological Education. 2006;40(3):107–112. doi: 10.1080/00219266.2006.9656026.
- Gibbins S, Sosabowski MH, Cunningham J. Evaluation of a web-based resource to support a molecular biology practical class-does computer-aided learning really work? Biochemistry and Molecular Biology Education. 2003;31:352–355. doi: 10.1002/bmb.2003.494031050260.
- Güler M, Bütüner SÖ, Danışman Ş, Gürsoy K. A meta-analysis of the impact of mobile learning on mathematics achievement. Education and Information Technologies. 2022;27:1725–1745. doi: 10.1007/s10639-021-10640-x.
- Haro VA, Noroozi O, Biemans HJA, Mulder M. The effects of an online learning environment with worked examples and peer feedback on students’ argumentative essay writing and domain-specific knowledge acquisition in the field of biotechnology. Journal of Biological Education. 2019;53:390–398. doi: 10.1080/00219266.2018.1472132.
- Hedges LV, Vevea JL. Fixed-and random-effects models in meta-analysis. Psychological Methods. 1998;3:486–504. doi: 10.1037/1082-989X.3.4.486.
- Higgins JP, Thompson SG, Deeks JJ, Altman DG. Measuring inconsistency in meta-analyses. British Medical Journal. 2003;327(7414):557–560. doi: 10.1136/bmj.327.7414.557.
- *Hill, J. D. (2013). Student success and perceptions of course satisfaction in face-to-face, hybrid, and online sections of introductory biology classes at three, open enrollment, two-year colleges in southern Missouri. [Unpublished doctoral dissertation], Lindenwood University.
- Ilma, S., Al-Muhdar, M. H. I., Rohman, F., & Saptasari, M. (2022). Promote collaboration skills during the COVID-19 pandemic through predict-observe-explain-based Project (POEP) learning. Journal Pendidikan Biologi Indonesia, 8(1), 32–39. 10.22219/jpbi.v8i1.17622
- Incantalupo L, Treagust DF, Koul R. Measuring student attitude and knowledge in technology-rich biology classrooms. Journal of Science Education and Technology. 2014;23:98–107. doi: 10.1007/s10956-013-9453-9.
- Jacquemart AL, Lhoir P, Binard F, Descamps C. An interactive multimedia dichotomous key for teaching plant identification. Journal of Biological Education. 2016;50(4):442–451. doi: 10.1080/00219266.2016.1150870.
- Kang J, Seomun GA. Evaluating web based nursing education’s effects: A systematic review and meta-analysis. Western Journal of Nursing Research. 2018;40(11):1677–1697. doi: 10.1177/0193945917729160.
- Kansızoğlu, H. B. (2017). The effect of graphic organizers on language teaching and learning areas: A meta-analysis study. Education and Science, 42(191), 139–164. 10.15390/EB.2017.677
- Karadağ E. The effect of educational leadership on students’ achievement: A cross-cultural meta-analysis research on studies between 2008 and 2018. Asia Pacific Education Review. 2020;21:49–64. doi: 10.1007/s12564-019-09612-1.
- Khan BH, editor. Web-based instruction. Educational Technology Publications; 1997.
- *Kırılmazkaya, G. (2014). Web tabanlı araştırma-sorgulamaya dayalı fen öğretmen adaylarının kavram öğrenmeleri ve bilimsel süreç becerilerinin geliştirilmesi üzerine etkisi [The effects of web based inquiry science teaching development on preservice teachers concept learning and scientific process skills]. [Unpublished master thesis], Fırat University.
- Kiryak, Z. & Çalik, M. (2018). Improving grade 7 students’ conceptual understanding of water pollution via common knowledge construction model. International Journal of Science and Mathematics Education, 16, 1025–1046. 10.1007/s10763-017-9820-8
- *Kopec, R. H. (2002). Virtual, on-line, frog dissection vs. conventional laboratory dissection: A comparison of student achievement and teacher perceptions among honors, general ability, and foundations-level high school biology classes. [Unpublished doctoral dissertation], Seton Hall University.
- Kumandaş B, Atesken A, Lane J. Misconceptions in biology: A meta-synthesis study of research, 2000–2014. Journal of Biological Education. 2019;53(4):350–364. doi: 10.1080/00219266.2018.1490798.
- Kumar D, Sherwood R. Effect of a problem based simulation on the conceptual understanding of undergraduate science education students. Journal of Science Education and Technology. 2007;16(3):239–246. doi: 10.1007/s10956-007-9049-3.
- Lipsey MW, Wilson DB. Practical meta-analysis. Sage Publications; 2001.
- *Marsteller, R. B. (2017). Making online learning personal: Evolution, evidentiary reasoning, and self-regulatıon in an online curriculum. [Unpublished doctoral dissertation], Lehigh University.
- McDaniel CN, Lister BC, Hanna MH, Roy H. Increased learning observed in redesigned introductory biology course that employed web-enhanced, interactive pedagogy. CBE-Life Sciences Education. 2007;6:243–249. doi: 10.1187/cbe.07-01-0003.
- Mioduser D, Nachmias R, Lahav O, Oren A. Web-based learning environments: Current pedagogical and technological state. Journal of Research on Computing in Education. 2000;33(1):55–76. doi: 10.1080/08886504.2000.10782300.
- Morente L, Morales-Asencio JM, Veredas FJ. Effectiveness of an e-learning tool for education on pressure ulcer evaluation. Journal of Clinical Nursing. 2014;23:2043–2052. doi: 10.1111/jocn.12450.
- Mullen B, Muellerleile P, Bryant B. Cumulative meta-analysis: A consideration of indicators of sufficiency and stability. Personality and Social Psychology Bulletin. 2001;27(11):1450–1462. doi: 10.1177/01461672012711006.
- *Musawi, B. A. K., Salih, S., & Alaridhi, J. (2021). The effects of biotechnology virtual labs approach for enhancing of understanding 1st year university students. Proceedings of the 2021 Palestinian International Conference on Information and Communication Technology (PICICT), 175–179.
- Nischal S, Cabail MZ, Poon K. Combining virtual simulations with take-home projects as a replacement for face-to-face labs in introductory biology laboratory courses. Journal of Biological Education. 2022 doi: 10.1080/00219266.2022.2147206.
- Nugraini SH, Choo KA, Hin HS, Hoon TS. Impact of e-AV biology website for learning about renewable energy. The Turkish Online Journal of Educational Technology. 2013;12(2):376–386.
- Orhan, A. T., & Men, D. D. (2018). Web tabanlı öğretimin fen dersi başarısına ve fen dersine yönelik tutuma etkisi: Bir meta analiz çalışması [The effects of web-based teaching on achievements and attitudes towards science course: A metaanalytic investigation]. Journal of Celal Bayar University Social Sciences, 16(3), 245−284. 10.18026/cbayarsos.465728
- Pei L, Wu H. Does online learning work better than offline learning in undergraduate medical education? A systematic review and meta-analysis. Medical Education Online. 2019;24(1):1666538. doi: 10.1080/10872981.2019.1666538.
- Rahman T, Lewis SE. Evaluating the evidence base for evidence-based instructional practices in chemistry through meta-analysis. Journal of Research in Science Teaching. 2020;57(5):765–793. doi: 10.1002/tea.21610.
- Rao DCH, Saha SK. An immersive learning platform for efficient biology learning of secondary school-level students. Journal of Educational Computing Research. 2019;57(7):1671–1694. doi: 10.1177/0735633119854031.
- Ray S, Srivastava S. Virtualization of science education: A lesson from the COVID-19 pandemic. Journal of Proteins and Proteomics. 2020;11(2):77–80. doi: 10.1007/s42485-020-00038-7.
- Reeves SM, Crippen KJ. Virtual laboratories in undergraduate science and engineering courses: A systematic review, 2009–2019. Journal of Science Education and Technology. 2021;30:16–30. doi: 10.1007/s10956-020-09866-0.
- Regmi K, Jones L. Effect of e-learning on health sciences education: A protocol for systematic review and meta-analysis. Higher Education Pedagogies. 2021;6(1):22–36. doi: 10.1080/23752696.2021.1883459.
- Richmond H, Copsey B, Hall AM, Davies D, Lamb SE. A systematic review and meta-analysis of online versus alternative methods for training licensed health care professionals to deliver clinical interventions. British Medical Education. 2017;17(1):1–14. doi: 10.1186/s12909-017-1047-4.
- Rodríguez, C. L., Mula, J., Segovia, J. D., & Cruz-González, C. (2021). The effects of covid-19 on science education: A thematic review of international research. Journal of Turkish Science Education, 18, Covid-19 Special Issue, 26–45. 10.36681/tused.2021.70
- Rosenberg MJ. E-learning: Strategies for delivering knowledge in the digital age. McGraw-Hill; 2001.
- Santos ML, Prudente M. Effectiveness of virtual laboratories in science education: A meta-analysis. International Journal of Information and Education Technology. 2022;12(2):150–156. doi: 10.18178/ijiet.2022.12.2.1598.
- Schmid RF, Telaro G. Concept mapping as an instructional strategy for high school biology. The Journal of Educational Research. 1990;84(2):78–85. doi: 10.1080/00220671.1990.10885996.
- Sezen-Vekli, G. (2012). İnsan endokrin sistemi konusunda probleme dayalı bilgisayar destekli materyalin geliştirilmesi ve uygulanması [Developing and evaluating problem based computer aided material related to human endocrine system]. [Unpublished doctoral dissertation], Karadeniz Technical University.
- Soubra L, Al-Ghouti MA, Abu-Dieyeh M, Crovella S, Abou-Saleh H. Impacts on student learning and skills and implementation challenges of two student-centered learning methods applied in online education. Sustainability. 2022;14(9625):1–22. doi: 10.3390/su14159625.
- Sitzmann T, Kraiger K, Stewart D, Wisher R. The comparative effectiveness of web-based and classroom instruction: A meta-analysis. Personnel Psychology. 2006;59:623–664. doi: 10.1111/j.1744-6570.2006.00049.x.
- Son J, Narguizian P, Beltz D, Desharnais R. Comparing physical, virtual, and hybrid flipped labs for general education biology. Online Learning. 2016;20(3):228–243. doi: 10.24059/olj.v20i3.687.
- Špernjak A, Šorgo A. Differences in acquired knowledge and attitudes achieved with traditional, computer-supported and virtual laboratory biology laboratory exercises. Journal of Biological Education. 2018;52(2):206–220. doi: 10.1080/00219266.2017.1298532.
- Torum, O. (2003). Development of web based learning environment. Human Resources, 21– 22.
- Tüysüs C, Aydın H. Web tabanlı öğrenmenin ilköğretim okulu düzeyindeki öğrencilerin tutumuna etkisi [Effect of the web based learning on primary school students’ attitudes] Pamukkale Üniversitesi Eğitim Fakültesi Dergisi. 2007;22(22):73–78.
- Ulum H. The effects of online education on academic success: A meta-analysis study. Education and Information Technologies. 2022;27:429–450. doi: 10.1007/s10639-021-10740-8.
- Üstün, U., & Eryılmaz, A. (2014). A research methodology to conduct effective research syntheses: Meta-analysis. Education and Science, 39(174), 1–32. 10.15390/EB.2014.3379
- *Wesolowski, M. C. (2008). Facilitating problem based learning in an online biology laboratory course. [Unpublished doctoral dissertation]. University of Delaware.
- Whittle SR, Bickerdike SR. Online preparation resources help first year students to benefit from practical classes. Journal of Biological Education. 2015;49(2):139–149. doi: 10.1080/00219266.2014.914554.
- Yu WF, She HC, Lee YM. The effects of web-based/non-web-based problem-solving instruction and high/low achievement on students’ problem-solving ability and biology achievement. Innovations in Education and Teaching International. 2010;47(2):187–199. doi: 10.1080/14703291003718927.