PLOS ONE. 2021 Apr 2;16(4):e0249627. doi: 10.1371/journal.pone.0249627

Is project-based learning effective among kindergarten and elementary students? A systematic review

Marta Ferrero 1, Miguel A Vadillo 2,*, Samuel P León 3,*
Editor: Mingming Zhou4
PMCID: PMC8018635  PMID: 33798245

Abstract

Project-based learning (PjBL) is becoming widespread in many schools. However, the evidence of its effectiveness in the classroom is still limited, especially in basic education. The aim of the present study was to perform a systematic review of the empirical evidence assessing the impact of PjBL on the academic achievement of kindergarten and elementary students. We also examined the quality of the studies, their compliance with basic prerequisites for a successful result, and their fidelity to the key elements of PjBL interventions. To this end, we conducted a literature search in January 2020. The inclusion criteria for the review required that studies followed a pre-post design with a control group and quantitatively measured the impact of PjBL on students' content knowledge. The final sample included eleven articles comprising data from 722 students. The studies yielded inconclusive results, had important methodological flaws, and reported insufficient or no information about important aspects of the materials, the procedure, and the key requirements for students and instructors needed to guarantee the success of PjBL. Educational implications of these results are discussed.

Introduction

Over the last decade, numerous institutions have addressed the skills and dispositions that are expected to be vital for schooling in the 21st century. Some of these skills are critical thinking, communication, collaboration, and creativity [1,2]. According to many experts, although the prevailing methods of direct instruction and recitation may be effective for the acquisition of factual knowledge, these skills demand new pedagogical approaches [3]. Within this context, project-based learning (PjBL) and problem-based learning (PBL) have emerged as valuable inquiry approaches to achieve the so-called skills for the 21st century [4].

PjBL and PBL are usually described as active, student-centred methods of instruction that encourage students to work in collaborative groups on real-world questions or challenges to promote the acquisition of higher-order thinking skills, while teachers act as facilitators of learning [4–14]. Despite these common characteristics, PjBL and PBL also present some noticeable differences. For instance, in PjBL learners are expected to follow correct procedures towards a desired end-product or presentation, during which they are likely to encounter different problems [7,9,14], while in PBL the emphasis is on the role of the students in defining the problem and developing a solution [9,15,16]. In addition, while in PBL the solution to the problem is merely suggested, in PjBL it must be executed [7]. Finally, PjBL occurs over an extended time period, while PBL normally lasts a few days [5]. In practice, given the usual difficulties in distinguishing one from the other or in defining their key features [5,9,14], both terms are often employed interchangeably among researchers [4,14] and teachers [17]. Since both approaches are closely related and share a central end, throughout this review we will use the term PjBL to refer to both of them.

PjBL originated in an architecture school in Rome in the 16th century [18]. Forced by organizational and curricular constraints, lectures were moved to weekends and, to minimize the potential lack of motivation among students, teachers decided to use this approach. Later on, dissatisfaction with standard methods in medical education led a large number of medical schools to adopt PjBL [6], which progressively extended to different undergraduate studies [10,15,19]. The main reasons for adopting this approach were student disenchantment and boredom caused by the vast amount of information they had to learn with presumably little impact on daily practice [6]. In general, the quantitative reviews performed in medical schools show that the traditional approach to classroom learning outperforms PjBL in the acquisition of basic science knowledge, while, conversely, PjBL is superior to the traditional approach when it comes to learning clinical problem solving, that is, the application of knowledge [8,20–23] and the ability to link concepts [19,24]. More generally, different studies conducted with undergraduate students have shown that PjBL can help students improve academic achievement [25] and build flexible knowledge [10].

In spite of the promising results of PjBL, some authors have drawn attention to the existing gaps in our knowledge about the conditions under which PjBL can be more beneficial than other approaches [21]. Similarly, researchers have outlined the importance of considering some prerequisites necessary for students and teachers to be successful in higher education when using PjBL. In the case of students, these requisites include the previous acquisition of basic content knowledge about the target problem or project and competence in some learning strategies and skills (e.g., the ability to communicate ideas effectively). For teachers, the requisites include, for instance, proficiency in appropriate teaching strategies and tools (e.g., the provision of adequate scaffolding). If these prerequisites are not met, students might not benefit from PjBL and teachers might not be able to apply it with any guarantee of success [7]. Finally, due to the various ways in which PjBL has been implemented in the classroom, it is important to pay attention to the fidelity with which its main principles are applied when evaluating its impact on learning. Ideally, an intervention faithful to the PjBL approach should include all its essential components as defined in the literature. Otherwise, there is a risk of attributing the (positive or negative) effects of an intervention to PjBL when, in fact, the intervention does not meet the definition of PjBL. As mentioned above, some of the central elements of PjBL are the need for a problem to drive the activities and a final artifact or product; the use of group work methodology; the empowerment of students; the provision of guidance and resources by teachers; and the adoption of evaluation tools adapted to PjBL characteristics (e.g., notebook entries or portfolios).

The effectiveness of PjBL has also been tested at the secondary school level, although to a lesser extent than in medical schools and undergraduate studies. As in undergraduate settings, this approach has been shown to improve the academic achievement of secondary school students in different subjects, such as economics [26,27], history [28], or STEM (Science, Technology, Engineering, and Mathematics) [29–31]; for a review, see [5,25,32]. In spite of these promising results, some researchers have warned of the limited number of scientific studies on PjBL instruction in high school and emphasize the need for more and better research before strong claims can be made about the potential benefits of this approach [5,25,33]. Furthermore, most of the studies conducted to date followed quasi-experimental designs, so the existing evidence on the impact of PjBL at the secondary school level appears to be weak [34].

At present, a growing number of kindergarten and primary schools are introducing PjBL in their classrooms. Moreover, in countries like Spain, the educational authorities of some regions have made the inclusion of PjBL in classroom programmes mandatory [35]. Considering the good results obtained at higher levels, it is reasonable to expect that PjBL would also contribute to promoting the learning of kindergarten and primary students. Nevertheless, due to the considerable differences between senior and novice learners [36], this assumption deserves further analysis. Unlike the cases mentioned above, there is still no systematic review on the efficacy of PjBL exclusively focused on these basic levels of education. To our knowledge, there are two non-systematic reviews and one meta-analysis that have addressed the effectiveness of PjBL at different levels, including to some extent kindergarten and primary education. The first one focuses on the effect of PjBL on students from kindergarten to grade 12 and includes both quantitative and qualitative studies [11]. The second one is an overview of the effectiveness of PjBL from preschool to higher education and pre-service teacher training [12]. And the third one quantitatively analyses the impact of PjBL on academic achievement in comparison with traditional teaching, from third-grade elementary school to senior college students, and explores what study features might moderate this effect [25]. Overall, these studies conclude that PjBL is an effective means of teaching content information. However, in all cases, important pieces of information are missing from the studies analysed. For instance, none of the reviews assess the level of student and instructor compliance with the basic requirements of PjBL. Similarly, the fidelity of interventions to the main principles of PjBL is not analysed. Finally, only one of the studies [11] analyses the information related to the quality of the primary studies.
Considering that the authors of these reviews have highlighted the need for better and more detailed research, it seems advisable to report and discuss this type of information more thoroughly. Without this information, it is difficult, if not impossible, to draw firm conclusions about the effectiveness of PjBL for kindergarten and elementary school students.

The main objective of the present study was to perform a systematic review of the effect of PjBL on the acquisition of content knowledge in kindergarten and primary students, including as much relevant information as possible on methodological and conceptual aspects. Specifically, we examined the quality of existing studies, their compliance with basic prerequisites for a successful PjBL intervention, and the fidelity of the interventions to the key elements of PjBL, as reported in the literature.

Method

Search procedures

The present systematic review follows the PRISMA recommendations. On January 23rd, 2020, the first author (MF) performed an electronic search on Web of Science, PsycInfo, and ERIC, entering the terms “(project based OR problem based) AND (learning OR intervention OR approach OR instruction)” into the Topic field. The search was limited to (a) articles in English, (b) published between 1900 and 2020, (c) with categories restricted to “education/educational research” and “psychology”. Unpublished dissertations, reviews, and meta-analyses were excluded at this stage. After removing 523 duplicates, this initial search yielded a sample of 34,246 studies.

The titles and abstracts of these studies were screened by MF using the inclusion criteria c1-c5 explained below. This resulted in the exclusion of 34,208 studies that did not meet the inclusion criteria. MF and SPL independently read the full text of the remaining 38 studies to verify that they fulfilled criteria c1-c5. Among the initial set of 38 articles assessed for eligibility, nine articles met the inclusion criteria. Thereupon, we performed descendancy searches of articles citing or cited by these nine papers to identify additional studies. The titles and abstracts of the second search were screened by MF, and this resulted in 16 full-text articles that were also independently read by MF and SPL. No additional study was selected from this set. Finally, at the request of an anonymous reviewer, we added two extra studies included in a meta-analysis. Therefore, the final sample of articles reviewed for inclusion comprised eleven articles (see Table 1) [37–47]. Fig 1 shows a PRISMA flowchart summarizing the literature search process. Across all the full-text articles read for inclusion, the initial inter-rater agreement was 98.31%. Disagreements were resolved by discussion and consensus between the two researchers until there was 100% agreement.
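The inter-rater agreement figures reported throughout this review are percent agreement: the share of coding decisions on which both raters concurred. A minimal sketch of that calculation, using hypothetical screening decisions (the per-article codings of MF and SPL are not reported in the paper):

```python
def percent_agreement(rater_a, rater_b):
    """Share of items (in %) on which both raters made the same decision."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Raters must code the same number of items")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100 * matches / len(rater_a)

# Hypothetical include/exclude decisions for five articles; they disagree
# only on the third one.
mf = ["include", "exclude", "exclude", "include", "exclude"]
spl = ["include", "exclude", "include", "include", "exclude"]

print(f"{percent_agreement(mf, spl):.2f}%")  # → 80.00%
```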

Table 1. Articles that met inclusion and quality criteria.

Authors, year | Country | Sample size (E/C) | Age (mean) | Sample type | Educational level | School type | Instructor | Duration | Dependent variable | Tests to measure DV | Results
Alacapinar, 2008 [37] | Turkey | 42 (21/21) | 11.4 years | normal population | 5th grade | n.s. | n.s. | n.s. | Cognitive domain | n.s. | PjBL group significantly outperformed the control group.
Aral et al., 2010 [38] | Turkey | 28 (14/14) | 6 years | normal population | Preschool education | n.s. | n.s. | 12 weeks (1 day per week) | Children’s conceptual development and school readiness composite | Bracken Basic Concept Scale-Revised | No differences between PjBL and control group.
Aslan, 2013 [39] | Turkey | 47 (24/23) | 6 years | normal population | Preschool education | public school | Teacher | 12 weeks (3 days per week) | Categorization skills | A categorization test | PjBL group significantly outperformed the control group.
Çakici et al., 2013 [46] | Turkey | 44 (22/22) | n.s. | normal population | 5th grade | public school | Teacher and researcher | 5 weeks | Science knowledge | The Light and Sound Achievement Test | PjBL group significantly outperformed the control group.
Can et al., 2017 [40] | Turkey | 26 (17/9) | 6 years | normal population | Preschool education | n.s. | Teacher | 32 weeks | Scientific process skills and conceptions | Preschool Scientific Process Skills Scale | No comparison reported between PjBL and control group.
Gültekin, 2005 [41] | Turkey | 40 (20/20) | n.s. | normal population | 5th grade | n.s. | n.s. | 3 weeks (6 hours per week) | Achievement in social studies | An achievement test | No quantitative data reported.
Hastie et al., 2017 [42] | USA | 185 (109/76) | 10.6 years | normal population | 5th grade | rural school | Teacher and researcher | 9 weeks | Fitness knowledge | Fitness Knowledge Test | PjBL group significantly outperformed the control group.
Karaçalli et al., 2014 [43] | Turkey | 143 (73/70) | 9–11 years | normal population | 4th grade | n.s. | Teacher and researcher | 4 weeks | Science knowledge | Electricity in Our Life Achievement Test (ELACH), Science Course Attitude Scale (ELATT) | PjBL group significantly outperformed the control group.
Kucharski et al., 2005 [47] | USA | 61 (30/31) | n.s. | normal population | 1st, 3rd, and 4th grade | n.s. | Teacher | n.s. | Science knowledge | Terra Nova Scale | PjBL group significantly outperformed the control group (except in 4th grade).
Lin, 2015 [44] | Taiwan | 56 (28/28) | 11 years | normal population | 5th grade | public school | Teacher | 12 weeks (40 min per week) | Vocabulary knowledge | Vocabulary knowledge test | No differences between PjBL and control group.
Zumbach et al., 2004 [45] | Germany | 50 (24/26) | 10.1 years | normal population | 4th grade | n.s. | Teacher and computer | n.s. | Forest animals knowledge | A knowledge test | No short-term differences between PjBL and control group, but a long-term advantage for PjBL.

Note: DV: dependent variable; E: experimental group; C: control group; n.s.: not specified.

Fig 1. PRISMA flowchart.

Fig 1

Flow of information through the different phases completed in the systematic review.

Selection criteria

The studies were only included if they met the following criteria: c1) the aim was to evaluate the effect of PjBL on content knowledge; c2) they followed a pre-post design with a control group; c3) the target sample comprised students from kindergarten to grade 6; c4) they were written in English; and c5) they were peer-reviewed. Therefore, narrative and systematic reviews, doctoral dissertations, posters, registered study protocols, commentaries, books and book chapters, essays, and other theoretical reports were excluded from the review.

Data extraction and coding

The eleven studies that met the inclusion criteria were independently examined in depth and coded by MF and SPL. They recorded information related to general aspects (authors, year of publication, and journal), participants (country of origin, sample size, age, educational level, and school type), method (design, duration, dependent variable, and measuring tools), and the main results obtained by each study.

In order to overcome important shortcomings of the reviews mentioned above, we used the quality scale developed by [48], with just one modification (see below). Very briefly, the original 17-item scale covers the quality of various methodological aspects of empirical research, such as randomisation, blinding, replicability, or test validity (see Fig 2). Each item could be assigned one of three values: positive, negative, or unknown. For each study, MF and SPL independently assigned a value to each item, reaching an initial agreement of 98.30%. Disagreements were resolved through discussion until 100% consensus was reached. Fig 2 shows the values assigned to each item and study.

Fig 2. Scale of quality and values assigned to each item.

Fig 2

Summary of the items which comprise the scale of quality and values assigned to each of them in each study.

The quality scale used in [48] was originally created to assess educational interventions inspired by the multiple intelligences theory. Unlike research in that field, the PjBL literature offers a wealth of information on the basic prerequisites that both students and instructors should meet for PjBL to be successful, as well as on the key principles that characterize this approach. Therefore, for this study, we removed Item 6 from the original scale (referring to intervention fidelity) and replaced it with a full new scale intended to analyze both the compliance of teachers and students with the basic prerequisites of PjBL and the fidelity of the intervention to the principles underlying this method. This new scale consists of 30 items divided into two parts. Part A refers to the prerequisites and Part B to intervention fidelity. The 14 items in Part A are grouped into three categories. Items a1 to a6 belong to the category “Previous training of students in group work”, Item a7 to “Measurement of prior knowledge of students”, and Items a8 to a14 to “Teacher training in PjBL”. The 16 items of Part B are grouped into seven categories. Item b1 belongs to the category “Realism of the matter raised”, Item b2 to “Existence or not of a final product”, Item b3 to “Inclusion or not of group work”, Items b4 to b7 to “Scaffolding by the teacher during learning”, Item b8 to “Autonomy granted to students when making decisions about the project”, Items b9 to b12 to “Correct evaluation tools employed”, and Items b13 to b16 to “Explicit practice of metacognitive skills”. The categories that make up the scale were elaborated based on the principles suggested by reference review works in this field [5,7,9,11,14]. For the sake of consistency, each category in the scale had to be mentioned by at least two of these reference sources. As in the quality scale mentioned above, each item could receive one of three values.
MF and SPL independently scanned all the studies and assigned a value to each item, reaching an initial agreement of 99.63%. Disagreements were resolved by discussion and consensus between the two researchers until there was 100% agreement. Fig 3 shows a detailed description of the values assigned to each item.

Fig 3. Scale of prerequisites and intervention fidelity.

Fig 3

Summary of the prerequisites necessary for students and teachers for a successful PjBL adoption and intervention fidelity criteria in light of the key elements of the method.

Results

Description of the studies

Table 1 provides a detailed summary of the eleven studies included in this review. Overall, many of the coded elements showed substantial heterogeneity, such as sample size, age of participants, duration of the interventions, or reported outcomes. Specifically, the total sample consisted of 722 participants, aged between 6 and 11 years. Among them, 101 were kindergarten students and 621 were first- to sixth-grade students. The interventions lasted between 3 and 32 weeks. The subjects covered included science, mathematics, and English.

As can be seen in Table 1, most of the studies included in this review reported positive effects of PjBL on academic achievement. More precisely, six studies found that students taught through PjBL significantly outperformed students taught with other methods; two studies found no significant differences between PjBL and other types of training; one study found no short-term differences but a long-term advantage for PjBL; one study reported no comparison between the experimental and control groups; and the remaining study reported no quantitative data.

Quality scale

Fig 2 shows the results of the quality assessment of the eleven studies included in the review. Across all items, 28.98% were rated as positive, 41.48% as negative, and 29.55% as unknown. Most of the studies followed a quasi-experimental design (Items 2 and 3) and did not include an active control group (Item 10). None of the studies guaranteed blinding of participants, instructors, and evaluators, or else they reported no information on this matter (Items 4 to 6). Similarly, most of the studies failed to provide enough information to replicate the intervention or the dependent variable (Items 11 and 12), and no study reported the validity of the latter (Item 14). None of the studies had been preregistered or had made the data publicly available on the Internet (Items 1 and 16). In contrast, most of the studies confirmed the similarity of the experimental and control groups in terms of socio-economic characteristics (Item 7). Likewise, most of the studies reported the analysis of pre-test scores in experimental and control groups (Item 8) and analyzed the differences between them (Item 15).

Prerequisites and intervention fidelity scale

Fig 3 shows a summary of the information related to the prerequisites and intervention fidelity of the studies. As can be seen, overall 20.61% of the items obtained positive values, 0.61% obtained negative values, and 78.79% were labeled as unknown. Most studies offered little or no information to assess the items related to compliance with prerequisites (Part A). Only one study reported specific information about the training of students in group work (Items a1 to a6), and it focused exclusively on the ability to discuss ideas (Item a1). Similarly, just two studies mentioned the training of teachers in PjBL (Items a8 to a14), but neither provided any information about the content of this training and, consequently, they were coded as “unknown”. Finally, information on students’ prior knowledge before starting the project was reported in three studies (Item a7).

In comparison, the studies reported more information about intervention fidelity (Part B). Overall, 51.82% of the items obtained positive values, 1.82% obtained negative values, and 46.36% were labeled as unknown. Items b1-b3, coding for the realism of the problem, the existence of a final product, and the inclusion of group work, were relatively well reported and received positive scores. Within the items focused on scaffolding, Items b4 and b6 were met by more than half of the studies, while Item b5 was only reported by two studies and Item b7 was not addressed in any study. Item b8, related to the autonomy provided to students, was well reported by more than half of the studies, but, importantly, two of them received negative scores. The rest of the items, related to the appropriateness of evaluation tools (Items b9-b12) and to the explicit practice of meta-cognitive skills (Items b13-b16), were generally reported with insufficient detail, except for Item b16, for which six studies obtained positive scores. Overall, the information about intervention fidelity was often reported too vaguely and had to be inferred indirectly from information scattered throughout the papers. For example, in the study of Alacapinar (2008) [37], Item b13, related to planning skills, was inferred on the basis of the following statement: "[Students] learned by experience how important it is to plan work and accomplish it in a given time" (p. 28).
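The percentages reported in this section are simple proportions over all item-by-study codings. A minimal sketch of that tally (the rating list below is hypothetical, not the actual codings shown in Figs 2 and 3):

```python
from collections import Counter

# Hypothetical codings for ten item-by-study cells; the actual values
# appear in Figs 2 and 3 of the review.
ratings = ["positive", "negative", "unknown", "unknown", "negative",
           "positive", "unknown", "negative", "unknown", "unknown"]

counts = Counter(ratings)
for value in ("positive", "negative", "unknown"):
    share = 100 * counts[value] / len(ratings)
    print(f"{value}: {share:.2f}%")
```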

Discussion

PjBL is a student-centered methodology that promotes the acquisition of higher-order thinking skills through the solution of real problems in collaborative groups, with limited guidance from the teacher [6,9,10]. Although this approach has become the cornerstone of innovative movements in many schools [49,50], the evidence supporting its effectiveness in the classroom is still scarce [5,51]. The objective of the present review was to assess the available evidence about the impact of PjBL on the acquisition of content knowledge by kindergarten and primary students.

The articles examining the impact of PjBL on kindergarten and primary students were scarce and, overall, yielded mixed results. Specifically, seven of the 11 studies included in this review obtained positive results regarding the impact of PjBL on the academic achievement of students, either immediately after the intervention [37,39,42,43,46,47] or in the long term [45]. Among the rest, two studies did not find significant differences between the experimental and control groups [38,44], one study did not report any quantitative data [41], and another one offered no comparison between the experimental and control groups [40]. In addition, the studies showed considerable heterogeneity in terms of participants’ age (from preschoolers to 11-year-old students), duration of the intervention (ranging from 3 to 32 weeks), and measured outcomes (e.g., categorization skills or vocabulary knowledge). This hinders the generalization of the results to the entire school population.

Along with the mixed results obtained, an in-depth analysis of the studies showed important shortcomings that deserve more attention in future research. Firstly, there is room for improvement in the methodological quality of the studies. For instance, none of them followed an experimental design and only two included an active control group. These deficiencies make it hard to draw meaningful conclusions from the results. In fact, if all of these results had been collated in a quantitative meta-analysis without a proper analysis of their quality, most likely the conclusions would have been misleadingly positive. Without more and better evidence, it is difficult to assess whether PjBL is effective for kindergarten and elementary school students.

Secondly, information concerning important aspects of the materials and procedure was usually not reported or, when reported, revealed suboptimal methods, compromising the replicability of the studies. For example, many authors did not provide information about the tests used to measure the outcomes, the specific activities performed, or the intervention materials. Even if PjBL were successful, in all these cases it would be impossible to bring the intervention proposals to the classroom. This becomes more concerning if we consider the lack of a universally accepted model of PjBL [14]. In the same vein, none of the studies granted access to the data, which means that the reproducibility of the results cannot be verified by independent researchers.

Thirdly, few studies reported sufficient information to ensure that the interventions met the requirements for students and instructors needed to guarantee the success of PjBL. For instance, only three studies measured students' prior knowledge before the intervention, only one offered information about students’ training in group work, and none described the training that instructors received in PjBL, if any. The importance of these elements is often highlighted in the literature [5,7,9,15,52]. Indeed, a recent literature review [33] highlighted two of them as essential for the success of PjBL: effective group work among pupils and support for teachers through regular networking and professional development opportunities. Given the lack of detailed information on these aspects in the studies included in this review, it is impossible to weigh the contribution of these factors to the final results.

Finally, regarding intervention fidelity, although several of the key components of PjBL were well covered in the majority of studies (e.g., the use of real-world problems, the elaboration of a final product, or collaborative work), others were largely neglected (e.g., the amount of guidance provided to students during the intervention, the evaluation tools used by teachers, or the training in metacognitive skills). As in the case of the prerequisites mentioned above, a considerable volume of research has stressed the importance of considering these elements in PjBL, including the monitoring of students [10,15,33,53] or the employment of adequate assessment tools to measure the progress of pupils [11,33]. These information gaps make it impossible to determine which elements are decisive for the effectiveness of this kind of intervention and, at the same time, hamper the distinction between PjBL and other educational interventions. This concern has been raised in previous reviews [21,54].

Classroom implications and future research

PjBL provides highly desirable benefits for students, such as the creation of independent, self-regulated learners [12,55,56], the promotion of engagement with learning [9,50,57–60], or the fostering of meaningful learning [50,61]. However, more and better evidence is needed about how and when PjBL is most suitable. For the purpose of this review, it is relevant to consider that the majority of studies assessing the impact of PjBL on learning have been aimed at higher education students [9]. But what is effective in a secondary or a postsecondary setting may not transfer directly to kindergarten and primary students [25]. It is worth reflecting on the suitability of this approach for younger students. Specifically, it should not be assumed that novice learners possess the advanced self-regulation skills, prior knowledge, or group work skills (for example, the abilities needed to discuss ideas, consider alternatives, or compare different points of view) necessary for PjBL [7,9,36]. Hence, any attempt to translate the main results and conclusions of this literature to kindergarten and primary students should be properly monitored. Apart from the educational stage, little is known about how different learning profiles might make PjBL more or less effective, as in the case of learners with different educational backgrounds or those with learning disabilities [9,14,25]. The studies included in this review do not contribute to this question, since all of their samples consisted of students without special needs.

Last but not least, we think that future research in this domain should try to overcome the shortcomings we encountered in conducting this review. These include the lack of active control groups; the lack of randomly assigned participants; inappropriate blinding of participants, instructors, and evaluators; unvalidated measures of the learning outcomes; and the lack of detailed information needed to replicate the study (e.g., activities conducted, approximate duration of each session, or evaluation tools used). Future research should also address the impact of some basic prerequisites for students (e.g., group work skills) and teachers (e.g., evaluation tools) on PjBL interventions. Similarly, it would be advisable for researchers to provide detailed information about the fidelity of the intervention to the key features of PjBL [11,14,33]. These important shortcomings should also be taken into consideration in interpreting or applying already published interventions.

The academic success of many students, especially those with learning difficulties, depends largely on the use of methods that have proven to be consistently effective [62]. For this to be possible, the incorporation of research findings into the decision-making process, along with the tacit knowledge, values, and thoughts of educators, becomes indispensable. The adoption of this approach, known as research-informed practice, is a daunting challenge and involves many different actors and stakeholders [63,64]. In view of the above, researchers can surely contribute to this aim by providing more and better evidence on the conditions under which PjBL is effective.

Limitations of the present review

The main limitation of the present review is the small number of studies found. Of the almost 40 full-text articles initially screened for eligibility, just nine met the selection criteria. Similarly, of the 30 studies included in the meta-analysis suggested by an anonymous reviewer, just two were ultimately added. In light of this, it is difficult to draw firm conclusions about the effectiveness of PjBL for kindergarten and elementary school students, beyond highlighting shortcomings that should be addressed in future research. A wider search, perhaps not limited to peer-reviewed articles (e.g., including papers presented at conferences), might have yielded more results, although this option would most likely have diminished the average quality of the final sample of studies. Moreover, given that PjBL is particularly recommended for the development of domain-general skills [5,9], it would have been interesting to test the impact of PjBL not only on academic achievement but also on higher-order skills, such as problem solving, critical thinking, deep understanding, or self-evaluation.

Supporting information

S1 checklist

(DOC)

Data Availability

All relevant data are reported within the paper. As the present systematic review does not include a quantitative meta-analysis, there are no additional datafiles, beyond the information reported in the results section.

Funding Statement

Funded by Agencia Estatal de Investigación with grant number PSI2017-85159-P (MAV), Comunidad de Madrid (ES) with grant number 2016-T1/SOC-1395 (MAV) and Comunidad de Madrid (ES) with grant number 2020-5A/SOC-19723 (MAV). Funders did not play any role in any phase of the study.

References

  • 1.Lamb S, Maire Q, Doecke E. (NSW Department of Education). Key skills for the 21st Century: An evidence-based review. Final report. Melbourne: Victoria University, Center for International Research on Education Systems (CIRES); 2017 Aug. [cited 2020 May 28]. Available from http://vuir.vu.edu.au/35865/1/Key-Skills-for-the-21st-Century-Analytical-Report.pdf.
  • 2.National Education Association (2012). Preparing 21st century students for a global society. Washington, D.C. (US): The Association; 2012 March.
  • 3.Gaikwad SS, Baharathi SV. An exploratory study on the application of multiple intelligences to MBA andragogy with particular reference to ERP-controlling configuration course. International Journal of Information and Communication Technology Education. 2018;14:58–71. [Google Scholar]
  • 4.Jensen KJ. A meta-analysis of the effects of problem- and project-based learning on academic achievement in grades 6–12 populations. Education Dissertations [Internet]. 2015 Aug [cited 2020 May 28];7. Available from http://digitalcommons.spu.edu/soe_etd/7. [Google Scholar]
  • 5.Angelle S. Project-based and Problem-based Instruction: A Literature Review [dissertation]. Bowling Green (KY): Western Kentucky University; 2018.
  • 6.Barrows H S. Problem‐based learning in medicine and beyond: A brief overview. New directions for teaching and learning. 1996;68: 3–12. [Google Scholar]
  • 7.Blumenfeld PC, Soloway E, Marx RW, Krajcik JS, Guzdial M, Palincsar A. Motivating project-based learning: Sustaining the doing, supporting the learning. Educ Psychol. 1991;26:369–398. [Google Scholar]
  • 8.Dochy F, Segers M, Van den Bossche P, Gijbels D. Effects of problem-based learning: A meta-analysis. Learn Instr. 2003;13:533–568. [Google Scholar]
  • 9.Harmer N, Stokes A. The benefits and challenges of project-based learning: A review of the literature. Plymouth, MA: Pedagogic Research Institute and Observatory (PedRIO); 2014 [cited 2020 May 28]. Available from https://www.plymouth.ac.uk/uploads/production/document/path/2/2733/Literature_review_Project-based_learning.pdf.
  • 10.Hmelo-Silver CE. Problem-based learning: What and how do students learn? Educ Psychol Rev. 2004;16:235–266. [Google Scholar]
  • 11.Holm M. Project-based instruction: A review of the literature on effectiveness in prekindergarten through 12th grade classrooms. River Academic Journal. 2011;7:1–13. [Google Scholar]
  • 12.Kokotsaki D, Menzies V, Wiggins A. Project-based learning: A review of the literature. Improving Schools. 2016:1–11. [Google Scholar]
  • 13.Krajcik JS, Blumenfeld PC, Marx RW, Soloway E. A collaborative model for helping middle grade science teachers learn project-based instruction. Elem School J. 1994;94:483–497. [Google Scholar]
  • 14.Thomas JW. A review of research on project-based learning. San Rafael (CA): The Autodesk Foundation; 2000 March:1–45. [Google Scholar]
  • 15.Savery J. Overview of PBL: Definitions and distinctions. Interdisciplinary Journal of Problem-Based Learning, 2006;1. [Google Scholar]
  • 16.Sungur S, Tekkaya C. Effects of problem-based learning and traditional instruction on self-regulated learning. J Educ Res. 2006;99:307–320. [Google Scholar]
  • 17.Larmer J. Project based learning vs. problem based vs. XBL. PBL Blog [Internet]. US: Buck Institute for Education; 2013 Nov [cited 2020 May 28]. Available from https://www.pblworks.org/blog/project-based-learning-vs-problem-based-learning-vs-xbl.
  • 18.Knoll M. The project method: Its vocational education origin and international development. Journal of Industrial Teacher Education. 1997;34:59–80. [Google Scholar]
  • 19.Walker A, Leary H. A problem-based learning meta-analysis: Differences across problem types, implementation types, disciplines, and assessment levels. Interdisciplinary Journal of Problem-Based Learning. 2009;3:12–43. [Google Scholar]
  • 20.Albanese MA, Mitchell S. Problem-based learning: A review of literature on its outcomes and implementation issues. Acad Med. 1993;68:52–81. 10.1097/00001888-199301000-00012 [DOI] [PubMed] [Google Scholar]
  • 21.Newman M. A pilot systematic review and meta-analysis on the effectiveness of problem-based learning. Newcastle, UK: Learning & Teaching Subject Network; 2003. April. [Google Scholar]
  • 22.Vernon DT, Blake RL. Does problem-based learning work? A meta-analysis of evaluative research. Acad Med. 1993;68:550–563. 10.1097/00001888-199307000-00015 [DOI] [PubMed] [Google Scholar]
  • 23.Colliver JA. Effectiveness of problem-based learning curricula: Research and theory. Acad Med. 2000;75:259–266. 10.1097/00001888-200003000-00017 [DOI] [PubMed] [Google Scholar]
  • 24.Gijbels D, Dochy F, Van den Bossche P, Segers MR. Effects of problem-based learning: A meta-analysis from the angle of assessment. Rev Educ Res. 2005;75:27–61. [Google Scholar]
  • 25.Chen C-H, Yang Y-C. Revisiting the effects of project-based learning on students’ academic achievement: A meta-analysis investigating moderators. Educ Res Rev. 2019;26:71–81. [Google Scholar]
  • 26.Mergendoller JR, Maxwell NL, Bellisimo Y. The effectiveness of problem-based instruction: A comparative study of instructional methods and student characteristics. Interdisciplinary Journal of Problem-Based Learning. 2006;1:49–69. [Google Scholar]
  • 27.Mergendoller JR, Maxwell NL, Bellisimo Y. Comparing problem-based learning and traditional instruction in high-school economics. J Educ Res. 2000;93:374–382. [Google Scholar]
  • 28.Hernández-Ramos P, De La Paz S. Learning history in middle school by designing multimedia in a project-based learning experience. Journal of Research on Technology in Education. 2009;42:151–173. [Google Scholar]
  • 29.Al-Balushi SM, Al-Aamri SS. The effect of environmental science projects on students’ environmental knowledge and science attitudes. International Research in Geographical and Environmental Education. 2014;23:213–227. [Google Scholar]
  • 30.Boaler J. Open and closed mathematics: Student experiences and understandings. J Res Math Educ. 1998; 29: 41–62. [Google Scholar]
  • 31.Schneider WH, Krajcik J, Marx RW, Soloway E. Performance of students in project-based science classrooms on a national measure of science achievement. J Res Sci Teach. 2002;39:410–422. [Google Scholar]
  • 32.Fatih Ayaz M, Söylemez M. The effect of the project-based learning approach on the academic achievements of the students in science classes in Turkey: A meta-analysis study. Egit Bilim. 2015;40:255–283. [Google Scholar]
  • 33.Condliffe B, Quint J, Visher MG, Bangser MR, Drohojowska S, Saco L, et al. Project-based learning: A literature review. New York: MDRC; 2017. [cited 2020 May 28]. Available from https://www.mdrc.org/sites/default/files/Project-Based_Learning-LitRev_Final.pdf. [Google Scholar]
  • 34.Menzies V, Hewitt C, Kokotsaki D, Collyer C, Wiggins A. (2016). Project Based Learning: Evaluation Report and Executive Summary. London (UK): Education Endowment Foundation; 2016. October. [Google Scholar]
  • 35.Basque Department of Education [Internet]. Basque Country: Heziberri; c2020 [cited 2020 May 28]. Training materials. Available from http://heziberri.berritzegunenagusia.eus/1-material-didactico/.
  • 36.Willingham DT. Why students don’t like school? A cognitive scientist answers questions about how the mind works and what it means for the classroom. 1st ed. San Francisco: Jossey-Bass; 2009. [Google Scholar]
  • 37.Alacapınar F. Effectiveness of project-based learning. Egit Aras. 2008;33:17–34. [Google Scholar]
  • 38.Aral N, Kandir A, Ayhan A B, Yaşar MC. The influence of project-based curricula on six-year-old preschoolers’ conceptual development. Social Behavior and Personality: An International Journal. 2010;38:1073–1079. [Google Scholar]
  • 39.Aslan D. The effects of a food project on children’s categorization skills. Social Behavior and Personality: An International Journal. 2013;41:939–946. [Google Scholar]
  • 40.Can B, Yıldız-Demirtaş V, Altun E. The effect of project-based science education programme on scientific process skills and conceptions of kindergarten students. J Balt Sci Edu. 2017;16:395–413. [Google Scholar]
  • 41.Gültekin M. The effect of project based learning on learning outcomes in the 5th grade social studies course in primary education. Educ Sci-Theor Pract. 2005;5:548–556. [Google Scholar]
  • 42.Hastie PA, Chen S, Guarino AJ. Health-related fitness knowledge development through project-based learning. J Teach Phys Educ. 2017;36:119–125. [Google Scholar]
  • 43.Karaçalli S, Korur F. The effects of project‐based learning on students’ academic achievement, attitude, and retention of knowledge: The subject of “electricity in our lives”. School Science and Mathematics. 2014;114:224–235. [Google Scholar]
  • 44.Lin LF. The impact of problem-based learning on Chinese-speaking elementary school students’ English vocabulary learning and use. System. 2015;55:30–42. [Google Scholar]
  • 45.Zumbach J, Kumpf D, Koch SC. Using Multimedia to Enhance Problem-Based Learning in Elementary School. Information Technology in Childhood Education Annual. 2004;16:25–37. [Google Scholar]
  • 46.Çakici Y, Türkmen N. An investigation of the effect of project-based learning approach on children’s achievement and attitude in Science. TOJSAT. 2013;3:9–17. [Google Scholar]
  • 47.Kucharski GA, Rust JO, Ring TR. Evaluation of the ecological, futures, and global (EFG) curriculum: A project-based approach. Education. 2005;125:652–661. [Google Scholar]
  • 48.Ferrero M, Vadillo MA, León SP. The Theory of Multiple Intelligences into practice: A systematic review and meta-analysis. Submitted for publication 2020.
  • 49.Filippatou D, Kaldi S. The effectiveness of project-based learning on pupils with learning difficulties regarding academic performance, group work and motivation. International Journal of Special Education. 2010;25:17–26. [Google Scholar]
  • 50.Ravitz J. Summarizing findings and looking ahead to a new generation of PBL research. The Interdisciplinary Journal of Problem-based Learning. 2009;3:4–11. [Google Scholar]
  • 51.David J. L. What education says about problem-based learning. Educational Leadership. 2008;5:80–82. [Google Scholar]
  • 52.Cervantes B, Hemmer L, Kouzekanani K. The impact of project-based learning on minority student achievement: Implications for school redesign. NCPEA Education Leadership Review of Doctoral Research. 2015;2:50–66. [Google Scholar]
  • 53.Barron B, Schwartz DL, Vye NJ, Moore A, Petrosino A, Zech L, et al. , The Cognition and Technology Group at Vanderbilt. Doing with understanding: Lessons from research on problem- and project-based learning. The Journal of the Learning Sciences. 1998;7:271–311. [Google Scholar]
  • 54.Hasni A, Bousadra F, Belletête V, Benabdallah A, Nicole M-C, Dumais N. Trends in research on project-based science and technology teaching and learning at K–12 levels: A systematic review. Stud Sci Educ. 2016;52:199–231. [Google Scholar]
  • 55.Barak M, Shachar A. Projects in technology education and fostering learning: The potential and its realization. J Sci Educ Technol. 2008;17: 285–296. [Google Scholar]
  • 56.Donnelly R, Fitzmaurice M. Collaborative project-based learning and problem-based learning in higher education: A consideration of tutor and student role in learner-focused strategies. In: O’Neill G, Moore S, McMullin B, editors. Emerging issues in the practice of university learning and teaching. Dublin: AISHE/HEA; 2005. p. 87–98. [Google Scholar]
  • 57.Cornell NA, Clarke JH. The cost of quality: Evaluating a standards-based design project. NASSP Bulletin. 1999;83:91–99. [Google Scholar]
  • 58.Duke NK. Project-based instruction: A great match for informational texts. American Educator. 2016;40:4–11, 42. [Google Scholar]
  • 59.Liu M, Hsiao YP. Middle school students as multimedia designers: A project-based learning approach. Journal of Interactive Learning Research. 2002;13:311–337. [Google Scholar]
  • 60.Wurdinger S, Haar J, Hugg R, Bezon J. A qualitative study using project-based learning in a mainstream middle school. Improving Schools. 2007;10:150–161. [Google Scholar]
  • 61.Barron B, Darling-Hammond L. Teaching for meaningful learning: A review of research on inquiry-based and cooperative learning [Book Excerpt]. California: The George Lucas Educational Foundation; 2008. [Google Scholar]
  • 62.Cook BG, Cook L. Bringing science into the classroom by basing craft on research. J Learn Disabil. 2004;37:240–247. 10.1177/00222194040370030901 [DOI] [PubMed] [Google Scholar]
  • 63.Levin B. Mobilising research knowledge in education. London Review of Education. 2011;9:15–26. [Google Scholar]
  • 64.Nelson J, Mehta P, Sharples J, Davey C. Measuring Teachers’ Research Engagement: Findings from a Pilot Study. London (UK): Education Endowment Foundation; 2017. March. [Google Scholar]

Decision Letter 0

Juan Cristobal Castro-Alonso

22 Jul 2020

PONE-D-20-21356

Is project-based learning effective among kindergarten and elementary students? A systematic review

PLOS ONE

Dear Dr. Vadillo,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we have decided that your manuscript does not meet our criteria for publication and must therefore be rejected.

Specifically:

Although the criteria have mostly been met, for example, methods and analyses are performed to a high technical standard and are described in sufficient detail (PRISMA), Criterion 4 (discussion is presented in an appropriate fashion and is supported by the data) needs major work. My suggestions for improvement before sending the proposal to another journal are: 

1) Both introduction and discussion sections should be supported by more recent papers, ideally from the WoS system. There were few articles meeting these high standards (e.g., Hasni et al., 2016). 

2) Literature predicting small effectiveness of unguided PBL should be cited to make a more compelling case supporting the results, such as:

- Mayer, R. E. (2004). Should there be a three-strikes rule against pure discovery learning? American Psychologist, 59(1), 14-19. doi: 10.1037/0003-066x.59.1.14

- Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75-86. doi: 10.1207/s15326985ep4102_1

I am sorry that we cannot be more positive on this occasion, but hope that you appreciate the reasons for this decision.

Yours sincerely,

Juan Cristobal Castro-Alonso, Ph.D.

Academic Editor

PLOS ONE


PLoS One. 2021 Apr 2;16(4):e0249627. doi: 10.1371/journal.pone.0249627.r002

Author response to Decision Letter 0


4 Dec 2020

Comment from the editor: Literature predicting small effectiveness of unguided PBL should be cited to make a more compelling case supporting the results, such as:

- Mayer, R. E. (2004). Should there be a three-strikes rule against pure discovery learning? American Psychologist, 59(1), 14-19. doi: 10.1037/0003-066x.59.1.14

- Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75-86. doi: 10.1207/s15326985ep4102_1

Response: Thank you for your suggestions. We have added both articles in the Discussion.

Comment from the editor: Both introduction and discussion sections should be supported by more recent papers, ideally from the WoS system. There were few articles meeting these high standards (e.g., Hasni et al., 2016).

Response: The systematic review conducted in WoS, PsycInfo, and ERIC did not yield more articles than the ones included in the previous version.

Decision Letter 1

Mingming Zhou

25 Jan 2021

PONE-D-20-21356R1

Is project-based learning effective among kindergarten and elementary students? A systematic review

PLOS ONE

Dear Dr. Vadillo,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Mar 07 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,

Mingming Zhou, Ph.D.

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

(1) Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

(2) Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

Reviewer #2: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Partly

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: N/A

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: 1.Please write the year of the selection paper in both the abstract and the manuscript.

2.Please write the research sample background (outlining in which areas).

3.Please describe the originality/value of this research.

4.Please detail the effectiveness of project-based learning for kindergarten and elementary school students regarding their academic performance in the introduction.

Reviewer #2: This paper reports on an interesting topic, and I believe this research can contribute to a better understanding of project-based learning in K–6 practices. Although this paper, in general, is well written, several problems or issues in this paper need to be addressed.

1. The authors could consider using “PjBL” to refer to project-based learning throughout the manuscript, because “PBL” often refers to problem-based learning in the literature.

2. I was wondering whether PjBL is suitable for kindergarten students.

3. For the topics of effects of PjBL, I would suggest the authors to review a meta-analysis article: https://doi.org/10.1016/j.edurev.2018.11.001 , for greater understanding of its effects by cutting-edge research.

4. This research only includes 9 articles; however, it seems that the above meta-analysis included 8 elementary school PjBL studies (many are different from yours). I suggest checking those studies (to include more articles).

5. I suggest searching more databases such as EBSCOhost and ProQuest.

6. There is a gap between the 34,246 studies and the 38 studies (p. 7).

7. High resolution figures are needed; kindly provide high-resolution original figures (Figures 1–3) in the manuscript.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No



Decision Letter 2

Mingming Zhou

23 Mar 2021

Is project-based learning effective among kindergarten and elementary students? A systematic review

PONE-D-20-21356R2

Dear Dr. Vadillo,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Mingming Zhou, Ph.D.

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

Reviewer #2: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The authors improved the article in response to the reviews. I agree to the publication of the paper submitted to the journal.

Reviewer #2: The authors have made appropriate responses and/or revisions to the comments made by the reviewers. I think the revised version of this article is ready for publication.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

Acceptance letter

Mingming Zhou

25 Mar 2021

PONE-D-20-21356R2

Is project-based learning effective among kindergarten and elementary students? A systematic review

Dear Dr. Vadillo:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Mingming Zhou

Academic Editor

PLOS ONE
