Journal of Development Economics. 2023 Jun 14;164:103133. doi: 10.1016/j.jdeveco.2023.103133

The impact of the COVID-19 pandemic on children’s learning and wellbeing: Evidence from India

Andrea Guariso, Martina Björkman Nyqvist
PMCID: PMC10264163  PMID: 37342545

Abstract

We study the impact of the COVID-19 pandemic and associated school closure on primary school children’s learning and mental wellbeing in Assam, India. Using a comprehensive dataset that tracked and repeatedly surveyed approximately 5000 children across 200 schools between 2018 and 2022, we find that children lost the equivalent of nine months of learning in mathematics and eleven months in language during the pandemic. Children lacking resources and parental support experienced the largest losses. Regular practice, teacher interaction, and technology were associated with less learning loss. Over the same period, children’s psychological wellbeing improved. Our research provides valuable insights for designing post-emergency programs.

Keywords: COVID-19, School closure, Primary school, Learning loss, Psychological wellbeing, India

1. Introduction

The COVID-19 pandemic led to unprecedented disruption of school systems across the world. Between March 2020 and March 2022, virtually every government closed schools and suspended in-person teaching in an attempt to contain the spread of the COVID-19 virus (Our World in Data, 2022).1 UNICEF estimates that more than 1.6 billion children worldwide experienced education loss due to school closures, despite efforts from governments and schools to substitute in-class lessons with remote teaching practices (UNICEF, 2021a). However, we still have a limited understanding of how school closures affected students’ learning and wellbeing, how these effects varied across students, and which learning practices proved most effective in cushioning the adverse effects. Shedding light on these dimensions is of utmost importance in the post-emergency era for designing effective programs to sustain recovery and help students catch up on lost learning.

India provides a relevant case study because of the drastic policies implemented during the COVID-19 emergency, which affected hundreds of millions of students.2 Schools across the country closed for one and a half years even though only about 25% of Indian students had access to digital devices and internet connectivity at home, meaning that the vast majority of them were not equipped to join any remote digital learning initiatives (UNICEF, 2021b).

In this paper, we use a unique dataset that tracks 200 primary schools and about 5000 children in rural Assam, in northeastern India, over five years (from 2018 until 2022) to study the impact of the COVID-19 pandemic and the associated school closures on children’s learning outcomes and psychological wellbeing. By leveraging a study that started in 2018, we can address several challenges related to the estimation of learning losses in the context of a common shock, such as a global pandemic.3 Our analysis is based on standardized language and mathematics tests, which were independently administered in a consistent way across three survey rounds (two before and one after the pandemic) to measure the academic performance of the same students over time, irrespective of whether they remained in school or not.4 Our panel sample consists of 200 schools and 4998 students tracked throughout the five-year study period. In 2022 we also added 1533 new students enrolled in the lowest grades to perform a richer comparison of students from the same grade and school before vs after the pandemic.

Our first key finding is that the COVID-19 pandemic and associated school closure had a large negative impact on primary school children’s learning levels: by 2022, children had lost 0.30σ (standard deviations) in mathematics and 0.39σ in language compared to children in the same grade and school in 2019. These estimates correspond to nine and eleven months of lost learning in the two subjects, respectively. We observe similar drops when using the panel sample and studying the same students’ learning trajectories over time.

We then expand our analysis in two directions. First, we use child and household data collected prior to the pandemic to identify which children suffered the largest learning losses. Our results indicate that a child’s ability to learn during the pandemic heavily depended on their access to resources and support at home: learning losses (particularly in language) were more severe for children who were already behind academically, came from lower socio-economic backgrounds, had (younger) siblings at home, and whose parents had lower aspirations for them and underestimated their ability. Second, we study which resources and activities helped children sustain learning while schools were closed. We employ a standard value-added production function and use lagged test scores and inputs as proxies for omitted inputs and latent ability. Our results reveal that teachers’ phone calls, regular weekly practice, and the use of technology (mobile phone and internet) provided the strongest support for learning in both language and mathematics. Additionally, we find that private tuition proved to be an effective means of sustaining learning in language.

Finally, we study how the pandemic affected students’ psychological wellbeing by relying on a standardized survey tool that we validate in our setting. Our results show that, on average, psychological wellbeing improved over the pandemic. This means that learning and psychological wellbeing evolved in opposite directions, despite the strong positive correlations that we observe cross-sectionally between these two dimensions. Results are consistent across measures and sample definitions — i.e., considering the same students over time or comparing children in the same grades and schools before vs after the pandemic.

Our study contributes to the recent literature on the impact of the COVID-19 pandemic on children’s learning outcomes. Two recent reviews by Moscoviz and Evans (2022) and Patrinos et al. (2022) identify, respectively, 29 and 35 studies that estimated learning losses across different settings and report an average drop of −0.17σ. Most of the existing evidence stems from high-income countries, is based on repeated cross-sections of students, and relies on student tests performed in schools.5 In terms of setting and data quality, our study is most closely related to the recent work by Singh et al. (2022), who study primary school students of the same age and village in Tamil Nadu (India) before and after the pandemic, finding losses of 0.7σ in mathematics and 0.34σ in language. We contribute to this literature in multiple ways. In our study we track and independently survey students at multiple points in time before (two rounds) and after (one round) the pandemic. The two pre-pandemic rounds enable us to measure changes in students’ learning trajectories. Moreover, the panel dimension and the richness of our data enable us to expand the analysis in two directions: first, we identify pre-pandemic child and household characteristics (including parental aspirations and support) that are associated with the largest learning losses; second, we study which of the resources and learning practices that students used during school closure are associated with smaller learning losses. In doing so, we also contribute to the literature on the drivers of learning in low-income countries (e.g. Keane et al., 2022) by focusing on a period when schools were closed, and students developed new learning practices. Finally, we go beyond learning outcomes and study the impact of the pandemic on students’ psychological wellbeing. A rich literature, spanning multiple fields, studies how to measure wellbeing among children (see Pollard and Lee, 2003 for a review). In recent years there has been growing interest in the link between wellbeing and schooling, reflected in the inclusion of socio-emotional variables in the well-known PISA learning assessment system (OECD, 2017). Existing studies, however, mainly focus on high-income countries (e.g. Govorova et al., 2020). We contribute to this literature by validating a recently developed survey tool and investigating the relationship between learning and wellbeing in a low-income setting both in regular times (i.e. before the pandemic) and after a large shock (i.e. immediately after the pandemic).

2. Study context and design

2.1. The education system and COVID-19 emergency in Assam

The setting for our study is the state of Assam, in northeastern India (Figure A.1). Primary education is compulsory, starts at age 6, and lasts for eight grades, divided into two blocks: lower primary (grades 1 to 5) and upper primary (grades 6 to 8). Primary school children automatically progress to the next grade (Government of India, 2009). In the pre-pandemic era, primary school enrollment in Assam was nearly universal (97.4%) and on par with the Indian average (95.9%). Learning outcomes were instead well below official targets, even when compared to the rest of the country: only 40.1% of children enrolled in grade 5 could read a grade-2 text (the Indian average was 50.3%), and only 17.8% could solve divisions (the Indian average was 27.8%) (ASER, 2018).

In March 2020, the COVID-19 emergency led the Indian government to close its 1.5 million schools. Assam was no exception, and between March 2020 and March 2022, schools remained closed for 15 months, with only short reopening intervals between COVID waves.6 While schools were formally expected to provide remote support, data shows that only 39.4% of students in Assam received any learning material from their schools, with WhatsApp being the most common channel, followed by in-person visits (ASER, 2021). The ASER report also shows that families tried to cope with the school closure in multiple ways. The share of children with a smartphone at home almost doubled from 36.1% in 2018 to 71% in 2021 — although only about half of the students could access it for learning purposes. Tuition became more common during the emergency but remained a privilege that less than a third of students (29.1%) could enjoy. Overall, the primary source of support during school closure came from within the household, as 70.5% of students in Assam received help from family members. Moreover, traditional learning activities remained the most prevalent form of learning at home (62.6%), while only 17.6% of the students reported using online resources, and a mere 7.2% reported using broadcasted activities (ASER, 2021).

2.2. Data collection

The sample for this study is based on a project that started in 2018 to study the impact of an educational program implemented by the NGO Pratham (Björkman Nyqvist and Guariso, 2022). The first data collection took place in mid-2018 and covered a sample of 5726 children enrolled in grades 1 to 4 across 200 primary public schools.7 We individually tested each child in mathematics and language and surveyed them on their study habits. We also surveyed a representative sample of mothers (or primary caregivers whenever the mother was not available), covering 80% of the sample, asking questions on children’s learning habits and household characteristics. We refer to this data collection round as the 2018 sample. A second data collection round took place between October 2019 and January 2020 with the same sample of students and mothers.8 This survey mirrored the first one in content and structure, except for the addition of a psychological wellbeing module to measure students’ personal and school-related wellbeing (more details below). We refer to this data collection round as the 2019 sample.

Two months after completing the 2019 data collection, the COVID-19 pandemic became a global threat, and schools closed. Between February and March 2021, when the COVID-19 emergency was still ongoing, we conducted a short phone-based data collection with school principals and mothers to learn about ongoing teaching and learning practices.9 We refer to this phone survey as the 2021 sample. Finally, as soon as field activities could resume, between January and March 2022, we conducted a third in-person data collection round, tracking and surveying all students again.10 For this last survey round, we also added a new set of students enrolled in grades 2 and 3 in 2022. We refer to this final data collection round as the 2022 sample.

All three in-person survey rounds (2018, 2019, and 2022) followed the same protocol, surveying and testing each child individually, either in school or at home, using trained enumerators who spoke the local language. The learning test included two parts, each with a mathematics and language component. The first part mirrored the standard ASER test conducted yearly by the ASER Center across India for children aged 5 to 16.11 The second part was based on extensively piloted questions used in other studies in India (Muralidharan et al., 2019).12 A core set of questions remained the same across all rounds, while a subset was changed to avoid repetition. In the analysis, we follow Jacob and Rothstein (2016) and aggregate all mathematics and language questions into two indexes, using a combination of two-parameter logistic (2PL) and three-parameter logistic (3PL) item response theory (IRT) models on the pooled sample to account for the presence of both open and multiple-choice questions. This procedure allows us to use the complete set of questions, using the overlapping questions for common normalization.
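To fix ideas, the following is a minimal sketch (not the authors’ code) of how pooled binary test responses can be scored with a 2PL IRT model, estimated here by a simple alternating joint maximum likelihood routine. The simulated data, variable names, and the estimation approach are our own illustrative assumptions; the paper’s actual procedure additionally uses a 3PL specification (with a guessing parameter) for multiple-choice items.

```python
# Toy 2PL IRT scoring sketch on simulated data (illustrative only, not the authors' code).
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(0)
n_students, n_items = 300, 15
true_theta = rng.normal(size=n_students)                 # latent ability
true_a = rng.uniform(0.8, 2.0, n_items)                  # item discrimination
true_b = rng.normal(size=n_items)                        # item difficulty
resp = (rng.random((n_students, n_items))
        < expit(true_a * (true_theta[:, None] - true_b))).astype(float)

def bernoulli_nll(p, y):
    p = np.clip(p, 1e-6, 1 - 1e-6)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

theta = resp.mean(axis=1) - resp.mean()                  # crude starting abilities
a, b = np.ones(n_items), np.zeros(n_items)

for _ in range(10):                                      # alternate item and person updates
    for j in range(n_items):
        res = minimize(lambda prm: bernoulli_nll(expit(prm[0] * (theta - prm[1])), resp[:, j]),
                       x0=[a[j], b[j]], method="L-BFGS-B",
                       bounds=[(0.2, 4.0), (-4.0, 4.0)])
        a[j], b[j] = res.x
    for i in range(n_students):
        res = minimize(lambda t: bernoulli_nll(expit(a * (t[0] - b)), resp[i]),
                       x0=[theta[i]], method="L-BFGS-B", bounds=[(-4.0, 4.0)])
        theta[i] = res.x[0]

# Standardize the latent score (the paper normalizes against a reference round).
irt_score = (theta - theta.mean()) / theta.std()
```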

2.3. Sample

Our panel sample originates from the 5726 children enrolled in grades 1 to 4 at the time of the first survey in 2018.13 We successfully tracked back and surveyed 5328 (93%) of them in 2019 and 4998 (87%) in 2022, when they reached grades 4 to 7.14

In 2022, we added 1533 new children enrolled in grades 2 and 3. Our repeated cross-sectional sample consists of cohorts enrolled in the same grade and school at different points in time. For this analysis, we will typically restrict the sample to children in grades 2, 3, and 4, as those are the grades covered across all three survey rounds.15

Out of the representative sample of 4592 mothers surveyed in 2018, we successfully tracked back and surveyed 4303 (94%) of them in 2019, while in the 2021 phone-based survey we only reached 1878 (41%) of them.

Table A.1 summarizes information from the different data collection rounds, while Table A.2 reports key summary statistics on children and mothers included in the sample. Finally, Table A.3 compares the characteristics of children tracked over time vs. lost at follow-up.

3. Results

3.1. Learning loss

Fig. 1 illustrates the evolution of learning levels in mathematics and language over the study period. Panel A considers the full sample of students and shows the learning profiles of test scores with respect to age (in completed years) at the time of testing, separately for the three different survey rounds (2018, 2019, and 2022). Learning levels are expressed in terms of the scores resulting from the IRT model that combines all answers. While the 2018 and 2019 lines show significant overlaps, the 2022 line is much lower, indicating that in 2022 children were performing well below pre-pandemic levels. More specifically, the lines indicate that, on average, children’s learning levels in mathematics and language in 2022 were comparable to the level achieved by children one year younger prior to the pandemic. Panel B provides an alternative representation that exploits the panel dimension of the data. Here we restrict the focus to tracked children and illustrate the evolution of their learning during the 17 months between the 2018 and 2019 data collection rounds (red line) and during the following 27 months between the 2019 and 2022 data collection rounds (gray line). On the horizontal axis, we report the percentile of their learning level at time t (either 2018 or 2019), and on the vertical axis, we report the average monthly progress in learning between time t and time t+1 (either 2019 or 2022). The panels show that, on average, students’ mathematics (language) learning during the pandemic progressed at a monthly rate corresponding to only 46% (36%) of the average monthly rate estimated in the pre-pandemic period. Overall, Fig. 1 shows that during the pandemic, children experienced large learning losses – equivalent to almost one year of learning – compared to the level they should have reached in normal circumstances, and this is the case at every point of both the age and the test score distributions.

Fig. 1. Learning levels over time.

Notes: Learning levels are expressed in terms of the score resulting from the item response theory (IRT) model that combines all test answers. Fig. 1(a) presents the distribution of learning levels with respect to age (in completed years) at the time of test-taking, across the three survey rounds (we exclude ages with few observations). Fig. 1(b) only considers the panel sample of children that were tracked from 2018 until 2022 and shows the evolution of their learning levels in-between survey rounds. On the horizontal axis, we report the percentile of their learning level at time t (either 2018 or 2019), and on the vertical axis, we report the average monthly progress in learning between time t and time t+1 (either 2019 or 2022).

To precisely quantify these losses, we use the repeated cross-sectional dataset and compare the learning levels of students enrolled in the same school and grade before vs after the pandemic. This comparison is possible for children enrolled in grades 2–4, as these grades were covered in all survey rounds. We standardize our learning outcome measures with respect to 2019, i.e. the last pre-pandemic survey round.

Table 1 reports the estimates based on the following empirical model:

$y_{i,s,t} = \beta_1 \, 2019_t + \beta_2 \, 2022_t + \Lambda X_{i,s,t} + \rho_g + \theta_s + \mu_{i,s,t}$   (1)

where $y_{i,s,t}$ is the learning outcome for child $i$, enrolled in school $s$, at time $t$, with $t \in \{2018, 2019, 2022\}$; $2019_t$ and $2022_t$ are indicators for the 2019 and 2022 data collection rounds, respectively; $X_{i,s,t}$ is a vector of individual controls that include gender and age, and $\rho_g$ and $\theta_s$ are grade and school fixed effects, respectively.16 Standard errors are clustered at the school level. The two coefficients $\beta_1$ and $\beta_2$ tell us, respectively, the average difference in the outcome between the 2018 and 2019 data collection rounds (17 months) and between the 2018 and 2022 data collection rounds (44 months), conditional on the other variables included in the model. By comparing the two coefficients, we learn the difference between the 2019 and 2022 rounds (27 months).
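As an illustration, a regression of this form can be estimated as follows. This is a sketch under assumed column names (score, year, girl, age, grade, school_id) and a hypothetical file name, not the authors’ actual code.

```python
# Sketch of Eq. (1): year indicators, grade and school fixed effects,
# standard errors clustered at the school level.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("learning_panel.csv")           # hypothetical pooled 2018/2019/2022 file
df = df[df["grade"].between(2, 4)].copy()        # grades covered in all three rounds
df["y2019"] = (df["year"] == 2019).astype(int)   # indicators, 2018 as the omitted round
df["y2022"] = (df["year"] == 2022).astype(int)

res = smf.ols("score ~ y2019 + y2022 + girl + age + C(grade) + C(school_id)",
              data=df).fit(cov_type="cluster",
                           cov_kwds={"groups": df["school_id"]})

print(res.params[["y2019", "y2022"]])            # beta_1 and beta_2 in Eq. (1)
print(res.t_test("y2022 - y2019 = 0"))           # the 2022 vs 2019 learning gap
```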

Table 1.

The impact of COVID-19 on learning outcomes.

                              Mathematics                    Language
                           (1)      (2)      (3)      (4)      (5)      (6)
2019                      0.106    0.101    0.086    0.106    0.094    0.071
                         [0.020]  [0.037]  [0.026]  [0.020]  [0.037]  [0.026]
2022                     −0.198   −0.254   −0.187   −0.288   −0.275   −0.276
                         [0.026]  [0.039]  [0.032]  [0.028]  [0.040]  [0.035]
2019 × Grade 3                    −0.014                     0.032
                                  [0.050]                   [0.052]
2019 × Grade 4                     0.028                     0.003
                                  [0.045]                   [0.050]
2022 × Grade 3                     0.028                    −0.046
                                  [0.054]                   [0.058]
2022 × Grade 4                     0.119                     0.006
                                  [0.050]                   [0.052]
2019 × Girl                                 0.039                      0.069
                                           [0.031]                    [0.033]
2022 × Girl                                −0.022                     −0.022
                                           [0.040]                    [0.041]

Schools FE                  Yes      Yes      Yes      Yes      Yes      Yes
Grade FE                    Yes      Yes      Yes      Yes      Yes      Yes
Diff 2022 vs 2019         −0.30    −0.36    −0.27    −0.39    −0.37    −0.35
p-val(Diff 2022 vs 2019)   0.00     0.00     0.00     0.00     0.00     0.00
Grades                      2–4      2–4      2–4      2–4      2–4      2–4
Observations              11,293   11,293   11,293   11,293   11,293   11,293

Notes: The sample is restricted to children enrolled in grades 2 to 4 in the three in-person data collection rounds (2018, 2019, or 2022). The dependent variable is the test score in mathematics (columns 1–3) or language (columns 4–6), obtained by combining all test questions through the item response theory (IRT) model on the pooled sample. Test scores are normalized using the mean and standard deviation for students in grades 2–4 in 2019. The p-values at the bottom of the table refer to the test of the null hypothesis of equal change in test scores in 2019 and 2022. All regressions control for gender and age. Standard errors clustered at the school level are reported in square brackets below the coefficients. There are 200 schools in the sample. * p<0.1, ** p<0.05, *** p<0.01.

Results in columns 1 and 4 show that before the pandemic, between the 2018 and 2019 data collection rounds, students of the same school and grade improved in mathematics and language by 0.11σ. This progress reflects the fact that in 2018 we tested students towards the middle of the school year, while in 2019 we tested them at the end of it. In 2022, we again surveyed and tested students towards the end of the school year, and we estimate a 0.20σ drop in mathematics and a 0.29σ drop in language compared to 2018. When we compare the 2019 and 2022 estimates, which are based on data collected at similar points of the academic year, we obtain a learning deficit of 0.30σ in mathematics and 0.39σ in language (we report the difference at the bottom of the table). To put these numbers in perspective, in 2019 the average difference in test scores across grades was 0.38σ in mathematics and 0.43σ in language. This means that the estimated learning losses correspond to nine months of lost education in mathematics and eleven months of lost education in language (consistent with what we observed in Fig. 1). The learning deficit in mathematics (but not in language) is slightly smaller for higher grades (columns 2 and 5), while we find no differential effects across gender (columns 3 and 6).
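The conversion from standard deviations to months of learning can be reconstructed with simple arithmetic, under the assumption (ours, for illustration) that one grade-to-grade gap corresponds to roughly twelve months of schooling:

```python
# Back-of-the-envelope reconstruction of the sigma-to-months conversion
# (assumption: one cross-grade gap in 2019 scores ~ 12 months of schooling).
loss = {"mathematics": 0.30, "language": 0.39}        # 2022 vs 2019 deficits (sigma)
grade_gap = {"mathematics": 0.38, "language": 0.43}   # avg. cross-grade gap in 2019 (sigma)

for subject in loss:
    months = 12 * loss[subject] / grade_gap[subject]
    print(f"{subject}: ~{months:.0f} months of lost learning")
# mathematics: ~9 months; language: ~11 months
```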

A possible reason for these sizeable average learning losses is that children might have abandoned schooling during the pandemic and never returned. However, in line with the findings from ASER (2021), we do not find evidence of a spike in dropouts over the pandemic: only 1.1% of our original sample dropped out of school by 2022.17 Even school attendance, which we recorded during unannounced survey days, remained relatively stable: from 68% in 2018 and 75% in 2019 to 65% in 2022.

Table 2.

The heterogeneous impact of COVID-19 on learning outcomes.

Mathematics: columns (1)–(6); Language: columns (7)–(12)
(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12)
Rows with twelve entries list all columns in order; each interaction row reports only its two relevant columns, with the mathematics coefficient first and the language coefficient second (standard errors in brackets below).
2022 −0.180 −0.245 −0.238 −0.130 −0.208 −0.381 −0.417 −0.406 −0.345 −0.166 −0.297 −0.290
[0.026] [0.043] [0.034] [0.043] [0.030] [0.046] [0.030] [0.042] [0.034] [0.043] [0.030] [0.043]
2022 × Knowledge > median −0.028 0.199
[0.033] [0.038]
2022 × Wealth > median 0.074 0.224
[0.058] [0.059]
2022 × Highest education in HH > Primary 0.237 0.367
[0.077] [0.070]
2022 × Has older sibling 0.031 −0.091
[0.057] [0.053]
2022 × Has younger sibling −0.167 −0.180
[0.049] [0.056]
2022 × Parental aspirations (PCA) 0.006 0.054
[0.019] [0.021]
2022 × Overestimate ability 0.462 0.406
[0.059] [0.064]
2022 × Underestimate ability −0.244 −0.054
[0.120] [0.072]

Schools FE
Grade FE
Grades 4–5 4–5 4–5 4–5 4–5 4–5 4–5 4–5 4–5 4–5 4–5 4–5
Observations 5,347 4,134 4,135 5,249 4,173 4,131 5,347 4,134 4,135 5,249 4,173 4,131
p-val(2022+2022*(...)) 0.00 0.00 0.98 0.04 0.00 0.04 0.00 0.00 0.73 0.00 0.00 0.02
p-val(2022+2022*younger sibling/underestimate) 0.00 0.00 0.00 0.00

Notes: The sample is restricted to children enrolled in grades 4 and 5 in 2019 or 2022. All children included in this sample were surveyed in the 2019 data collection round (children enrolled in grades 4 and 5 by 2022 were enrolled in grades 2 and 3 in 2019), and all variables considered for the interaction were collected before the pandemic. The dependent variable is the test score in mathematics (columns 1–6) or language (columns 7–12), obtained by combining all test questions through the item response theory (IRT) model on the pooled sample. Test scores are normalized using the mean and standard deviation for students in grades 4–5 in 2019. Knowledge refers to the learning level in mathematics or language in 2019, and the indicator used in the second row takes value one if the student had a learning level above the median for his/her grade. Wealth is generated through principal component analysis (PCA) combining 21 asset and ownership variables. Parental aspiration is generated through principal component analysis (PCA) combining 3 questions: “What is the highest education you would ideally like your child to complete?”, “What is the highest education you think your child will actually complete?”, “How likely is it on a scale of 1-10 that your child will achieve your aspiration?”. The Overestimate and Underestimate ability variables are obtained by comparing the actual learning level of the student in the ASER test in 2019 and the level predicted by the caregiver for the same test. All regressions control for gender and age. The p-values at the bottom of the table refer to the test of the null hypothesis of no difference in the outcome in 2022 for the group identified by the interaction. Standard errors clustered at the school level are reported in square brackets below the coefficients. There are 200 schools in the sample. * p<0.1, ** p<0.05, *** p<0.01.

3.2. Heterogeneity in learning loss

We use data collected before the pandemic to understand who suffered the largest learning losses during the pandemic period while schools were closed. For this exercise, we focus on the cross-sectional sample and restrict the comparison to students enrolled in grades 4 and 5 in the 2019 and 2022 samples, as these are the comparable groups for which we have pre-pandemic information. We estimate the following empirical model:

$y_{i,s,t} = \alpha_1 \, 2022_t + \alpha_2 \, 2022_t \times C_{i,s,2019} + \Gamma C_{i,s,2019} + \lambda_g + \kappa_s + \mu_{i,s,t}$   (2)

where we interact the 2022 indicator with a range of child and household characteristics $C_{i,s,2019}$ collected before the pandemic; $\lambda_g$ and $\kappa_s$ are grade and school fixed effects.
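As an illustration, one column of this specification could be estimated as follows; the household-education dummy and other column names are hypothetical placeholders, not the authors’ code.

```python
# Sketch of one column of Eq. (2): the 2022 indicator interacted with a
# pre-pandemic characteristic (hypothetical high_hh_educ dummy), with grade and
# school fixed effects and school-clustered standard errors.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("grades45_crosssection.csv")         # hypothetical 2019/2022 file
df["y2022"] = (df["year"] == 2022).astype(int)
df["y2022_x_highed"] = df["y2022"] * df["high_hh_educ"]

res = smf.ols(
    "score ~ y2022 + high_hh_educ + y2022_x_highed + girl + age "
    "+ C(grade) + C(school_id)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})

print(res.params[["y2022", "y2022_x_highed"]])        # alpha_1 and alpha_2
# Net 2022 effect for the high-education group (the tests at the bottom of Table 2)
print(res.t_test("y2022 + y2022_x_highed = 0"))
```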

Results reported in Table 2 show that learning losses were particularly pronounced among children who were low-performing academically, came from poorer households with lower levels of education, had siblings (especially younger ones), and whose mothers had lower aspirations for their future18 and underestimated their ability.19 The coefficients are large and precisely estimated for language and generally consistent for mathematics, although in this case only the household’s education, the presence of (younger) siblings, and the mother’s knowledge of the child’s ability are statistically significant at conventional levels. At the bottom of the table, we report the test for the null hypothesis of no difference in learning between 2019 and 2022 for the group identified by the interaction. Children from households where at least one member achieved secondary education and whose mothers overestimated their ability suffered no discernible loss in learning over the pandemic period, neither in mathematics nor in language.

These findings indicate that during the long spell of school closure, children’s ability to sustain learning heavily depended on the resources and support available at home. In particular, they highlight the role of parental attitudes and perceptions: where parents displayed confidence in their child’s ability, either directly through higher aspirations for their future or indirectly by overestimating their skills, children better sustained their learning through the pandemic period. Notably, with the exception of parental over-estimation of a child’s ability, none of these dimensions played any systematic role in the evolution of children’s learning between 2018 and 2019, before the pandemic (Table A.5): their relevance emerged at a time when schools were closed, and family became the primary source of support for teaching and learning.

3.3. The impact of coping strategies

The early data collection rounds (2018 and 2019) included questions on children’s study and learning practices. In 2021 and 2022, we enriched the surveys to capture learning investments and practices students engaged in while schools were closed. We use this data to understand which investments and activities worked best in sustaining children’s learning during the emergency.20 We estimate a value-added production function, where omitted inputs and latent ability are proxied by previous test scores, collected at two points in time, and by earlier learning investments (e.g. Fiorini and Keane, 2014, Keane et al., 2022, Andrabi et al., 2022). More specifically, we estimate the following empirical model:

$y_{i,s,2022} = \gamma_1 y_{i,s,2019} + \gamma_2 y_{i,s,2018} + \gamma_3 LP_{i,s,t} + \Pi L_{i,s,t} + \lambda_g + \kappa_s + \mu_{i,s,2022}$   (3)

where $y$ indicates our usual learning outcome measures, $LP_{i,s,t}$ indicates a set of investments or learning practices children could engage in while schools were closed (e.g. taking tuition or using a smartphone to study) that we recorded in the 2021 or 2022 survey rounds, and $L_{i,s,t}$ includes gender and age, as well as a set of controls from pre-pandemic surveys: household wealth, household education, and previous study practices (whether the student was taking tuition, whether the student studied with friends after school, whether the student participated in study groups).21 Our focus is on the coefficient $\gamma_3$, which provides the estimated average test score gain (or loss) for students who engaged in learning practice $LP$ during school closure, after accounting for observable factors. The estimate is unbiased only if the controls are rich enough to account for the sorting of children into that specific learning practice. While this is a strong assumption, we believe the controls at our disposal are richer than in most of the previous literature and rich enough to account for the most plausible sources of sorting (i.e. past achievements, family background, and previous learning habits).
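In practice, the specification can be sketched as a loop over learning-practice indicators, collecting the $\gamma_3$ estimates reported in Table 3; all variable and file names below are hypothetical placeholders, not the authors’ code.

```python
# Sketch of the value-added regressions in Eq. (3): each learning-practice
# indicator is entered one at a time, controlling for lagged 2018/2019 test
# scores, demographics, pre-pandemic habits, and grade and school fixed effects.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("panel_2018_2019_2022.csv")    # hypothetical wide panel file
practices = ["teacher_phone_calls", "weekly_practice", "phone_to_study",
             "internet_to_study", "tuition_2021"]

controls = ("score_2019 + score_2018 + girl + age + wealth_index + hh_educ "
            "+ tuition_2019 + study_group_2019 + friends_study_2019 "
            "+ C(grade) + C(school_id)")

gamma3 = {}
for lp in practices:
    res = smf.ols(f"score_2022 ~ {lp} + {controls}", data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["school_id"]})
    gamma3[lp] = (res.params[lp], res.bse[lp])   # gamma_3 and its clustered SE

print(pd.DataFrame(gamma3, index=["gamma_3", "se"]).T)
```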

Table 3 reports the list of investments and learning practices we captured in our data, indicating their prevalence across our sample (column 1). Columns 3 and 4 show the estimated coefficient $\gamma_3$ for mathematics and language outcomes, respectively. Results are generally consistent across the two subjects and show that regular interactions with teachers through mobile phones, regular weekly practice, and the use of technology (phone and internet) for studying were associated with higher learning during school closure. Private tuition was also associated with higher learning, especially in language. We do not find evidence, instead, that the mere availability of learning material, the fact that the school got in touch with the family, or the support from siblings and other family members were associated with higher learning. Column 2 shows that the learning practices and investments associated with the largest gains were – except for private tuition – significantly more common among children in more educated households, which helps explain why we did not observe any drop in learning for these children.

Table 3.

The benefit of learning practices and investments during school closure.

(1) Mean    (2) Difference, high vs low HH educ    (3) Value added: Mathematics    (4) Value added: Language
Panel A: In-person Child Survey
In touch with teachers (any mean) 0.38 0.11*** 0.061** 0.033
(0.02) (0.023) (0.021)
______phone calls 0.23 0.11*** 0.096*** 0.054**
(0.02) (0.026) (0.023)
______text messages 0.05 0.04*** −0.032 0.017
(0.01) (0.044) (0.040)
______in person visits 0.19 0.02 0.056* 0.001
(0.02) (0.029) (0.024)
Learning activity every week 0.19 0.05*** 0.082** 0.046*
(0.02) (0.029) (0.024)
Mobile phone to study 0.27 0.19*** 0.138*** 0.120***
(0.02) (0.026) (0.021)
Internet to study 0.25 0.18*** 0.122*** 0.106***
(0.02) (0.028) (0.023)
N. of schools 200 200
Observations 3,856 3,856
Panel B: Phone Mothers Survey
Teaching/learning material available (any) 0.57 0.02 0.036 0.001
(0.03) (0.033) (0.030)
______Whatsapp 0.08 0.08*** 0.112 0.008
(0.02) (0.071) (0.047)
______School text, work books 0.36 −0.04 0.033 −0.007
(0.03) (0.039) (0.031)
______Educational programs on TV/Radio 0.02 0.03*** −0.050 −0.009
(0.01) (0.097) (0.085)
Tuitions 0.28 −0.05* 0.052 0.102***
(0.03) (0.039) (0.030)
School in touch at least every other week 0.21 0.11*** 0.019 −0.044
(0.03) (0.050) (0.036)
Study support from parents 0.57 0.12*** 0.037 0.023
(0.03) (0.037) (0.030)
Study support from siblings/other family 0.31 0.01 0.011 0.015
(0.03) (0.036) (0.029)
N. of schools 184 184
Observations 1,823 1,823

Notes: The sample is restricted to the panel sample of children who were tracked from 2018 until 2022. Panel A considers variables taken from the 2022 in-person child survey. Panel B considers variables taken from the 2021 phone survey administered to mothers. The table reports the overall mean (column 1), as well as the difference in means (and its standard error) between children who live in a household where the highest attained education level is secondary school or higher and other children (column 2). Columns 3 and 4 present the value added of each item on test scores in Mathematics (column 3) and Language (column 4), estimated using regression (3) from the main text. The regression controls for the test score in 2018, the test score in 2019, gender, age, grade fixed effects, school fixed effects, a wealth index (obtained by combining 21 variables from the 2018 survey), the highest education level in the HH, whether the student was taking tuition in 2019, whether the student participated in study groups after school in 2019, and whether the student ever studied with friends after school in 2019. Standard errors clustered at the school level are reported in brackets below the coefficients. There are 200 schools in the full sample. * p<0.1, ** p<0.05, *** p<0.01.

3.4. Psychological wellbeing

In 2019 and 2022, we administered to all students a psychological wellbeing module based on the Child and Adolescent Social and Personal Assessment of Wellbeing (CAPSAW). The CAPSAW is a recently developed tool designed for children 4 to 18 years old, which has already been tested and validated across different contexts (Symonds et al., 2022). The original tool comprises four separate domains, each covered by eight questions, which are then combined into an index through principal component analysis. We included in the survey the two domains relevant to our study: personal and school-related wellbeing. We perform several checks to validate the measures in our setting. First, we estimate Cronbach’s alpha (Cronbach, 1951), the most common index of the internal consistency of a test, and find it to be well above the usual 0.7 threshold (e.g. Laajaj and Macours, 2019). Second, we show that the measures strongly correlate with alternative variables that we would typically expect to be associated with school-related satisfaction and wellbeing. Finally, we show that across the two survey rounds, the measures maintained consistent correlations with a set of pre-determined covariates, such as age and gender, suggesting no systematic changes in how students answered the questions. Appendix C contains a more detailed description of the tool, the survey items, and the validation checks.
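Both measurement steps are standard and can be sketched as follows; the item column names and file name are hypothetical placeholders, not the actual CAPSAW item labels.

```python
# Sketch of the two wellbeing-index steps: Cronbach's alpha for internal
# consistency and a first-principal-component index over the eight items of a
# CAPSAW domain, standardized on the 2019 round.
import pandas as pd
from sklearn.decomposition import PCA

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

df = pd.read_csv("wellbeing_2019_2022.csv")                 # hypothetical file
items = df[[f"personal_wb_q{i}" for i in range(1, 9)]]      # eight domain items

print("Cronbach's alpha:", cronbach_alpha(items))           # expect > 0.7

# First principal component as the domain index; note the sign of the component
# is arbitrary and may need flipping so that higher values mean higher wellbeing.
pc1 = PCA(n_components=1).fit_transform(items.fillna(items.mean()))
ref = pc1[df["year"].to_numpy() == 2019]                    # 2019 reference round
df["personal_wellbeing_index"] = (pc1[:, 0] - ref.mean()) / ref.std()
```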

We consider both the panel sample, which allows us to control for all individual time-invariant characteristics, and the cross-sectional sample, which allows us to compare children enrolled in the same school and grade before vs after the pandemic.22 In the latter case, we restrict the comparison to children in grades 2 to 5, as these are the grades covered both in 2019 and 2022. To ease the interpretation of our results, we standardize the wellbeing measures using the 2019 average and standard deviation. Results are reported in Table 4 and are consistent across measures and samples: children’s psychological wellbeing significantly improved in the post-pandemic period compared to the pre-pandemic period. This result is in stark contrast with the large drops in learning we documented above and means that children’s learning and psychological wellbeing moved in opposite directions over the pandemic period. Interestingly, this is also in contrast with the strong positive correlation that we observe between these two dimensions when we look at the pre-pandemic survey round, even after controlling for a range of potential mediating factors (Table A.6). To put these numbers in perspective, the average improvement in wellbeing between 2019 and 2022 reported in column 3 corresponds to the improvement associated with moving from the 5th to the 92nd percentile of learning scores in mathematics within the 2019 sample.

Table 4.

The impact of COVID-19 on psychological wellbeing.

                      (1)            (2)              (3)              (4)
                   Personal     School-related     Personal       School-related
                   wellbeing      wellbeing        wellbeing        wellbeing
2022                 0.605          0.426            0.445            0.393
                    [0.089]        [0.081]          [0.026]          [0.026]

Individual FE         Yes            Yes
Schools FE                                            Yes              Yes
Grade FE                                              Yes              Yes
Data                 Panel          Panel        Cross-section    Cross-section
Grade                 2–7            2–7              2–5              2–5
Observations         9,834          9,834            9,749            9,749

Notes: In columns 1 and 2 the sample is restricted to children who were surveyed in both the 2019 and 2022 data collection rounds, i.e. children enrolled in grades 2 to 5 by 2019, who therefore moved to grades 4 to 7 by 2022 (panel sample). In columns 3 and 4 the sample is restricted to children enrolled in grades 2 to 5 in 2019 or 2022 (repeated cross-section). The dependent variables are the personal and school-related wellbeing indexes, each obtained by combining eight questions from the Child and Adolescent Social and Personal Assessment of Wellbeing (CAPSAW) through principal component analysis (PCA). More details on these measures and their validation are reported in Appendix C. The variables are normalized using the mean and standard deviation across the sample in 2019. All regressions control for gender and age. Standard errors clustered at the school level are reported in square brackets below the coefficients. There are 200 schools in the sample. * p<0.1, ** p<0.05, *** p<0.01.

Our results indicate that, as children spent more time at home, their psychological wellbeing improved over the pandemic period. Such improvement was equally spread across gender, wealth, and any other dimension we checked within our data (results available from the authors).

4. Conclusion

This paper provides novel evidence of the consequences of the COVID-19 pandemic on primary school children’s learning levels and mental wellbeing.

Our results show that the pandemic had a large negative impact on children’s learning. Over a 27-month period, students experienced a loss equivalent to nine and eleven months of learning in mathematics and language, respectively. The school closures shifted more educational responsibilities onto families, and our results indicate that children from homes with relatively fewer resources and support fell behind the most. Additionally, our results highlight the role played by parental aspirations and confidence in their child’s ability, which are dimensions that have received little attention in previous literature, but became particularly crucial during a time when children spent more time at home.

Our results also unveil the regressive learning impact of the pandemic, which exacerbated the learning gap associated with different socio-economic conditions. We find that this widening gap can be partly ascribed to the different investments and coping strategies adopted by families: children in higher-educated households were more likely to keep in touch with their teachers, to practice regularly, to use technology for learning, and to receive parental support, which we show were among the activities associated with smaller learning losses.

We also find that children’s psychological wellbeing proved remarkably resilient and, in fact, improved during the pandemic. While acknowledging the challenge of measuring mental wellbeing, especially among young children, the fact that we relied on an existing tool that we validated in our context and that our results are consistent across different samples and specifications brings credibility to our findings. While the literature has so far highlighted the negative consequences of the pandemic on psychological wellbeing, most of the evidence comes from high-income settings and focuses on adults and adolescents (e.g. Salari et al., 2020; Cenat et al., 2021; Thorisdottir et al., 2023). There is still limited evidence on the evolution of children’s mental wellbeing during the pandemic, especially from low-income settings. Although some studies suggest overall worsening mental health, there seems to be significant variation across groups and locations (see Samji et al., 2022 for a review). Our findings are broadly consistent with the existing evidence from Pakistan (Baranov et al., 2022) and the UK (Department of Education, 2020, 2021), documenting no overall worsening in children’s psychological wellbeing in 2020, and with the documented drop in teen suicides during school closure in the US (Hansen et al., 2022).

Our paper provides insights that are relevant to the design of educational policies in the post-emergency era. The dramatic learning losses that we estimated call for a substantial revision of school curricula, whose priority should be to ensure that children at every level can build back their foundational skills. It will be crucial to account for the vast heterogeneity in the impact of the pandemic and ensure that children with fewer resources and support at home are not left behind. The good news is that sustained school enrollment and mental wellbeing make it possible for schools and teachers to reach students and help them get back on track with their learning. Regarding longer-term implications, our results highlight the crucial role that technology and families play in supporting children’s learning. Governments should boost their efforts to reduce the technological divide (within our sample, only 27% of students had access to a mobile phone to study, and 25% had access to the internet)23 and sensitize families to the added value they can provide to their children’s education: where mothers’ support and confidence in their child were relatively higher, children performed better.

Our analysis suffers from a few limitations. First, we focus on primary education, which is compulsory in India. Further research is needed to understand the impact of the pandemic on secondary and higher education. Second, our post-pandemic round was collected soon after schools reopened, right after the peak of the emergency. We therefore cannot say anything about the trajectory of the recovery. Future data collection efforts are essential for understanding the longer-run consequences of the pandemic and for studying recovery dynamics, along the lines of Singh et al. (2022).

CRediT authorship contribution statement

Andrea Guariso: Conceptualization, Methodology, Formal analysis, Writing – original draft, Visualization, Project administration, Funding acquisition. Martina Björkman Nyqvist: Conceptualization, Methodology, Formal analysis, Writing – review & editing, Project administration, Funding acquisition.

Footnotes

We gratefully acknowledge editor Tom Vogl and two anonymous referees for valuable comments and suggestions. We also thank Abhijeet Singh and seminar participants at the J-PAL CaTCH Initiative COVID-19 Event and at Stockholm School of Economics seminars for comments and suggestions, as well as Paola Giannattasio and Fadhil Nadhif Muharam for invaluable research assistance. We thank J-PAL South Asia and its staff, specifically Bhavani Kumara Masillamani and Sathia Chakrapani, for their support and management of the data collection and fieldwork. Finally, we thank Rukmini Banerji and Saveri Kulshreshth at Pratham India for insightful discussions about the education system and the impact of COVID-19 in Assam. All mistakes are our own. Financial support from Carl Bennet AB, J-PAL South Asia at IFMR’s Cash Transfers for Child Health, J-PAL Post-Primary Education Initiative (PPE-1843), Swedish Research Council (2016-05615), and Mistra (the Swedish Foundation for Strategic Environmental Research) is greatly appreciated. The original study received ethical approval from IFMR Human Subjects Committee (IRB00007107) and Trinity College Dublin Ethics Review Board (05062018).

1

According to Our World in Data, schools at all levels closed in 179 (97%) of the 185 countries included in the database. The remaining 6 countries either required school closure only at some levels or recommended school closure without clear enforcement.

2

India currently hosts 360 million people under the age of 14, which corresponds to more than 18% of the entire world population of that age bracket (World Bank, 2022).

3

Given the global nature of the shock, learning losses typically need to be estimated through before vs after comparisons. For such comparisons to be reliable, one needs comparable tests, administered and assessed in the same way, and covering a comparable set of students. This makes in-school surveys problematic if, for instance, the pandemic pushed children out of school, or teachers became more (or less) generous with marks once schools reopened.

4

In this paper, for simplicity, we refer to 2022 as the period after the pandemic, as it corresponds to the time when India relaxed its emergency policies.

5

There are some relevant exceptions that focus on middle- and low-income settings. Alasino et al. (2023) and Lichand et al. (2022) use administrative data to estimate learning losses in Mexico and Brazil, respectively, while Ardington et al. (2021) exploit longitudinal data from three different studies in South Africa.

6

Primary schools in Assam closed down three times: March to December 2020; May to October 2021; January to February 2022. Figure A.2 illustrates these closure windows, together with the evolution of COVID cases in the state.

7

The target villages were randomly selected from a larger list of schools in Nagaon district that the NGO identified as eligible for the expansion of its activities, based on accessibility, size, and potential for community mobilization. Appendix B provides more details about the sample and compares study schools and households to the rest of Assam.

8

Attrition between the 2018 and 2019 survey rounds is 7% for both students and mothers. Tracked children were on average slightly younger, more likely to be girls, and had better test scores at baseline (Table A.3).

9

Despite our best efforts, the phone-based data collection only covered 41% of the original caregivers’ sample. We were more likely to reach relatively wealthier households with younger children, while we do not observe selection in terms of test scores or psychological wellbeing (Table A.3).

10

Attrition between the 2018 and 2022 survey rounds is 12.7%. Also in this case, tracked children were typically younger, more likely to be girls, and had better test scores at baseline. However, these differences were mostly driven by attrition between the 2018 and 2019 survey rounds: when considering children who dropped out of the study between 2019 and 2022, there are no differences in test scores (Table A.3).

11

See www.asercentre.org/ for more details.

12

See Björkman Nyqvist and Guariso (2022) for more details.

13

The original study (Björkman Nyqvist and Guariso, 2022) is based on a randomized controlled trial with four different study arms, each one including 50 schools. In the analysis here we consider the full sample of 200 schools, always controlling for treatment status (through school fixed effects). All our results are confirmed, although in some cases less precisely estimated, when we restrict the focus to the 50 “control” schools (see Appendix B for details).

14

Up until 2020, the school year in Assam followed the solar year and ran from January to December. In May 2020 the government decided to transition to the more common school year running from April to March.

15

The 2018 survey covered children enrolled in grades 1–4, the 2019 survey covered grades 2–5, and the 2022 survey covered grades 2–7. One caveat is that, while the panel sample was selected at baseline by looking at school enrollment registries, and children were tracked at home whenever not present in class, in 2022, due to limited resources, the new sample of children in grades 2 and 3 was only surveyed if they were attending class on the survey day. Our findings on enrollment and attendance suggest that this is unlikely to have a major impact on our estimates, and in Table A.4 we show that our estimates remain very similar when we restrict the analysis to children who were attending school on survey days in previous rounds as well – although we cannot rule out that the type of children attending class changed over the pandemic period.

16

School fixed effects always refer to the 200 baseline schools included in the sample. In principle, one might be concerned that grade is endogenous. However, as explained above, within our context there is automatic grade progression and according to the statistics provided by the Ministry of Education there is no primary grade repetition across Assam. As expected, all our results remain unaffected by replacing grade fixed effects with age fixed effects (results available from the authors).

17

It is possible that children who dropped out from school were more difficult to track. In 2022 we managed to gather additional information for a subset of 48 children lost at follow-up. Out of these, the majority had moved to another village and thus enrolled in another school, while 16 (34%) were reported to have dropped out. If we were to apply this ratio to the full set of children lost at follow-up, the dropout rate across our sample would increase to 3.1%.

18

We measure aspirations through an index that combines answers to the three following questions through principal component analysis: “What is the highest education you would ideally like [child name] to complete?”; “What is the highest education you think [child name] will actually complete?”; “How likely is it on a scale of 1–10 that [child name] will achieve your aspiration?”.

19

We do not find, instead, any clear differential effects across children who had a mobile phone at home or who had higher levels of personal or school-related psychological wellbeing (not reported).

20

The two data sources complement each other: the 2021 survey includes the broadest set of questions, which we administered to caregivers by phone, but suffers from high attrition, while the 2022 survey was administered to all students in person. As mentioned above, although attrition in the phone survey was non-random – respondents were relatively wealthier, more educated, and had younger children than non-respondents – we find no systematic attrition in terms of key dimensions such as test scores and psychological wellbeing (Table A.3).

21

Despite losing some observations due to missing answers, our results are robust to considering richer sets of controls for: (1) household characteristics (number of household members, presence of older sibling(s), presence of younger sibling(s), highest level of education in the household, indicator for belonging to the scheduled caste); (2) pre-pandemic study habits (whether the student studied with parents, whether the student read after school, number of days spent studying in a week, whether the child received incentives to attend school); (3) pre-pandemic aspirations and motivations (the highest level of education the student wanted to achieve, parental aspiration index (PCA), whether the mother overestimated/underestimated the child’s learning level). Results are available from the authors.

22

For the repeated cross-sectional sample, we estimate a regression similar to (1) above, where we only consider two survey rounds and replace the learning outcome with a measure of psychological wellbeing. For the panel sample, we estimate instead the following empirical model:

$y_{i,t} = \delta_1 \, 2022_t + \rho_i + \mu_{i,t}$

where $\rho_i$ indicates child-specific fixed effects.
23

Our results highlight the role technology can play in enabling access to educational opportunities and training at a time when in-person learning is not possible. Existing evidence on the impact of technology in “normal” times is mixed, ranging from no effect of interventions that simply provided computers to households or schools (e.g. Beuermann et al., 2015) to large gains in learning from interventions that used technology to deliver personalized instruction (Muralidharan et al., 2019).

Supplementary material related to this article can be found online at https://doi.org/10.1016/j.jdeveco.2023.103133.

Appendix. Supplementary material

The following is the Supplementary material related to this article.

MMC S1

Supplementary material with Additional Figures and Tables (A), Restricted Sample (B), Psychological Wellbeing Measurement (C).

mmc1.pdf (1.1MB, pdf)

Data availability

Data will be made available on request.

References

  1. Alasino E., Ramirez M.J., Romero M., Schady N., Uribe D. Mimeo; 2023. Learning Losses from COVID-19 School Closures: Evidence from Mexico.
  2. Andrabi, T., Bau, N., Das, J., Khwaja, A.I., 2022. Heterogeneity in School Value-Added and the Private Premium. NBER Working Paper, 30627.
  3. Ardington C., Wills G., Kotze J. COVID-19 learning losses: Early grade reading in South Africa. Int. J. Educ. Dev. 2021;86.
  4. ASER. Pratham Organization; 2018. Annual Status of Education Report (Rural) 2018: Technical Report.
  5. ASER. Pratham Organization; 2021. Annual Status of Education Report (Rural) 2021: Technical Report.
  6. Baranov V., Grosjean P., Khan F.J., Cenat S.W. The impact of COVID-related economic shocks on household mental health in Pakistan. Health Economics. 2022;31:2208–2228. doi: 10.1002/hec.4571.
  7. Beuermann D.W., Cristia J., Cueto S., Malamud O., Cruz-Aguayo Y. One Laptop per Child at home: Short-term impacts from a randomized experiment in Peru. Am. Econ. J. Appl. Econ. 2015;7(2):53–80.
  8. Björkman Nyqvist M., Guariso A. Mimeo; 2022. Supporting Learning In and Out of School: Experimental Evidence from India.
  9. Cenat J.M., Blais-Rochette C., Kokou-Kpolou C.K., Noorishad P.G., Mukunzi J.N., McIntee S.E., Dalexis R.D., Goulet M.A., Labelle P.R. Prevalence of symptoms of depression, anxiety, insomnia, posttraumatic stress disorder, and psychological distress among populations affected by the COVID-19 pandemic: A systematic review and meta-analysis. Psychiatry Res. 2021;295. doi: 10.1016/j.psychres.2020.113599.
  10. Cronbach L.J. Coefficient alpha and the internal structure of tests. Psychometrika. 1951;16(3):297–334.
  11. Department of Education. Department of Education; United Kingdom: 2020. State of the Nation 2020: Children and Young People’s Wellbeing.
  12. Department of Education. Department of Education; United Kingdom: 2021. State of the Nation 2021: Children and Young People’s Wellbeing.
  13. Fiorini M., Keane M. How the allocation of children’s time affects cognitive and noncognitive development. J. Labor Econ. 2014;32(4):787–836.
  14. Government of India. Ministry of Education, Government of India; 2009. Right of Children to Free and Compulsory Education (RTE) Act.
  15. Govorova E., Benitez I., Muñiz J. How schools affect student well-being: A cross-cultural approach in 35 OECD countries. Front. Psychol. 2020;11(431). doi: 10.3389/fpsyg.2020.00431.
  16. Hansen, B., Sabia, J.J., Schaller, J., 2022. In-Person Schooling and Youth Suicide: Evidence from School Calendars and Pandemic School Closures. NBER Working Paper, 30795.
  17. Jacob B., Rothstein J. The measurement of student ability in modern assessment systems. J. Econ. Perspect. 2016;30(3):85–108.
  18. Keane M., Krutikova S., Timothy N. Child work and cognitive development: Results from four low to middle income countries. Quant. Econ. 2022;13:425–465.
  19. Laajaj R., Macours K. Measuring skills in developing countries. J. Hum. Resour. 2019;56(4):1254–1295.
  20. Lichand G., Doria C.A., Leal-Neto O., Fernandes J.P.C. The impacts of remote learning in secondary education during the pandemic in Brazil. Nat. Hum. Behav. 2022;6(8):1079–1086. doi: 10.1038/s41562-022-01350-6.
  21. Moscoviz, L., Evans, D., 2022. Learning Loss and Student Dropouts during the COVID-19 Pandemic: A Review of the Evidence Two Years after Schools Shut Down. Center for Global Development Working Paper, 609.
  22. Muralidharan K., Singh A., Ganimian A.J. Disrupting education? Experimental evidence on technology-aided instruction in India. Amer. Econ. Rev. 2019;109(4):1426–1460.
  23. OECD. OECD Publishing; Paris: 2017. PISA 2015 Results (Volume III): Students’ Wellbeing.
  24. Our World in Data. 2022. Coronavirus Pandemic (COVID-19). Published online at OurWorldInData.org.
  25. Patrinos, H.A., Vegas, E., Carter-Rau, R., 2022. An Analysis of COVID-19 Student Learning Loss. Policy Research Working Paper Series, 10033.
  26. Pollard E.L., Lee P.D. Child Wellbeing: A systematic review of the literature. Social Indic. Res. 2003;61:59–78.
  27. Salari N., Hosseinian-Far A., Jalali R., Vaisi-Raygani A., Rasoulpoor S., Mohammadi M., Rasoulpoor S., Khaledi-Paveh B. Prevalence of stress, anxiety, depression among the general population during the COVID-19 pandemic: A systematic review and meta-analysis. Glob. Health. 2020;16(57). doi: 10.1186/s12992-020-00589-w.
  28. Samji H., Wu J., Ladak A., Vossen C., Stewart E., Dove N., Long D., Snell G. Review: Mental health impacts of the COVID-19 pandemic on children and youth - A systematic review. Child Adolesc. Ment. Health. 2022;27(2):173–189. doi: 10.1111/camh.12501.
  29. Singh, A., Romero, M., Muralidharan, K., 2022. COVID-19 Learning Loss and Recovery: Panel Data Evidence from India. RISE Working Paper 22/112.
  30. Symonds J.E., Sloan S., Kearns M., et al. Developing a social evolutionary measure of child and adolescent hedonic and eudaimonic Wellbeing in rural Sierra Leone. J. Happiness Stud. 2022;23:1433–1467.
  31. Thorisdottir I.E., Agustsson G., Oskarsdottir S.Y., Kristjansson A.L., Asgeirsdottir B.B., Sigfusdottir I.D., Valdimarsdottir H.B., Allegrante J.P., Halldorsdottir T. Effect of the COVID-19 pandemic on adolescent mental health and substance use up to March, 2022, in Iceland: A repeated, cross-sectional, population-based study. Lancet Child Adolesc. Health. 2023;7(5):347–357. doi: 10.1016/S2352-4642(23)00022-6.
  32. UNICEF. 2021. Impact of COVID-19 on poor mental health in children and young people tip of the iceberg. UNICEF press release 04 2021.
  33. UNICEF. 2021. COVID-19: Schools for more than 168 million children globally have been completely closed for almost a full year, says UNICEF. UNICEF press release 02 2021.
  34. World Bank. The World Bank Group; 2022. World Development Indicators.
