School closures and effective in-person learning during COVID-19

https://doi.org/10.1016/j.econedurev.2023.102422

Abstract

We document large temporal and geographical discrepancies among prominent trackers that measure in-person, hybrid, and remote schooling in the U.S. during COVID-19. We then propose a new measure of effective in-person learning (EIPL) that combines information on schooling modes with cell phone data on school visits and estimate it for a large, representative sample of U.S. public and private schools. The EIPL measure, which we make publicly available, resolves the discrepancies across trackers and is more suitable for many quantitative questions. Consistent with other studies, we find that a school’s share of non-white students and pre-pandemic grades and size are associated with less in-person learning during the 2020–21 school year. Notably, we also find that EIPL was lower for schools in more affluent and educated localities with higher pre-pandemic spending and more emergency funding per student. These results are in large part accounted for by systematic regional differences, in particular political preferences.

Introduction

The COVID-19 pandemic led many schools in the U.S. to suspend or substantially reduce in-person learning. Several organizations and research teams have developed schooling mode trackers to measure the extent of traditional, hybrid, and virtual schooling that students obtained during the pandemic. A rapidly growing literature uses these trackers to estimate the consequences of reduced in-person instruction for student enrollment and academic achievement (Dee et al., 2021, Dorn et al., 2021, Engzell et al., 2021, Goldhaber et al., 2022, Jack et al., 2022, Kogan and Lavertu, 2021, Lewis et al., 2021), COVID infection and death rates (Chernozhukov et al., 2021a, Ertem et al., 2021), as well as local labor market outcomes (Amuedo-Dorantes et al., 2023, Garcia and Cowan, 2022, Landivar et al., 2022, Prados et al., 2021).

While certainly indicative of school closures, these trackers have several drawbacks. First, as we document in this paper, the various trackers provide very different accounts of the fraction of students who spent the 2020–21 school year in each of the three schooling modes. The differences stem from how each tracker defines hybrid schooling, as well as from the sample of schools and the source data each tracker relies on. This poses an important challenge for researchers interested in analyzing the extent and consequences of school closures, as the choice of tracker is not obvious but can substantially affect results.1

Second, even if one has good reason to prefer one tracker over the others, its usefulness for quantitative analysis is limited. This is because the trackers are qualitative: in particular, the category “hybrid” denotes an interval of possible in-person schooling days per week, reflecting the supply side of a school’s reopening policy but not students’ take-up of this option. While for some questions one may prefer a measure of potential in-person learning (i.e., the supply side), for many other outcomes, such as the ones studied in the papers referenced above, effective in-person learning (i.e., take-up) is the more relevant metric. More generally, the treatments implied by the different schooling modes are not mutually exclusive. If a school is fully in-person (treatment 1), then it cannot be in hybrid mode (treatment 2). But if a school is not fully in-person, then it can be in either hybrid or virtual mode. This makes it difficult to interpret regression results.2

Third, the quality and coverage of the different trackers varies by geography and time period, and the data is typically limited to county or district-level averages of public schools. This further limits the applicability of the trackers for empirical analysis.

Motivated by these issues, we propose a new measure of Effective In-Person Learning (EIPL) that we estimate by mapping anonymized cell phone data from Safegraph on visits to schools with information from schooling mode trackers. The Safegraph data is available weekly for a large, representative sample of both public and private schools. Our estimation allows for the possibility that student presence (in-person learning) and cell-phone presence (visits) at schools may not have varied 1:1 during the pandemic; that the trackers use different definitions of hybrid schooling; and that the trackers are subject to measurement error that may vary by region.3 For each school in the sample, we therefore select the estimate from the mapping between visits and tracker information with the smallest measurement error. The result is a database of weekly EIPL from March 2020 to June 2021 for more than 70,000 public and private schools. We make this database available through the online repository of the Center for Open Science at https://osf.io/cghs2/.
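To fix ideas, the following is a minimal sketch, not the authors' actual estimation code, of how visit changes and tracker information might be combined: each tracker's weekly schooling mode is converted to a stylized EIPL level, a scaling factor between visit changes and EIPL is fit per school and tracker, and the tracker with the smallest fit error is retained. The column names, the linear form, and the hybrid level of 0.5 are illustrative assumptions.

```python
# Illustrative sketch only (assumed column names and functional form). It mirrors
# the idea described above: map weekly visit changes into EIPL using
# tracker-reported modes and keep the best-fitting tracker for each school.
import numpy as np
import pandas as pd

# Assumed stylized EIPL level implied by each tracker-reported schooling mode.
MODE_EIPL = {"virtual": 0.0, "hybrid": 0.5, "in_person": 1.0}

def fit_school(visits: pd.DataFrame, tracker: pd.DataFrame) -> tuple[float, float]:
    """Fit a linear mapping from visit changes to tracker-implied EIPL for one school.

    visits:  weekly rows with 'week' and 'visit_change' (visits relative to the
             school's pre-pandemic baseline, so 1.0 means a normal week).
    tracker: weekly rows with 'week' and 'mode' from one schooling mode tracker.
    Returns (beta, rmse): the visit-to-EIPL scaling and its fit error.
    """
    df = visits.merge(tracker, on="week", how="inner")
    y = df["mode"].map(MODE_EIPL).to_numpy()
    x = df["visit_change"].to_numpy()
    beta = np.dot(x, y) / np.dot(x, x)            # least-squares slope through the origin
    rmse = np.sqrt(np.mean((y - beta * x) ** 2))
    return beta, rmse

def eipl_for_school(visits: pd.DataFrame, trackers: dict[str, pd.DataFrame]) -> pd.Series:
    """Pick the tracker with the smallest fit error and return weekly EIPL."""
    fits = {name: fit_school(visits, t) for name, t in trackers.items()}
    best = min(fits, key=lambda name: fits[name][1])  # smallest RMSE
    beta, _ = fits[best]
    return (beta * visits.set_index("week")["visit_change"]).clip(0.0, 1.0)
```

Allowing the scaling factor to differ from one is what lets school visits and in-person learning move less (or more) than 1:1; the paper's actual estimation is richer, in particular allowing for region-specific measurement error.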

To illustrate the use of the data, we investigate the extent to which EIPL correlates with a host of school and local characteristics. Naturally, these correlations should not be interpreted as causal, but they provide us with a set of stylized facts to understand the factors behind school closings, and which segments of the student population were most affected. We find the following main results:

1. EIPL during the 2020–21 school year was substantially lower in public schools than in private schools, with public charter schools ranking below public non-charter schools and private religious schools ranking above private non-religious schools.

2. For both public and private schools, EIPL was lower in more affluent and more educated localities, and for schools with a larger share of non-white students.

3. For public schools, EIPL is negatively related to pre-pandemic school test scores, school size, and school spending as well as Elementary and Secondary School Emergency Relief (ESSER) funding.

4. These correlations are in large part accounted for by the school county’s share of Republican votes in the 2020 presidential election. COVID vaccination rates also predict higher EIPL, while mask requirements and teacher unionization rates predict lower EIPL.

The relation of EIPL with race, test scores, school size, and school spending confirms results previously documented by, e.g., Parolin and Lee (2021) and Landivar et al. (2022), while the relation with Republican voting preferences is consistent with Hartney and Finger (2020), Gollwitzer et al. (2020), and Valant (2020). Our contribution is to analyze these relations in a multivariate context, which reveals that systematic regional variation, more so than local or school characteristics, accounts for the large observed regional differences in public school closures. Indeed, our analysis uncovers a new nexus between income, voting preferences, and access to in-person learning: EIPL was on average lower, not higher, in more affluent localities, and this is in large part accounted for by their lower Republican vote share. Equally striking, we find that ESSER funding is on average not associated with higher EIPL, even though the program was advertised primarily as support for schools to reopen for in-person learning. These findings raise critical questions about education policy during the pandemic and have potentially important implications for the impact of in-person learning loss on future educational attainment as well as income inequality.4
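For readers who want to see the shape of such a multivariate exercise, the following is a hedged sketch, not the paper's actual specification: an OLS regression of school-year-average EIPL on school and county characteristics, with standard errors clustered by county. The file name and variable names are hypothetical.

```python
# Illustrative multivariate regression of EIPL on school and local characteristics.
# The dataset, variable names, and exact set of controls are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("eipl_school_year_2020_21.csv")  # hypothetical analysis file

model = smf.ols(
    "eipl ~ share_nonwhite + log_enrollment + test_scores + spending_per_student"
    " + esser_per_student + median_income + share_college + republican_vote_share",
    data=df,
)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["county_fips"]})
print(result.summary())
```

In a specification of this kind, including the county-level Republican vote share alongside school and locality controls is what reveals how much of the other correlations it absorbs.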

Besides the above-mentioned studies based on schooling mode trackers, several other studies have used school visits from cell phone data, in particular Safegraph, to proxy directly for school closures during the pandemic (Bravata, Cantor, Sood, & Whaley, 2021; Chernozhukov, Kasahara, & Schrimpf, 2021b; Parolin & Lee, 2021; Garcia & Cowan, 2022; Hansen, Sabia, & Schaller, 2022). The proposed EIPL measure improves on these proxies by incorporating information from schooling mode trackers and by allowing the relationship between school visits and in-person learning during the pandemic to differ from 1:1, which our estimates suggest it does in many cases. Furthermore, attributing cell phones to a particular location is subject to inherent measurement issues, and our analysis reveals that this leads to sparse or noisy data for a non-negligible number of schools. Accordingly, we estimate EIPL only for schools with reliable visits data.
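Because attributing phones to schools is noisy, a screening step of the kind mentioned above might look like the sketch below; the thresholds and column names are assumptions for illustration, not the paper's actual criteria.

```python
# Illustrative reliability filter for a school's weekly visit panel (assumed
# columns 'week' and 'visits'; the thresholds are arbitrary placeholders).
import pandas as pd

def has_reliable_visits(panel: pd.DataFrame,
                        min_baseline_visits: float = 20.0,
                        max_missing_share: float = 0.2) -> bool:
    """Return True if the school's visit data looks usable for estimating EIPL."""
    panel = panel.assign(week=pd.to_datetime(panel["week"]))
    baseline = panel.loc[panel["week"] < "2020-03-01", "visits"]
    pandemic = panel.loc[panel["week"] >= "2020-03-01", "visits"]
    if baseline.empty or baseline.mean() < min_baseline_visits:
        return False                                 # baseline too thin to normalize against
    missing_share = pandemic.isna().mean() if len(pandemic) else 1.0
    return missing_share <= max_missing_share        # drop sparse pandemic coverage
```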

The paper proceeds as follows. Section 2 compares the different schooling mode trackers. Section 3 describes our empirical approach for measuring EIPL. Section 4 studies the relation of EIPL with school-specific and local indicators. Section 5 concludes.

Section snippets

Comparison of schooling mode trackers

This section compares prominent schooling mode trackers for the U.S. We limit the comparison to trackers that are constructed from a direct source of information about schooling mode; e.g., school district websites or social media, public guidelines by school districts and state educational agencies, or direct surveys of schools. However, we do not impose any restrictions on geographical coverage, frequency, or granularity of the data.

In total, eight trackers fit our criteria: the Education Week tracker (https://www.edweek.org/leadership/map-where-are-schools-closed/2020/07),

From changes in school visits to effective in-person learning

In this section, we first describe the construction of our sample of school visit changes from Safegraph data. Then, we explain how we estimate EIPL by mapping school visit changes to information from schooling mode trackers. Since, in practice, the mapping relies on having sufficient temporal variation, we prioritize trackers with data at weekly frequency, i.e., Burbio, CSDH, EdWeek, and R2L. However, the weekly CSDH data is limited to about 10,000 schools, with the other schools observed only

EIPL during the pandemic: when, where, and for whom?

Given the estimates of β, we construct EIPL for each school in our Safegraph sample and investigate the extent to which EIPL varied during the pandemic and across regions. As shown in the Appendix, EIPL dropped to between 0% and 20% for almost all counties from March to May 2020. During the 2020–21 school year, however, there are large disparities in EIPL. As illustrated in Fig. 2, EIPL recovered to 60% or higher in the South and Central North, while in the North and Mid-Atlantic and the West

Conclusion

This paper starts by highlighting important discrepancies between popular pandemic schooling mode trackers. We then propose a new measure of effective in-person learning (EIPL) that we estimate by mapping school visits data from Safegraph with tracker information from Burbio and Return2Learn. This new measure not only resolves the discrepancies across trackers, but is also more suitable for many quantitative questions about the extent and consequences of pandemic school closures. We make the data publicly available at https://doi.org/10.17605/OSF.IO/CGHS2.

Declaration of Competing Interest

I declare that I have no relevant or material financial interests that relate to the research described in this paper.

Acknowledgment

I received financial support from the Social Sciences and Humanities Research Council of Canada (SSHRC grant 430-2019-00967). Aseni Ariyaratne provided excellent research assistance. We thank Dennis Roche from Burbio, Nat Malkus from Return to Learn, and Safegraph for generously sharing their microdata. We also thank participants at various workshops for their comments. The data constructed in this paper is available through the online repository of the Center for Open Science: https://osf.io/cghs2/. All errors are our own.

References (32)

• Agostinelli, Francesco, et al. When the great equalizer shuts down: Schools, peers, and parents in pandemic times. Journal of Public Economics (2022).
• Chernozhukov, Victor, et al. Causal impact of masks, policies, behavior on early covid-19 pandemic in the U.S. Journal of Econometrics (2021).
• Musaddiq, Tareena, et al. The pandemic’s effect on demand for public schools, homeschooling, and private schools. Journal of Public Economics (2022).
• Amuedo-Dorantes, Catalina, et al. Schooling and parental labor supply: Evidence from COVID-19 school closures in the United States. Industrial and Labor Relations Review (2023).
• Bravata, Dena, et al. Back to school: The effect of school visits during COVID-19 on COVID-19 transmission. NBER Working Paper 28645 (2021).
• Camp, Andrew M., et al. Determinants of ethnic differences in school modality choices during the COVID-19 crisis. Educational Researcher (2022).
• Chernozhukov, Victor, et al. The association of opening K-12 schools and colleges with the spread of COVID-19 in the United States: County-level panel data analysis. Proceedings of the National Academy of Sciences (2021).
• Chetty, Raj, et al. The opportunity atlas: Mapping the childhood roots of social mobility. NBER Working Paper 25147 (2020).
• Dee, Thomas S., et al. The revealed preferences for school reopening: Evidence from public-school disenrollment. American Educational Research Journal (2021).
• Dorn, Emma, et al. COVID-19 and education: The lingering effects of unfinished learning, vol. 27. Technical report (2021).
• Edunomics Lab. National education resource database on schools (2021).
• Engzell, Per, et al. Learning loss due to school closures during the COVID-19 pandemic. Proceedings of the National Academy of Sciences (2021).
• Ertem, Zeynep, et al. The impact of school opening model on SARS-CoV-2 community incidence and mortality. Nature Medicine (2021).
• Fahle, Erin M., et al. Stanford education data archive (version 4.1) (2021).
• Fuchs-Schündeln, Nicola, et al. The fiscal and welfare effects of policy responses to the COVID-19 school closures. IMF Economic Review (2023).
• Garcia, Kairon Shayne D., et al. The impact of school and childcare closures on labor market outcomes during the COVID-19 pandemic. NBER Working Paper 29641 (2022).
