As a longitudinal researcher, Daniel Goldhaber is used to taking the long view of education. As of March, he managed a dozen or so studies of teacher labor markets, evaluations, and factors that affect student achievement—the focus of years of analysis and millions of dollars of research funding.
The coronavirus may have blown much of that research investment off track. The pandemic has halted in-person interventions, testing, and teacher evaluations, and it’s muddied the contexts around the teacher labor markets Goldhaber studies.
“We might know who’s employed or not, but I’m not sure that I could make much of it given everything else that’s happening. In the case of student tests and teacher performance evaluations, those just won’t exist. But even if they did exist, I’d be really reluctant to draw strong conclusions about any of it,” said Goldhaber, the director of the national Center for Analysis of Longitudinal Data in Education Research.
“The bottom line is that we lose probably a full year of data—and that’s … that’s a huge deal,” said Goldhaber.
The Institute of Education Sciences, the federal Education Department’s research arm, had 400 to 500 research grants in the field when the pandemic hit the United States, according to Mark Schneider, the director of IES. Most of them will be affected in one way or another, from data lost to school closures to research assistants unable to work in their labs or in the field because of social distancing orders in most states. For example, Goldhaber noted that many of his data privacy agreements with districts require researchers to analyze some educational data at a secure site within now-closed school buildings.
School closures and stretched staff are causing serious gaps in critical education data for this school year. Here are a few of the data collections that are lost or most at risk:
- Program for the International Assessment of Adult Competencies: The second cycle of this global assessment of the literacy, numeracy, and other skills of adults in 33 countries was set to begin its national field test data collection in April, but was canceled due to widespread school closures and stay-home orders. Other countries’ data collections have also been interrupted.
- Program for International Student Assessment: These international tests measure 15-year-old students’ reading, mathematics, and science literacy every three years. The field trial for the 2021 PISA, originally slated for this spring, was canceled. Results for the 2018 PISA financial literacy assessment are still expected to be released in May.
- National Assessment of Educational Progress Long-Term Assessment: Based on nationally representative samples of 9-, 13-, and 17-year-olds, these literacy and math assessments are used to track academic changes of student cohorts over time in ways that the so-called “Main NAEP” snapshot assessments cannot. A long-delayed new administration of the test was in the field this spring, but was canceled due to the widespread school closures.
- State (“Main”) NAEP: These biennial tests of reading and math, along with periodic tests of other subjects, are commonly dubbed the “Nation’s Report Card.” Although the next round of tests is scheduled for this fall, IES is concerned there will be delays because field proctor training was set to begin in person early this summer, and live test administration may not be possible if social distancing is still in place.
- All state grade-level assessments, exit exams, and end-of-course exams: All states have received federal accountability waivers allowing them to cancel their subject-area tests of all students in math and reading in grades 3-8 and high school. Moreover, states have generally canceled their own required tests for end-of-course exams, civics, and college-and-career-readiness in high school. These tests are used for everything from school and district accountability to teacher evaluations to research on academic achievement gaps and other issues.
Schneider said his staff is working on procedures to consider supplemental funding and grant extensions for projects that will not be able to complete their original work or timeline, “but we haven’t gotten formal requests yet, because I don’t think that people actually understand yet the full consequences. I’m going on the assumption that almost every school in the country is not going to open the rest of the year—but you know, until people know exactly what the status is in May or June, they haven’t discovered yet the size of the effect.”
Several major education research associations canceled their spring meetings in response to coronavirus outbreaks, and since have been trying to find ways to share that research and training online. Ongoing fallout from the pandemic forced the largest of these, the American Educational Research Association, to cancel both its in-person conference and an attempted virtual replacement. Now, the group plans to release a series of nine professional-development workshops originally slated for the conference but adapted to focus on how researchers should rethink and redesign their work considering the pandemic-related disruptions.
“Our focus of study is educational institutions and settings and the broader context in which that fits: the community, the family,” said Felice Levine, executive director of AERA. “But whatever you are doing … whether in the field or in the lab and across the methods and modes of inquiry, that work has been interrupted in ways that are expected given the current circumstances but will alter the nature of those studies.”
In late June, AERA plans to launch an expanded online repository of its accepted papers and presentations, as well as an “interactive presentation gallery” in which researchers would be able to present their work with recorded voiceovers and arrange times for virtual question-and-answer sessions.
‘A Terrible Data Hole’
IES faces delays or cancellations of some of the biggest, longest-running, and most-used data collections that were in the field this spring, including parts of the Nation’s Report Card, the Program for International Student Assessment, and global adult education assessments. Schneider noted that traditional in-person training for the thousands of volunteers who administer tests and studies in the fall may also have to be canceled, and the research agency is trying to find alternatives to keep those later studies on track.
“I think there’s going to be kind of a terrible data hole that we’re going to have to try to fill out in a variety of ways,” said Beth Tarasawa, the executive vice president for research at the Northwest Evaluation Association, a research and testing organization. “And it’s not just a hole in the data; it’s that major events are happening in this hole.”
Several researchers and officials voiced concern that educators and school leaders are in dire need of research on a wide variety of issues: the most effective ways teachers can connect with students emotionally, for example; the effects of parents’ involvement as home teachers on student learning; and how schools can provide guidance or mental health supports online.
In fact, the almost-overnight explosion of every kind of remote learning, coupled with the dearth of solid evidence on how districts can approach it well and equitably, is a top research priority for many experts.
“I think we’ve had a tremendous amount of knowledge about online learning and we are learning in real time a tremendous amount more,” AERA’s Levine said, adding that the closures “provide an important way for us to learn more, and to engage more explicitly in how we work with students with diverse learning modalities and what that means in online learning.”
Levine also said the cancellation of many standard assessments during the closures will provide “an opportunity to learn about different ways of assessing or having continuous feedback in ways that helps the student understand where they are, and also helps the teacher assess and improve … in a different way with other kinds of indicators than the more standard practice.”
As it stands, many districts are being forced to implement strategies that have rarely or never been tested before, such as trying to teach young children remotely. And the effects of such untested strategies need to be studied, whether they work or not.
“One of our failures as a field—education researchers—is that we don’t learn from failures,” Schneider said. “What I don’t want to do is let all these natural experiments, these naturally occurring events that create these radical transformations … just to come and go without trying to capture information about what worked, what didn’t work, why.”
“We’ve started discussions around, how fast could we be set up to make this pivot?” Schneider said. “Given our strictures and given our commitment to rigor and validity, how fast can we actually undertake some different kind of research about what is happening right this very minute that’s radically transforming education? … So if [researchers] come back and they say, ‘I can’t do a post-test because all my 4th grade classrooms are closed, are gone,’ is there a way that they could take their existing data and flip it over to a test using learning management systems? That to me is an interesting challenge.”
Tarasawa and her colleagues at NWEA are among those trying to pivot to coronavirus-era priorities. They created a quick-turnaround projection of potential learning loss under pandemic-related closures. It was adapted from an existing series of longitudinal projections of summer learning loss, based on achievement, growth, and background data from more than 5 million students who have taken NWEA’s digital adaptive test.
“We were thinking, how can we leverage what we have now to give us some idea of what we might get in the fall? Like epidemiologists have their predictions, we actually can leverage our data and reinvest it in those ways,” Tarasawa said.
Christopher Minnich, NWEA’s chief executive officer, said the group, which works about equally often with researchers and directly with school districts, “shifted our priorities immediately after the school closings” to provide new analyses about the problems districts are facing, from academic engagement and learning loss to the effects of various remote learning approaches. NWEA’s online assessments, which are typically administered in fall and winter as well as spring, have become one of the remaining datasets to look at student growth in a year when almost all state testing was canceled or interrupted.
Raj Chetty, a Harvard University economist who studies education opportunity gaps, likewise is trying to reposition his long-running work on helping disadvantaged youth, while also exploring new equity problems arising from the macroeconomic shock to schools and communities.
“I do think there are a bunch of people trying to think about how to leverage this as a natural experiment, where you have changes around all kinds of things from remote learning to sleep patterns, to spending more time with family, et cetera,” Chetty said. “I think the challenge there is, it’s effectively a very complex experiment because we’ve changed so many things at once. It’s going to be very hard to pin down the effect of any one channel.”
Yet Tarasawa said the necessity of research amid school closures and social distancing may be a “call to action” for education researchers to find more adaptive research methods that don’t rely only on randomized controlled trials and “implementation fidelity 100 percent as designed.”
“You know, epidemiologists and public health has been ahead of us in education for a long time and can lean into the messiness in which people behave,” Tarasawa said. “I think this is going to be that call for us that we can’t have everything perfect under all conditions. Some kids are going to be on a particular online curriculum for 30 minutes. Others are going to be on and off for an hour. How do you tease out the impact? So I do think we’re going to have to be more nimble as a community. That’s been a long time coming and this is just forcing our hand to do that.”