
The National Center for Education Statistics has just five employees left, former staffers say.
Five years after the COVID-19 pandemic first forced schools and colleges into remote learning, researchers, policymakers and higher education leaders may no longer have access to the federal data they need to gather a complete picture of how those disruptions have affected a generation of students long term—or hold states and colleges accountable for the interventions they deployed to address the fallout.
That’s because the National Center for Education Statistics, the Education Department’s data-collection arm that’s administered surveys and studies about the state of K-12, higher education and the workforce since 1867, is suddenly a shell of itself.
NCES is down to just five employees after the department fired nearly half its overall staff earlier this week. The broader Institute of Education Sciences, which houses NCES, also lost more than 100 employees as part of President Donald Trump’s campaign to eliminate alleged “waste, fraud and abuse” in federal funding.
The mass firings come about a month after federal education data collection took another big blow: In February, the department cut nearly $900 million in contracts at IES, which ended what some experts say was critical research into schools and fueled layoffs at some of the research firms that held those contracts, including MDRC, Mathematica, NORC and Westat.
Although Trump and his allies have long blamed COVID-related learning loss on President Joe Biden’s approval of prolonged remote learning, numerous experts told Inside Higher Ed that without some of the federal data the NCES was collecting, it will be hard to draw definitive conclusions about those or any other claims about national education trends.
‘Backbone of Accountability’
“The backbone of accountability for our school systems begins with simply collecting data on how well they’re doing. The fact that our capacity to do that is being undermined is really indefensible,” said Thomas Dee, a professor at Stanford University’s Graduate School of Education and research associate at the National Bureau of Economic Research. “One could conceive this as part of an agenda to undermine the very idea of truth and evidence in public education.”
But the Education Department says its decision to nearly eliminate the NCES and so many IES contracts is rooted in what it claims are the agency’s own failures.
“Despite spending hundreds of millions in taxpayer funds annually, IES has failed to effectively fulfill its mandate to identify best practices and new approaches that improve educational outcomes and close achievement gaps for students,” Madi Biedermann, deputy assistant secretary for communications at the department, said in an email to Inside Higher Ed Thursday.
Biedermann said the department plans to restructure IES in the coming months in order to provide “states with more useful data to improve student outcomes while maintaining rigorous scientific integrity and cost effectiveness.”
But many education researchers disagree with that characterization of IES and instead view it as an unmatched resource for informing higher education policy decisions.
“Some of these surveys allow us to know if people are being successful in college. It tells us where those students are enrolled in college and where they came from. For example, COVID impacted everyone, but it had a disproportionate impact on specific regions in the U.S. and specific social and socioeconomic groups in the U.S.,” said Taylor Odle, an assistant professor of educational policy studies at the University of Wisconsin at Madison.
“Post-COVID, states and regions have implemented a lot of interventions to help mitigate learning loss and accelerate learning for specific individuals. We’ll be able to know by comparing region to region or school to school whether or not those gaps increased or reduced in certain areas.”
Without uniform federal data to ground comparisons of pandemic-related and other student success interventions, it will be harder to hold education policymakers accountable, Odle and others told Inside Higher Ed this week. However, Odle believes that may be the point of the Trump administration’s assault on the Education Department’s research arm.
“It’s in essence a tacit statement that what they are doing may potentially be harmful to students and schools, and they don’t want the American public or researchers to be able to clearly show that,” he said. “By eliminating these surveys and data collection, and reducing staff at the Department of Education who collect, synthesize and report the data, every decision-maker—regardless of where they fall on the political spectrum—is going to be limited in the data and information they have access to.”
Scope of Data Loss Unclear
It’s not clear how many of the department’s dozens of data-collection programs—including those related to early childhood education, college student outcomes and workforce readiness—will be downsized or ended as a result of the cuts. The department did not respond to Inside Higher Ed’s request for clarity on exactly which contracts were canceled. (It did confirm, however, that it still maintains contracts for the National Assessment of Educational Progress, the College Scorecard and the Integrated Postsecondary Education Data System.)
A now-fired longtime NCES employee who asked to remain anonymous out of fear of retaliation said they and others who worked on those data-collection programs for years are still in the dark on the future of many of the other studies IES administers.
“We’ve been out of the loop on all these conversations about the state of these studies. That’s been taking place at a higher level—or outside of NCES entirely,” said the terminated employee. “What these federal sources do is synthesize all the different other data sources that already exist to provide a more comprehensive national picture in a way that saves researchers a lot of the trouble of having to combine these different sources themselves and match them up. It provides consistent methodologies.”
Even if some of the data-collection programs continue, there will be hardly any NCES staff to help researchers and policymakers accurately navigate new or existing data, which was the primary function of most workers there.
“We are a nonpartisan agency, so we’ve always shied away from interpreting or making value judgments about what the data say,” the fired NCES worker said. “We are basically a help desk and support resource for people who are trying to use this data in their own studies and their own projects.”
‘Jeopardizing’ Strong Workforce
One widely used data set with an uncertain future is the Beginning Postsecondary Students Longitudinal Study—a detailed survey that has followed cohorts of first-time college students over a period of six to eight years since 1989. The latest iteration of the BPS survey has been underway since 2019, and it included questions meant to illuminate the long-term effects of pandemic-related learning loss. But like many other NCES studies, data collection for BPS has been on pause since last month, when the department pulled the survey’s contract with the Research Triangle Institute.
In a blog post the Institute for Higher Education Policy published Wednesday, the organization noted that BPS is intertwined with the National Postsecondary Student Aid Study, a comprehensive nationwide study designed to determine how students and their families pay for college and the demographic characteristics of those enrolled.
The two studies “are the only federal data sources that provide comprehensive insights into how students manage college affordability, stay enrolled and engaged with campus resources, persist to completion, and transition to the workforce,” wrote Taylor Myers, IHEP’s assistant director of research and policy. “Losing these critical data hinders policy improvements and limits our understanding of the realities students face.”
That post came one day after IHEP sent members of Congress a letter signed by a coalition of 87 higher education organizations and individual researchers urging lawmakers to demand transparency about why the department slashed funding for postsecondary data collection.
“These actions weaken our capacity to assess and improve educational and economic outcomes for students—directly jeopardizing our ability to build a globally competitive workforce,” the letter said. “Without these insights, policymakers will soon be forced to make decisions in the dark, unable to steward taxpayer dollars efficiently.”
Picking Up the Slack
But not every education researcher believes federal data is as vital to shaping education policy and evaluating interventions as IHEP’s letter claims.
“It’s unclear that researchers analyzing those data have done anything to alter outcomes for students,” said Jay Greene, a senior research fellow in the Center for Education Policy at the right-wing Heritage Foundation. “Me being able to publish articles is not the same thing as students benefiting. We have this assumption that research should prove things, but in the world of education, we have very little evidence of that.”
Greene, who previously worked as a professor of education policy at the University of Arkansas, said he never used federal data in his assessments of educational interventions and instead used state-level data or collected his own. “Because states and localities actually run schools, they’re in a position to do things that might make it better or worse,” he said. “Federal data is just sampling … It’s not particularly useful for causal research designs to develop practices and interventions that improve education outcomes.”
Other researchers have a more measured view of what needs to change in federal education data collection.
Robin Lake, director of the Center on Reinventing Public Education at Arizona State University, has previously called for reforms at IES, arguing that some of the studies are too expensive without enough focus on educators’ evolving priorities, which as of late include literacy, mathematics and how to handle the rise of artificial intelligence.
But taking a sledgehammer to NCES isn’t the reform she had in mind. Moreover, she said blaming federal education data collections and researchers for poor education outcomes is “completely ridiculous.”
“There’s a breakdown between knowledge and practice in the education world,” Lake said. “We don’t adopt things that work at the scale we need to, but that’s not on researchers or the quality of research that’s being produced.”
But just because federal education data collection may not focus on specific interventions, “that doesn’t mean those data sets aren’t useful,” said Christina Whitfield, senior vice president and chief of staff for the State Higher Education Executive Officers Association.
“A lot of states have really robust data systems, and in a lot of cases they provide more detail than the federal data systems do,” she said. “However, one of the things the federal data provides is a shared language and common set of definitions … If we move toward every state defining these key elements individually or separately, we lose a lot of comparability.”
If many of the federal data collection projects aren’t revived, Whitfield said other entities, including nonprofits and corporations, will likely step in to fill the void. But that transition likely won’t be seamless or without consequence.
“At least in the short term, there’s going to be a real issue of how to vet those different solutions and determine which is the highest-quality, efficient and most useful response to the information vacuum we’re going to experience,” Whitfield said. And even if there’s just a pause on some of the data collections and federal contracts are able to resume eventually, “there’s going to be a gap and a real loss in the continuity of that data and how well you can look back longitudinally.”