Politicians and policy makers have been arguing about whether Head Start works ever since the federally funded early childhood education program for low-income families began in the 1960s. And yet, 50 years later, after more than 30 million children have been through the program, a new research report from a unit of the U.S. Department of Education concludes that we still don’t have much rigorous research evidence to show that Head Start is effective in preparing children for elementary school.

The July 2015 report from the What Works Clearinghouse describes how it reviewed 90 widely different studies on Head Start. Some looked at whether Head Start improves family health, for example; others at childhood obesity. Fewer than half of the studies conducted original research assessing whether students’ academic and behavioral skills had improved. Only one of those studies passed scientific muster, and its results were rather disappointing. It found that Head Start had “potentially positive effects” on general reading achievement and “no discernible effects” on mathematics achievement and social-emotional development for 3-year-old and 4-year-old children.

“It doesn’t mean that Head Start is bad,” or ineffective, said Michael Lopez, an early childhood expert at Abt Associates, a research firm that conducted some of the What Works Clearinghouse research as a subcontractor. “My colleagues might shoot me. But I’m not sure everyone is in agreement that the only way to assess Head Start is through the most rigorous evaluations.”

“We do know that there’s a larger body of work that supports the benefits of early childhood education programs,” he added, citing four other studies, including the famous Perry Preschool Study, which tracked students for 40 years after preschool. (The other three early childhood education studies were conducted in Chicago, Boston and Tulsa).

For 14 years, Lopez worked in the division of the Department of Health and Human Services that runs Head Start, where he oversaw the program’s research. Indeed, he created and ran the 2010 Head Start Impact Study cited by the What Works Clearinghouse as the only Head Start study that met its standards of scientific rigor. “Of course, I was ecstatic that my report was recognized,” he said. “But the bigger question is, what does this tell us?”

He explained that the What Works Clearinghouse seeks studies that resemble drug trials, where you randomly assign students to a treatment — in this case Head Start — and compare them to a control group that didn’t get the treatment. Some researchers have refused to create a control group for ethical reasons. No one wants to bar a low-income family from giving their young children an education.

In the 2010 Head Start Impact Study, the control group got messy. More than half of the families who were randomly assigned not to receive Head Start ended up in other kinds of early childhood programs, such as a preschool or a daycare center. Worse, because Head Start programs aren’t centrally administered, Lopez even found that some of his control-group families “walked across the street” and enrolled in another Head Start program.

As a result, his study didn’t compare Head Start versus staying at home, but Head Start versus a mixture of alternatives, which ranged from nothing to high-quality preschool programs. It shouldn’t surprise anyone that Head Start doesn’t necessarily beat out another good preschool program.

Furthermore, Head Start programs vary wildly throughout the country. Some are full day. Some aren’t. Some employ highly qualified teachers. Some don’t. It’s quite possible that when you average the results from programs around the whole country, as Lopez’s study did, the bad programs offset the good ones and the overall result is a wash.

Finally, there’s a lot of argument behind the scenes about how to measure children’s learning outcomes when you’re studying 3- and 4-year-olds, who can’t read or write. Standardized measures can vary with a toddler’s mood. Many kids are too shy to talk to a stranger and tell an outside evaluator what they know. Yet the alternative, asking the child’s teacher to conduct the assessment, tends to lead to inflated, biased results.

Despite the challenges, Lopez argues that researchers should continue to push for high-quality, rigorous analysis of both Head Start and other early childhood programs. “Many people want greater accountability in the spending of public resources,” he said, adding that he’s currently working on a study that uses statistical techniques to estimate student gains, rather than relying on a messy control group.

I asked the Administration for Children and Families (ACF), which oversees Head Start, about its reaction to the Department of Education report. A spokesman replied that there are additional “rigorous” studies that the Department of Education overlooked, such as those documenting Head Start’s positive influence on high school graduation and on keeping people out of prison later in life. (The What Works Clearinghouse confined its review to narrower short-term outcomes, such as reading, math and social skills.)

“Despite these impacts, we realize that we need to do more to continuously improve Head Start and continue striving for stronger impacts,” Pat Fisher, the spokesperson, wrote by email.

Fisher also pointed out that major upgrades to Head Start had occurred in the past seven years, especially improvements to teacher qualifications. But these changes are too recent for researchers to have produced published studies and, naturally, for the What Works Clearinghouse to review them. The administration is currently working on another revamp of Head Start, proposing, for example, that all programs expand to full-day, year-round schedules. A public comment period on the proposed changes ends on August 18, 2015.

And in the end, even after these improvements are enacted, it may be most useful not to measure Head Start alone, but rather to assess early childhood teaching techniques that can be applied in any preschool or daycare setting.

[Cross-posted at The Hechinger Report]


Jill Barshay is the founding editor and writer of Education By The Numbers, The Hechinger Report's blog about education data.