Flawed EdCities “Equality” Report Coverage Highlights Importance Of Skepticism & Scrutiny

It certainly wasn’t confidence-inspiring to find out yesterday that an Education Cities/GreatSchools report on school inequality issued last week had some serious problems that had largely gone unnoticed.

Wait, what? You mean that mainstream news outlets are covering and sharing reports and studies like this without asking tough questions or checking them out first?

Yep. That’s the unfortunate situation. It happens. How many other times it’s happened without coming to light, we don’t know. The planet does not appear to have stopped spinning. But there are ways we can, and should, try to prevent or limit this kind of thing in the future. And there are several good cues reporters can use to avoid passing along flawed information, including a gut sense of concern and simply being too busy doing other things.

“I was wavering — it was clearly getting a lot of attention and was an important topic, and just as clearly required a lot of caveats and explanations,” recalls Ann Doss Helms, from the Charlotte Observer. “I could try to claim I made an informed judgment to hold off, but honestly, I just ran out of time doing other things.”

It’s not hard to see why there was so much interest in the report: “This is the first time that we are able to fairly compare the size of the achievement gap at the school, city, and state levels across state lines,” Education Cities’ Christine Schneider explained at the time. “We also have the biggest collection of student proficiency data divided by subgroup in the nation.”

The email sent to me came with the headline “New York City’s Achievement Gap Is Smaller than 90 Percent of Major U.S. Cities, New Index Confirms.” The email went on to tell me that New York City “is home to a higher level of education equality than nearly every large major city in America, including Chicago, Los Angeles and Washington, DC.”

That combination of something “new,” localized information that can be compared to other places, and results that seem counterintuitive (either good or bad) is catnip for reporters and social media alike.

And so it was no surprise that the report got widespread coverage. Some examples include EdSource (Achievement gaps in Irvine, San Francisco are among smallest of US cities), the News & Observer (Achievement gap widened in NC more than any other state, according to index), and the Stockton Record (Stockton’s achievement gap wider than many). The Huffington Post and Chalkbeat Colorado also covered it. The report was covered by roughly 40 news outlets, according to the report authors.

However, concerns began to bubble up on Twitter, including from The Seventy Four’s Matt Barnum, Mathematica’s Stephen Glazerman, and others. Why, asked Barnum and others, did states with higher poverty rates do better on the equality index? Rutgers’ Bruce Baker tweeted that the Equality Index “may just be least meaningful/useful/valid ‘equality’ ‘index’ I’ve seen in a long time.”

The result of these concerns was a story from The Seventy Four titled Education Cities and GreatSchools to Admit Flaw in Statewide Rankings of School Inequality: “The index it released last week ranking the school inequality gap in 100 cities and 35 states was faulty in its state comparisons.” Some additional concerns about the study can be found in this Gadfly blog post from Colorado’s Van Schoales.

3 THINGS THAT COULD HAVE BEEN DONE

What could reporters have done to avoid passing along any misinformation? Three main things pop up from conversations with journalists:

(1) A healthy dose of skepticism and bravery is always helpful:

Some reporters did ask questions, according to a source familiar with the rollout of the report. And major national outlets like the New York Times, Washington Post, AP, and NPR seem to have held back from jumping in.

“My editors forward me any number of cockamamie fake listicle ‘studies’ that claim to have found the 45 most whatever colleges/schools/private schools, because they think a story will get a million hits,” wrote an education reporter who didn’t want to be named. “It’s my self-imposed responsibility to vet and say we can’t publish this.”

(2) Taking a careful look at the source of the information can be tremendously helpful, too

“For me, I look at the reputation of an organization before I decide to pitch a story about it,” wrote Daveen Rae Kurutz, a former education reporter turned database reporter. By that method alone, Kurutz would have been disinclined to cover the report. “I’m not a fan of a lot of the way Great Schools develops its methodologies – I’ve been burned by them before, because they don’t take into consideration the nuances of individual areas. And that’s pretty much so what happened here.”

“I had a feeling there was something odd about this study when I looked into ‘about’ the group and it just didn’t feel academic enough,” wrote one reporter. “I tweeted it but now I regret even that.”

(3) Some sort of vetting of the report in-house or by an independent researcher is also great — and can be done fairly quickly.

“I look at their methodology – and most times, I try to replicate it actually, with county level data,” according to Kurutz. “If it looks solid and based in fact and not some ridiculously flawed methodology then I’ll push for a story. If it is ridiculously flawed and paints an erroneous picture that is relevant to our readers, I’ll pitch a story as well – just not the one that the organization wants.”

“I do think it’s clearly a challenge for journalists to make sense of releases like this,” said USC associate professor Morgan Polikoff. “I think in general it’s good practice to run either research or this kind of thing (which is not research) by other academics before writing about it, but obviously that’s easier said than done.”

Good thing that a nonprofit called STATS offers a free ‘Help for Journalists’ service. “Basically, if you are wondering if a study/report/data set is valid or have questions about it and want to run them by a mathematician/statistician, who is a member of the American Statistical Association, you just send an email via the above link,” emailed Carey Reed, who works for the nonprofit that runs the service.

The Education Writers Association also offers some general guidance about working with research here: Reporter Guide.

Reporters can also find experts using the EWA listserv if they don’t have contacts themselves.

HOW OUTLETS RESPONDED TO THE CORRECTION

The folks at Huffington Post, EdSource, and Chalkbeat Colorado did not reply to emails asking for comment.

However, at least some of the outlets that published the original story have written about the problems that were encountered. The News & Observer noted, for example, that the report that ranked North Carolina as the state with the worst widening achievement gap backtracked Tuesday, saying its method was “not the best way to compare states.”

Related posts: When Mainstream Media Passes Along Misinformation; Despite Occasional Errors, FiveThirtyEight Still a Helpful Addition; NCES Critiques Achieve Report Comparing State Scores to NAEP.

Alexander Russo

Alexander Russo is a freelance education writer who has created several long-running blogs such as the national news site This Week In Education, District 299 (about Chicago schools), and LA School Report. He can be reached on Twitter at @alexanderrusso, on Facebook, or directly at alexanderrusso@gmail.com.