Should Reporters Use Peer Reviewed Research More Frequently?


Graphs from a recent WBEZ Chicago Public Radio story on poverty and academic performance.

In a recently published study, EdWeek researcher Holly Yettick writes that the News Media Feeds the Public a Meager Diet of Education Research. Indeed, there’s very little use of research in education coverage, and much of what is used doesn’t come from university-based or peer-reviewed studies.

But I’m not as sure as Yettick that this is a big problem, or that reporters are to blame for it. 

The lack of awareness and use of peer-reviewed research is a problem, writes Yettick, because parents, citizens, and even policymakers aren’t likely to consult a peer-reviewed education journal themselves. “Instead, these audiences are more likely to use news media outlets to form impressions or seek information.”

However, readers aren’t going to find much information about peer-reviewed research in what they get from education journalism. “Of the 227,095 education-related articles and other items included in my sample, less than 1 percent mentioned research,” writes Yettick. And of this 1 percent, just 3 percent “mentioned research that had appeared in peer-reviewed academic journals.” Indeed, just 16 percent of the stories that mention research come from university-based studies.

Yikes. That’s pretty pathetic. But the reasons Yettick identifies for reporters not using peer-reviewed research aren’t hard to predict. They include localism, time constraints, reporters’ unfamiliarity with how peer review works, and the lack of specificity in findings. “Education research tends to be less conclusive and more context-driven than the physical or life sciences,” notes Yettick, perhaps understating the problem. It’s also very slow.

Yettick admits that peer review isn’t perfect:

It is easy to find examples of poorly conceived peer-reviewed studies and rigorous research that has never appeared in a peer-refereed journal.

But it’s an approach that’s likely to lead to more rigorous and “important” work, she says:

Policymakers and members of the public who rely on the news media for information about education should, at the very least, be aware that, on the rare occasion that they do encounter research-based evidence in the print news media, it is not necessarily the most rigorous or important work that the field has to offer.

Filling the void, to the extent that research plays a role in education coverage at all, are studies that aren’t peer-reviewed and don’t come from university-based programs, but rather from think tanks, nonprofit groups, advocacy groups, and even in-house research done by media outlets.

I’ve written about think tank-fueled journalism before, and the lack of expertise and quality control in think tank work is an ongoing problem.

But I think that even the best research isn’t going to convince folks to change deeply held beliefs, so I’m not sure I’d hope for it to make a big difference. And the university-based, peer-reviewed studies that Yettick favors aren’t delivered to reporters in a timely, easily digestible way (as they seem to be in other fields). So there’s that side of things to consider as well.

Most of all, as we learned this week with the Sesame Street study coverage, it seems clear that journalists and editors are going to misuse or simply misunderstand research whether the quality is high or low. Only some sort of fact-checking or rating system might help shape or shame media coverage.

Alexander Russo

Alexander Russo is a freelance education writer who has created several long-running blogs such as the national news site This Week In Education, District 299 (about Chicago schools), and LA School Report. He can be reached on Twitter at @alexanderrusso, on Facebook, or directly at alexanderrusso@gmail.com.