We all know that pre-existing biases play a major role in how we respond to information we come across and how we interpret it. It’s pretty much the same whether it’s an in-person encounter, something found online, or research. But, really, how can journalists guard against preconceptions, biases, and other filters that affect our responses?
That’s one of the big questions that Robin Lake addressed in a recent op-ed about shoddy research and lazy reporting (New Research Confirms…Everything We Already Believe):
“Findings and headlines are now available to support just about any position on any educational topic,” she writes, “and there is almost no way to understand which have merit and which do not.”
The upside of the new reality is that more research is getting out, more quickly, to larger numbers of people. But “In the context of the broader vitriol-filled environment of education reform, people have come to use studies and their associated news stories as a way to confirm their own assumptions and as weapons in ideological warfare, rather than as new knowledge to inform and challenge.”
Not one to complain without proposing some possible solutions, Lake suggests a handful of things that researchers, foundations, and news outlets can do to make things better in 2016-2017. Chief among her suggestions is “A publically available list of vetted experts on different topics could be a central resource, and reporters could be expected to use a standard quality control and transparency protocol.” She suggests that the EWA could play a role.
However, passive resources seem unlikely to me to make much of a difference, and it’s hard to imagine EWA taking a leadership role in pushing outlets and reporters to be more responsible. But media funders — they could do some things. And calling out specific instances of problematic reporting on research — a step Lake seems hesitant to take — could also help.
According to Lake, who works at the University of Washington’s CRPE (Center on Reinventing Public Education), the situation is pretty bad when it comes to journalists’ use of research: “Some news outlets have been better than others about reporting on controversial education research, but few have been immune from inaccurate, attention-grabbing headlines or unbalanced reporting.”
Lake isn’t the only one complaining about media coverage of research. Journalism’s Role In The Current “Grit” Hype/ Criticism Cycle came up not too long ago. Former reporter and current EPE research head Holly Yettick recently advocated for education reporters to Use Peer Reviewed Research More Frequently.
One example Lake cites is a recent study from Gary Orfield’s Civil Rights Project at UCLA purporting to show higher suspension rates for charter schools. Its press release blared “Study Finds Many Charter Schools Feeding the School-to-Prison Pipeline.” However, according to Lake the report “didn’t actually provide any evidence that high suspension rates are more common in charter schools than in district-run schools.”
The other example Lake provides is media coverage of a joint Education Cities/Great Schools index that some outlets used without asking key questions.
To remedy the situation, Lake proposes journalists and researchers sign on to criteria that would make them more accountable for the work they produce: “Journalists could also help reinforce these standards if editors adopted a standard protocol for reporting on new studies.”
On the phone, Lake noted that EWA has some good resources on its site related to reporting on research. Via Twitter, EWA’s Erik Robelen shared a link to the organization’s Reporter’s Guide.
However, Lake conceded, “I don’t know how much reporters actually use it.”
Realistically speaking, it’s probably not enough for reporters to have a list of researchers to call, or some steps they’re supposed to take, if there’s no one there to remind them. “Unless somebody takes on the responsibility of reminding people, pushing on, prodding people, or calling them out, not much is going to happen,” admitted Lake.
So who’s going to take on that role? Probably not EWA, which generally treats journalists with kid gloves, focusing on supporting and encouraging them.
“I can’t say at this time whether or not EWA would participate in developing such standards,” said Robelen in a follow-up email. “But I’m sure that if some reasonable standards were created, we’d certainly let our members know about them.”*
But maybe EWA funders might take a leading role, or funders of other journalistic efforts. If funders insisted on more responsible use of research from their media grantees, that might be a good step forward.
Funders requiring media grantees to stick with the facts and vet studies before publishing their findings would be a strong signal to the field. It also might give funders “more credibility to say ‘We stand behind the findings whatever they are,’” according to Lake.
In any case, it seems clear that reporters need to be smarter consumers of research findings, and especially wary of puffed-up press releases whose highlights frequently suggest results that go further than the study itself does.
They also need to find better experts, rather than the most easily accessible ones, to vet others’ research. “Not all experts are created equal,” said Lake.
Hard-charging and accountability-focused as she seems, Lake doesn’t think it’s particularly helpful for reporters to look at a research institution’s previous studies for signs of bias or predisposition. “It’s too easy to put people into buckets.”
And apparently naming names isn’t useful, either. Asked about specific outlets and reporters who are chronic abusers of research findings, Lake demurred.
“You’re trying to get me in trouble,” she said. “That’s not going to help.”
In reference to the EdCities report, her piece links to the LA School Report, now run by The Seventy Four, whose story is titled Report finds charters lead the way in closing ‘achievement gap’ in LA. (For more examples: Flawed EdCities “Equality” Report Coverage.)
Compliments were easier for Lake to produce.
She noted that the New York Times actually “handled the [UCLA charter school discipline] report pretty well — except for the headline.” The piece (Charter Schools Suspend Black and Disabled Students More, Study Says) was “honest about what the report could say and couldn’t. They didn’t overstate the findings, or get wrapped up in the press release, and talked to people about what they meant.”
And she also praised The Seventy Four’s Matt Barnum for his approach to education research. “I think that he has been an honest broker of what the body of research says about a given topic, and has given fair critiques of new research.”
Indeed, Barnum was one of the folks who first surfaced problems with the EdCities report and its coverage.
*This response from EWA’s Erik Robelen was added to the original post.