You might already know all about Morgan Polikoff, or he might be something of a new name to you. The professor now in his fifth year at USC has slowly but surely come onto folks’ radars, showing up on Twitter, in mainstream news stories, and on his own blog.
He’s mostly an education policy research guy, but he’s also got some things to say about education journalism that I think are interesting and useful. His views are all the more interesting since he’s also a source for journalists, and gets to see the journalism-making process from that point of view as well.
My own history with Polikoff is pretty slim. I’ve come across him here and there online over the past couple of years. His quick sense of humor is much appreciated in a world in which everyone’s super serious (or silly). It’s also notable that he’s been published by both the Shanker Institute (in 2012 and 2013, mostly) and by the Fordham Institute.
Then I met him briefly in Chicago at #EWA15, and was reminded of his constructive but still clear (and generally pro-reform) take on things last week, when he penned a piece about the NYT test scoring story that I’d also written about.
So it was great to get the chance to talk to him and get a better sense of what he’s done and what he’s seeing.
As a researcher, Polikoff says he generally wants reporters — and policymakers — to make more and better use of education research that’s out there. In a phone call earlier this week, he mentions Holly Yettick’s recent work on this issue, which notes how little journalism makes use of education research.
But he’s not as focused on peer-reviewed work as she seems to be. “Peer review doesn’t mean that much. You can get garbage published in a peer-reviewed publication.” The bigger problem, according to Polikoff, is the exaggerated or overly conclusive claims that reporters sometimes make or imply based on available research:
“It’s not that the reporters are getting it wrong necessarily, it’s just that sometimes they’re reporting things in ways that suggest we’re more confident about one thing or another than we really are.”
One area where Polikoff’s view of education journalism overlaps with Neerav Kingsland’s is the sense that journalists aren’t working closely enough with available data — not just education outcomes but also survey and public opinion data.
The media “gets it quite wrong sometimes” when it comes to polling results, according to Polikoff, especially when reporters latch onto a result that jumps out rather than focusing on the most consistent, trending results from multiple surveys. The result that doesn’t fit is “the one you should trust the least” but sometimes gets the most attention, he says.
The good news on this front is that Polikoff is involved with some public opinion polling that USC is doing, and he hopes to write more about how the polls work and how reporters might make the best use of the results that come back. He’s done some guest writing about poll results for Fordham and on his own blog in the past.
In particular, he hopes to show us all how we can take a “Nate Silver approach” to understanding polls, looking at them in the aggregate or over time rather than in isolation, and teasing out how differences in questions can create the appearance of major differences in public opinion that may not be so big in real life.
More help for journalists and others in understanding poll results would be a great thing, considering what a public opinion desert most education journalism seems to operate in for most of the year. Other than when the surveys come out, you rarely see them referenced in education stories — even when they’re relevant to the news being discussed. (The one exception is the chestnut about parents feeling more confident about their own children’s schools and teachers than about the district, state, or national education system overall.)
Polikoff notes that journalists try to present both viewpoints on a controversial issue even when the available data suggest there isn’t really an even divide in public opinion. “Typically journalists want to present both or all sides of the story,” he notes. “It’s relatively rare that they would present a side but then indicate in the story that poll data suggests this is a minority view.”
Rare, yes — but wouldn’t it be helpful?
More fluency with poll data could inform stories in ways they often aren’t informed now, and help reporters figure out whom to interview and what balance of anecdotes or perspectives makes the most sense to include.
My wish list in this area also includes some sort of central repository of polling data on education, so that it’s all in one easy place, comparable from one poll to another and one year to another. And, given how little most of the public knows about education — but how strongly everyone likes to express their views — it would also be really fun if someone went out and asked folks, Jay Leno or Jimmy Kimmel style, what they thought about Common Core being cancelled, or mandatory sex ed in kindergarten via Race to the Top, or any other issue.
Polikoff is on some reporters’ call lists for stories about national education issues, and he generally enjoys the process and the results. In particular, he admires EdWeek for the careful and balanced work that they do. “I learn a lot from them,” he says. When there are multiple stories on the same topic, “theirs is often the most balanced, the most accurate.”
Like many expert sources, however, he finds it frustrating when reporters spend lots of time with him getting up to speed or checking out their thinking but then don’t quote him. Or, when he does get quoted, it comes out in a way that feels out of context.
Frustrated about how his quotes had come out in an NPR story earlier this year, Polikoff tweeted on March 12, 2015: “Sometimes, even when they record your words, you still end up with a lousy, out-of-context quote. I need to work on making that impossible.”
Related posts: A Nagging Disconnect Between Vivid Anecdotes & Underlying Data (Kingsland), Should Reporters Use Peer Reviewed Research More Frequently? (Yettick).