They’re all familiar names: RAND, Urban Institute, Brookings. AEI. Heritage. CAP. New America. Cato. And when journalists cite them or their experts, we’re supposed to be at least inclined towards believing what they say.
But are we — and journalists who rely on them — being lazy and sometimes getting led down the primrose path?
The answer is probably yes. But it’s not too hard to see when this is happening — or to avoid producing journalism that’s flawed by use of think tank research.
There’s no shortage of examples. Sometimes think tanks produce research so deeply flawed it has to be retracted. In other cases, studies (think tank and otherwise) turn out to have been faked — and media outlets passed them along anyway.
Media reliance on think tanks shouldn’t come as any surprise, given their rise and proliferation in recent years.
Think tanks have replaced (or at least come to compete with) lobbyists as power brokers in Washington, but in the process seem to have lost their intellectual rigor and autonomy from funders and ideology.
For a deep look at the think tank “corruption” idea, look back to Steve Clemons’ 2003 JPRI Critique Vol. X No. 2, which describes how, thanks to lobbying regulations and the proliferation of cash-hungry think tanks, advocacy work has migrated from lobbyists to think tanks:
To grow as an institution means struggling to some degree with a Faustian bargain—taking money from donors and, while maintaining the guise of policy objectivity and seriousness, doing the bidding of the lobbyist.
For another view on this topic, read 2007’s Truthiness in Education from the Think Tank Review Project:
At a time when America’s education policymakers have nominally embraced the idea of tying school reform to “scientifically based research,” many of the nation’s most influential reports are little more than junk science… often written by people with little discernible expertise and invariably not subjected to peer review, these reports consistently end with a findings section that supports the ideological preferences of the research sponsor.
Or — if you can find it — check out Kevin Carey’s 2007 ruminations on think tanks: “Mysteries, Puzzles, and Think Tanks.”
More recently, a 2014 Washington Post story (Who funds the new Brookings?) suggested that the new funding (from Walton and others) had an impact on think tanks’ research agendas, if not their conclusions:
Foundations began to place more restrictions on their grants, part of a challenging new trend facing Brookings and other academic institutions in which donors increasingly specify their expectations as part of what they call ‘impact philanthropy.’
There’s also Jay Greene’s “The Death of the Think Tank, R.I.P.,” which argues that the advocacy-driven approach is ultimately doomed to failure because it strips think tanks of their credibility and distinctive role:
Without rigorous research, think tanks just repeat talking points, trying to be more clever in their phrasing and more persistent in their communication so they can be heard above the din of everyone else doing the same.
And Dana Goldstein writes about the limits of reporters’ capacity to digest and describe complicated research in a forthcoming chapter about the MET study.
What to do, then?
When reviewing research produced by think tanks, look not only at funding sources and ideological leanings but also at whether the organization has any track record of producing research whose conclusions don’t match funders’ advocacy positions. For me, demonstrated autonomy from funders’ immediate interests is the single best measure of credibility (besides doing good, methodologically sound work).
When reading education journalism, pay careful attention to how journalists use and identify think tank reports and experts in their stories, and be skeptical of claims that don’t seem to have been verified or vetted. If you look closely, you can sometimes spot a think tank report that may have inspired a story, or an expert whose views go unchallenged. The think tank’s role usually isn’t presented at the top of a piece or in a straightforward way; instead, it’s slipped into an overview/summary section or surfaced as a juicy quote or stat at the end.
For education journalists, it’s important to present caveats and opposing views, and to be clear with readers about the limitations and leanings of the think tank or expert being cited — and about who funded the study. It can often be helpful to ask an academic expert to review a think tank report. You can also learn a lot by asking which of the report’s findings, if any, match or conflict with the funders’ positions.
My personal view is that exclusives, embargoes, and other hidden arrangements between journalists and think tanks should be disclosed to readers, who otherwise have no idea where a story came from — but I’m decidedly in the minority on that front.
Image via Solutions Journalism