Political scientists — not to mention politicians — have for decades been preoccupied with the question of how citizens come to believe what they do, why they are so easily led astray by false stories, and what can be done to correct this when it occurs.

An interesting new study out of Duke points out just how complicated it can be to correct false beliefs — particularly in the chaos of an election-season media environment.

From the study’s press release:

The researchers gave 50 Duke undergraduate students a 120-question test on basic science information, with questions including: What is stored in a camel’s hump? How many chromosomes do humans have? What is the driest area on Earth? After answering each question, students rated their confidence in their response, and then received the correct answer as feedback. Half the students were retested six minutes later, while the other half were retested one week later.

Students who were retested immediately corrected 86 percent of their errors. As expected, their responses showed a hypercorrection effect — they were more likely to correct errors that they had made with high confidence relative to low-confidence errors.

In contrast, students who were retested one week later also showed a hypercorrection effect. However, these students only corrected 56 percent of their errors, indicating they had forgotten many of the correct answers that they had learned from the feedback.

When students forgot the correct answer over the one-week delay, the opposite of the hypercorrection effect occurred — the higher their confidence in their initial error, the more likely they were to re-produce that same error on the final test.

It’s hard not to read this as a blow to the ambitions of sites like Media Matters and FactCheck.org, especially when you take into account how the average American consumes their political media.

If Teddy, a somewhat representative American consumer of political news and opinion, has long believed that Obama raised his taxes, and has long been confident in this belief — why wouldn’t Obama raise taxes, after all? That’s just what those Democrats do! — then any correction of this belief may well prove to be temporary. It’s not as though Teddy will be seeking out debunkings of his beliefs, so if he happens to stumble upon one it will likely quickly be swamped by his usual diet of conservative websites.

So it’s not just a matter of correcting false information. It has to be done in a very precise, intelligent way, and I don’t think anyone has yet mastered this science. Andrew Butler, the lead researcher of the Duke study, notes, “If students practice retrieving the correct information, then they may be able to avoid reverting back to their deeply entrenched false knowledge.”

Might not work in a political communications context — what are you going to do, tell Teddy that Obama didn’t raise his taxes and then ask him to write that fact on a blackboard a hundred times so it sinks in?

Jesse Singal

Jesse Singal is a former opinion writer for The Boston Globe and former web editor of the Washington Monthly. He is currently a master's student at Princeton's Woodrow Wilson School of Public and International Affairs. Follow him on Twitter at @jessesingal.