I’ve written before about the problem of academic cheating in college, particularly students who turn in papers they didn’t really write. This, of course, has always happened, but the Internet makes it a lot easier for students to cheat, which is probably why professors started to notice a problem.

And so entered Turnitin, a program designed to help professors detect plagiarism in student papers. Universities and high schools buy expensive licenses for a system in which students submit all of their essays to the Turnitin website before professors read their work. Turnitin then “checks the documents for unoriginal content.” But is it really working?

I’ve long been skeptical of this program, particularly because the company that owns Turnitin also sells a product that seems suspiciously designed to help students cover up plagiarism. But Turnitin says its detection tool is effective, and now the company has research to prove it. According to a piece at Inside Higher Ed:

In a study… released Wednesday morning, Turnitin tracked the decrease in “unoriginal writing” — meaning writing that scored 50 percent or higher on the software’s Overall Similarity Index — at 1,003 non-profit colleges and universities in the U.S. that had used Turnitin for five years.

Most institutions started experiencing drops in unoriginal writing by the third year of Turnitin use, and by year four, not a single type of institution reported an increase. In the fifth and final year of the study, every class posted a double-digit decrease, ranging from 19.1 percent among four-year institutions with fewer than 1,000 students to 77.9 percent among two-year colleges with 3,000 to 5,000 students. Overall, unoriginal writing decreased by 39.1 percent.

Here’s the graph showing the decline:

Wow, almost 40 percent. But there’s reason to believe Turnitin might be a little, well, creative here in its conclusions.

Stanford education professor Thomas Dee said he’s a little skeptical of the results. The report, he said, seemed to offer “suggestive, descriptive evidence,” not “convincing causal evidence.”

It’s a kind of selection bias story. They’re looking into the prevalence of unoriginal content over time — but think about the professors who might be picking up Turnitin. They may be the ones who feel they have the biggest problem with plagiarism in their class, so you’re going to get a very high baseline of unoriginal content. The profs who were late adopters could be those who see only marginal benefits. That would also create the appearance of a treatment effect when none may exist.
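
Dee’s point is essentially a regression-to-the-mean argument, and a toy simulation makes it easy to see. The sketch below is purely illustrative and uses made-up numbers, not anything from the Turnitin study: classes that happen to have an unusually bad year adopt the software, and their rates then drift back toward their own long-run baselines, producing an “apparent decline” even though nothing was treated at all.

```python
import random

random.seed(0)

N = 10_000          # hypothetical number of course sections
BASELINE = 0.20     # illustrative long-run rate of "unoriginal" papers
NOISE = 0.10        # year-to-year fluctuation around each section's baseline

def observed_rate(baseline):
    """One year's observed plagiarism rate: baseline plus random noise."""
    return max(0.0, baseline + random.uniform(-NOISE, NOISE))

# Each section has its own stable baseline; nothing about it ever changes.
baselines = [max(0.0, random.gauss(BASELINE, 0.05)) for _ in range(N)]

# "Adoption year": sections that happen to observe an unusually bad year
# are the ones that sign up for the detection software.
year0 = [observed_rate(b) for b in baselines]
adopters = [i for i, r in enumerate(year0) if r > 0.28]

# A few years later: same baselines, same noise, zero treatment effect.
year3 = [observed_rate(baselines[i]) for i in adopters]

before = sum(year0[i] for i in adopters) / len(adopters)
after = sum(year3) / len(adopters)
print(f"adopters' year-0 rate: {before:.3f}")
print(f"adopters' year-3 rate: {after:.3f}")
print(f"apparent decline: {100 * (before - after) / before:.1f}% with no true effect")
```

Run it and the “decline” among adopters is substantial, despite the simulation containing no effect of the software whatsoever. That’s exactly the kind of artifact Dee is warning about.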

Perhaps more importantly, even if the decline is real, it’s not entirely clear that Turnitin caused it. The rise of the Internet prompted colleges to institute lots of changes to protect academic integrity; the software is only one of them.

Many schools now require students to sign specific anti-plagiarism pledges at the beginning of the year. Some have increased the penalties for students found guilty of the offense. Some schools and colleges have also restructured academic work to make cheating harder.

Daniel Luzer

Daniel Luzer is the news editor at Governing Magazine and former web editor of the Washington Monthly. Find him on Twitter: @Daniel_Luzer