Does Plagiarism Detection Technology Work?

Policymakers generally tout technology in college as a good thing, since it offers the potential to cut costs and expand access. But when academics talk about new technology in higher education, they're often complaining. That's because technology has made academic cheating a lot easier.

Pretty much any topic a student is assigned for a college paper has been assigned to some other college student at some point in the past. A student can just download a paper from somewhere and hand it in the next day after spending 20 minutes or so reformatting it.

Luckily there are technological solutions to this, in the form of computer programs like Turnitin. For a fee, universities get access to software that lets professors upload student papers and check them against other papers for plagiarized content.

This sounds useful, but it turns out the programs often don't work so well. According to an article at Inside Higher Ed:

The data come from Susan E. Schorn, a writing coordinator at the University of Texas at Austin. Schorn first ran a test to determine Turnitin’s efficacy back in 2007, when the university was considering paying for an institutionwide license. Her results initially dissuaded the university from paying a five-figure sum to license the software, she said. A follow-up test, conducted this March, produced similar results.

Here’s what happened:

Schorn created six essays that copied and pasted text from 23 different sources, which were chosen after asking librarians and faculty members to give examples of commonly cited works. Examples included textbooks and syllabi, as well as websites such as Wikipedia and free essay repositories.

Of the 23 sources, used in ways that faculty members would consider inappropriate in an assignment, Turnitin identified only eight, but produced six other matches that found some text, nonoriginal sources or unviewable content. That means the software missed almost two-fifths, or 39.13 percent, of the plagiarized sources.
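The "almost two-fifths" figure follows directly from the counts in the test, treating both the eight full identifications and the six partial matches as caught:

```python
# Arithmetic behind the quoted test results: 23 plagiarized sources,
# 8 identified outright, 6 more flagged only partially, and the rest
# missed entirely.
total = 23
identified = 8
partial = 6

missed = total - identified - partial
miss_rate = missed / total
print(f"{missed} of {total} sources missed: {miss_rate:.2%}")
```

Nine missed sources out of 23 works out to just over 39 percent.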

Programs like Turnitin also tend to flag content as "not original" even when it's not plagiarism at all, just sort of bad writing. In particular, they tend to find problems with jargon and exactly the sort of unoriginal phrasing that's pretty common in normal, original freshman papers.

What might be a better way to guard against plagiarism? Making students submit a bibliography and multiple drafts might be the most effective method, but that's quite time-consuming (and a little patronizing to students). What works better is pretty simple, it seems. According to the article:

Google — which Schorn notes is free and worked the fastest – trounced… proprietary products. By searching for a string of three to five nouns in the essays, the search engine missed only two sources. Neither Turnitin nor SafeAssign identified the sources Google missed.
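The check Schorn describes is something anyone can do by hand: pull a short distinctive phrase from a suspect passage and search for it as an exact match. Here's a rough sketch of that step in Python. Note the noun-picking below is just a crude stopword heuristic standing in for real part-of-speech tagging, and the passage is an invented example; the article doesn't say how Schorn selected her nouns.

```python
# Sketch of the manual Google check: extract a run of content words
# from a passage (a crude proxy for "three to five nouns") and build
# an exact-phrase search URL for it. The stopword list and example
# passage are assumptions for illustration, not from the article.
from urllib.parse import quote_plus

STOPWORDS = {
    "the", "a", "an", "of", "in", "on", "to", "and", "or", "is",
    "are", "was", "were", "that", "this", "with", "for", "by", "it",
}

def candidate_phrase(passage: str, length: int = 4) -> str:
    """Return a run of `length` content words from the passage."""
    words = [w.strip(".,;:!?\"'()") for w in passage.split()]
    content = [w for w in words if w and w.lower() not in STOPWORDS]
    return " ".join(content[:length])

def search_url(passage: str) -> str:
    """Build a quoted (exact-match) Google search URL for the phrase."""
    phrase = candidate_phrase(passage)
    return "https://www.google.com/search?q=" + quote_plus(f'"{phrase}"')

passage = ("The encyclopedia describes the history of the printing "
           "press in early modern Europe.")
print(search_url(passage))
```

If the quoted phrase turns up verbatim in a source the student didn't cite, that's a strong signal worth a closer look, and it costs nothing.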

Programs like Turnitin might just not be worth the huge amounts colleges are paying. The total institutional cost can run as high as $30,000, depending on enrollment. For that money, maybe it's not worth bothering with such things.

Daniel Luzer

Daniel Luzer is the news editor at Governing Magazine and former web editor of the Washington Monthly. Find him on Twitter: @Daniel_Luzer