As is so often true of rankings based on government data, the Washington Monthly's recent list of American colleges with very poor graduation rates required a few corrections after the issue went to press.

In all cases this was due to errors by the colleges in filling out a routine annual survey issued by the Department of Education.

The Department allows American colleges remarkable leeway in reporting their graduation rates. There's no strict verification of the data; in fact, it's entirely self-reported. Yet somehow many American colleges have trouble filling out this information correctly. That doesn't stop the schools from being very angry. As one university wrote:

[The school] has been incorrectly listed in The Washington Monthly’s “2010 College Dropout Factories Rankings,” … This information is incorrect…. The correct six-year graduation rate… is 22% for the 2001 cohort and 19% for the 2003 cohort.

We are requesting an immediate retraction of the news story and/or a statement, in both the online and print publications, noting the incorrect reporting. This type of false reporting can and will have negative implications for our current and prospective students, alumni, and supporters.

So the reported number would have negative implications, but 19 percent would be acceptable? This letter would be laughable if it weren't so tragic. The fact that any college in America would wish to publicize an abysmal 19 percent graduation rate is disturbing, to say the least.

Many colleges responded in this fashion, angered that we had reported graduation rates they themselves had submitted. In the case of the university above, the school had, in fact, verified the information earlier. Many schools, however, turned out simply not to know how to fill out the Department of Education's survey correctly.

This is the sort of thing we saw often in the course of reporting and editing the last issue. It was particularly evident in the article on graduation rates, "College Dropout Factories," and in Erin Carlyle's piece on one of the best community colleges in America, "Shakespeare with Power Tools." Some of the most effective schools in America, it turns out, aren't successful because of earth-shaking intellect and superhuman effort; they're effective simply because the people who work there don't do a terrible job.

As the Education Sector’s Kevin Carey wrote in the Chronicle of Higher Education a few weeks ago:

Conversations… with policymakers and foundation officials about helping more students earn college degrees… tend to go like this: First, we need a “research strategy” to identify “best practices” that have a statistically significant impact on college graduation. Then we need a “dissemination strategy” to communicate those practices to administrators and practitioners. Colleges will adopt the best practices, and graduation rates will rise.

I think this is mostly wrong.

Maintaining an up-to-date list of available tutors, calculating financial aid accurately, placing students in the right classes, picking up garbage, and maintaining elevators aren’t “best” practices. They are “minimally competent” practices.

Add to that “fill out the Department of Education’s information survey correctly.”

So why does mere competency appear to be such a rare commodity in higher education?