U.S. News & World Report’s college rankings have long been a source of concern for those interested in determining college quality. Part of the trouble stems from the fact that 25 percent of a college’s ranking (more weight than any other factor) comes from a survey of the college’s reputation. Essentially this reputation survey, or peer assessment, is a measure of how people “feel” about a college. It turns out the reputation ranking is pretty much fixed from year to year. According to an article in Inside Higher Ed:

Two scholars evaluated changes in reputational scores of colleges and then looked for correlations between those changes and other factors that U.S. News declares are important and recalculates each year: graduation and retention, faculty resources, selectivity and financial resources. The theory behind the study was that if these are key measures of quality in the magazine’s view, institutions that change in these categories should also experience reputational changes over time. But they didn’t — while the correlation that was clear was reputation with the previous year’s rankings.

Basically, the reputation a college had one year was the only indicator of the reputation it would have the next year, regardless of any changes that actually occurred at the school.

Robert Morse, the director of data research for U.S. News & World Report, protests that this should not be much of a surprise, since schools change very slowly. “U.S. News believes that the peer assessment scores are measuring something valuable and help provide highly useful information about the relative merits of schools,” Morse told Inside Higher Ed.

Reputation surveys surely measure “something valuable.” What’s odd is that U.S. News currently treats that something as the single most important indicator of American college quality.

Daniel Luzer is the news editor at Governing Magazine and former web editor of the Washington Monthly. Find him on Twitter: @Daniel_Luzer