Just like the leaves changing colors and students returning to school are clear signs of fall, another indicator of the change in seasons is the proliferation of college rankings that get released in late August and early September. The Washington Monthly college rankings that I compile were released the week before Labor Day, and MONEY and The Wall Street Journal have also unveiled their rankings recently. U.S. News & World Report caps off rankings season by publishing its undergraduate rankings later in September.
People quibble with the methodology of these rankings all the time (I get e-mails by the dozens about the Washington Monthly rankings, and we're not the 800-pound gorilla of the industry). Yet at least these rankings are all based on data that can be defended to some extent, and the methodologies are generally transparent. Even rankings of party schools, like this Princeton Review list, have a methodology section that does not seem patently absurd.
But since America loves college rankings—and colleges love touting rankings they do well in and grumbling about the rest of them—a number of dubious college rankings have developed over the years. I was recently forwarded a press release about one particular set of rankings that immediately set my BS detectors into overdrive. It was about a ranking of the top 20 fastest online doctoral programs, and here is a link to the rankings, which will not boost their search engine results.
First, let’s take a walk through the methods section. There are three red flags that immediately stand out:
(1) The writing resembles a “word salad” and clearly was never edited by anyone. Reputable rankings sites use copy editors to help methodologists communicate with the public.
(2) College Navigator is a good data source for undergraduates, but does not contain any information on graduate programs (which they are trying to rank) other than the number of graduates.
(3) Reputable rankings will publish their full methodology, even if certain data elements are proprietary and cannot be shared. And trust me—nobody wants to duplicate this set of rankings!
As an example of what these rankings look like, here is a screenshot of how Seton Hall’s online EdD in higher education is presented. Again, let’s walk through the issues.
(2) Acceptance/retention rate data are for undergraduate students, not for a doctoral program. The only way they could get this data is by contacting programs, which costs money. It also leads to inevitable logistical problems.
(3) Seton Hall is accredited by Middle States, not the Higher Learning Commission. (Thanks to Sam Michalowski for bringing this to my attention via Twitter.)
(4) A slightly more important point is that Seton Hall does not offer an online EdD in higher education. Given that I teach in the higher education graduate programs and am featured on the webpage for the in-person EdD program, I’m pretty confident in this statement.
For any higher education professionals who are reading this post, I have a few recommendations. First, be skeptical of any rankings that come from sources with which you are not familiar, and triple that skepticism for any program-level rankings. (Ranking programs is generally much harder due to a lack of available data.) Second, look through the methodology with the help of institutional research staff members and/or higher education faculty members. See if it passes the smell test. And finally, keep in mind that many rankings websites are only able to be profitable by getting colleges to highlight their rankings, thereby driving clicks to these sites. If colleges were more cautious about posting dubious rankings, they would shut down some of these websites. In so doing, they would also avoid the embarrassment that comes when someone finds out that a college fell for what is essentially a ruse.
[Cross-posted at Kelchen on Education]