Imagine you’re about to put a chunk of your life savings into a mutual fund. Now imagine you peruse the various “best mutual fund” guides on the news rack, only to find they’re all missing crucial pieces of information. The guides list where the fund managers went to college, how much investment capital they’ve attracted, and what kind of “experience” investors had at the annual fund meeting. But they don’t tell you what you most want to know: What the funds’ rates of return have been–or if they’ve ever made a dime for anyone. You might still decide to invest in a mutual fund, but it would be a heck of a crapshoot. And with their scorecard hidden, fund managers wouldn’t be under much pressure to perform, let alone improve.
That imaginary mutual-fund market pretty much shows how America’s higher-education market works. Each year prospective college students and their parents pore over glossy brochures and phone-book-sized college guides in order to decide how to invest their hard-earned tuition money–not to mention four years of their lives. Some guides, like the popular rankings published by U.S. News & World Report, base ratings on factors like alumni giving, faculty salaries, and freshman SAT scores. Others identify the top “party schools,” most beautiful campuses, and most palatial dorms.
But what’s missing from all the rankings is the equivalent of a bottom line. There are no widely available measures of how much learning occurs inside the classroom, or of how much students benefit from their education. This makes the process of selecting a college a bit like throwing darts at a stock table. It also means that colleges and universities, like our imaginary mutual-fund managers, feel little pressure to ensure that students learn. As anyone who’s ever snoozed through a giant freshman psychology 101 lecture knows, sitting in a classroom doesn’t equal learning; knowledge doesn’t come by osmosis.
To be sure, determining the quality of a college education isn’t as simple as calculating the yield of a mutual fund. But it’s not impossible either. In fact, some reliable measures of student learning, engagement, and post-graduation success have already been developed. These measures reveal where professors are the most effective at teaching, where graduates readily find jobs, and where students walk away with little more than expertise in conspicuous beer consumption. So why haven’t you heard about these measures? Because many school administrators don’t want you to know. Putting their grades on the table is the last thing many colleges and universities want–especially since those grades would likely show that many of the elite colleges so prized by striving students are not backing up their lofty reputations by doing the best job of helping students learn.
Measures that work
There are three basic ways of trying to measure how well colleges educate students. The most obvious is to use some form of a standardized test. That’s how K-12 schools are evaluated. Given the difficulty and controversy K-12 testing has entailed, using standardized tests for college students might seem impossible at first. Elementary and secondary students are at least expected to complete similar courses, to learn the same rules of punctuation and applications of the Pythagorean theorem. Undergraduate studies are far more diverse: Some students choose to spend four years immersed in Ovid, others in organic chemistry.
But there turns out to be an answer: Instead of testing discrete pieces of knowledge, test the higher-order critical thinking, analysis, and communication skills that all college students should learn (and which employers value most). The Collegiate Learning Assessment, recently developed by a subsidiary of the RAND Corporation, does exactly that. Instead of filling in bubbles with a No. 2 pencil, CLA test-takers write lengthy essays, analyzing documents and critiquing arguments.
While several hundred colleges and universities have participated in the CLA, most have kept their results confidential. The University of Texas System, however, has made results public, and they’re surprising. The CLA tests freshmen and seniors, gauging the amount of learning students gain during their college careers. Senior scores are also compared to the scores predicted by students’ ACT or SAT results. The best Texas university by this measure isn’t the flagship, highly ranked UT-Austin campus. The biggest gains are occurring at UT-San Antonio, UT-El Paso, and UT-Permian Basin, all of which are at the bottom of the U.S. News rankings.
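For readers who want to see the arithmetic behind that comparison, here is a minimal sketch of the general “value-added” idea: predict each school’s average senior score from its students’ entering SAT scores, then treat the gap between actual and predicted scores as a rough measure of how much the institution added. The school names and numbers below are invented for illustration, and the real CLA methodology is considerably more sophisticated.

```python
# Hypothetical data: (institution, mean entering SAT score, mean senior test score)
schools = [
    ("Flagship U",  1290, 1210),
    ("Regional A",  1040, 1170),
    ("Regional B",   980, 1105),
    ("Selective C", 1350, 1230),
]

# Fit a simple least-squares line across institutions: predicted = a * sat + b
n = len(schools)
mean_sat = sum(s[1] for s in schools) / n
mean_score = sum(s[2] for s in schools) / n
cov = sum((s[1] - mean_sat) * (s[2] - mean_score) for s in schools)
var = sum((s[1] - mean_sat) ** 2 for s in schools)
a = cov / var
b = mean_score - a * mean_sat

# A school's "value added" is how far its seniors actually score above or below
# what its entering SAT profile alone would predict.
for name, sat, score in schools:
    predicted = a * sat + b
    print(f"{name:12s} predicted {predicted:7.1f}  actual {score:5d}  value-added {score - predicted:+6.1f}")
```

Run on numbers like these, the less selective schools can come out well ahead of the flagship once you look at gains rather than raw scores, which is exactly the pattern the Texas results suggest.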
The second, better-accepted way of measuring university quality is to take one step back from gauging actual learning and instead measure the teaching practices and university environments that, evidence shows, usually lead to learning. Years of study have shown that the more time and effort students spend researching papers, interacting with faculty, and studying with classmates, the more they learn, provided their efforts are well directed. Students’ academic engagement can also be measured across disciplines and institutions.
In 1998, a group of educators sat down to translate the research on how students learn into an assessment tool for colleges. Convened by Russell Edgerton, former president of the American Association for Higher Education and then director of education programs for the Pew Charitable Trusts, an all-star cast of higher-education experts developed a comprehensive survey asking freshmen and seniors to report the number of books they read, the papers they write, and the hours they spend preparing for class, along with indicators of student collaboration, student-faculty interaction, and the overall campus environment.
This evaluation, called the National Survey of Student Engagement (NSSE), was launched two years later, with over 275 colleges and universities participating. As of 2006, nearly 1,000 colleges have been evaluated, each receiving a detailed statistical analysis of how well its students are being academically engaged. Housed at Indiana University and administered annually at a cost to each college of as little as $1.50 per student surveyed, NSSE shows colleges not only how well they’re performing but also how they stack up against the competition–for instance, whether a school ranks above or below the average among its peer institutions for faculty providing prompt feedback on student work.
Edgerton and Pew convened the original 1998 meeting looking for an alternative to the U.S. News rankings. But after investing over $3.5 million to develop and roll out the survey, they wanted NSSE to be widely used and financially self-sustaining. That meant getting a lot of institutions both to participate and to pay for the privilege. Many were willing, on one condition: that the results be kept in-house and away from public eyes. Institutions knew that public data would inevitably be used to rank and compare colleges. They didn’t know where the survey would put them, and they worried about looking bad relative to their peers.
As a result, NSSE results for most colleges are–like results from the CLA–unavailable to the public. U.S. News has asked for NSSE results, but has only been able to publish what institutions release voluntarily. Fewer than 15 percent of the colleges ranked by the magazine have complied, and none of the top-tier national universities have released results. The newsmagazine Maclean’s, which ranks Canada’s 47 universities, recently tried a different tack, using freedom-of-information requests to pry NSSE data out of the hands of Canadian public universities. But it would be an immense legal challenge to use this approach on the hundreds of U.S. public universities, and private colleges wouldn’t have to comply at all. The only way to get full NSSE data on all schools would be to make disclosure mandatory. Sen. Edward Kennedy (D-Mass.) floated legislation to do exactly that a few years ago, but it was quickly torpedoed.
It’s understandable that the higher-education establishment–in particular the elite, sought-after schools–would have deep qualms about giving prospective students access to NSSE results: By all indications, that data does much to undermine those schools’ claims of superiority. Though NSSE doesn’t release data about individual institutions, it does release studies based on that data. In a 2005 report, NSSE analysts found no statistically significant relationship between effective teaching practices and admissions selectivity as rated by the popular Barron’s Guide to Colleges. Like the CLA, NSSE suggests that the long-established higher education pecking order may have little to do with who educates students best.
The work connection
The third way to judge colleges is by measuring what happens to students after they graduate, such as how quickly they find work and how likely they are to receive promotions. Only a handful of elite universities attempt to maintain databases of high-earning alumni; most institutions have no idea what careers their graduates enter. But this information is actually available; it just hasn’t been connected in the right way. State governments gather data about earnings and field of employment for virtually every wage-earner in the nation, so that they can calculate unemployment insurance benefits for people who are laid off. This data can be matched with student records provided by colleges and universities.
That would give students and parents a huge amount of new, detailed information about which colleges help their graduates get jobs in their field of study and earn a good living. Say you’re a Hispanic high-school senior who wants to design the next-generation space shuttle or send men to Mars. You’d want to know which universities nationwide graduate the most Hispanic engineers who get well-paying jobs in the aerospace industry. Linking education and employment data–information that already exists today–would give you the answer.
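The mechanics of such a linkage are not exotic. Here is a minimal sketch, in a few lines of Python, of the kind of join involved: match graduation records to state wage records on a shared student identifier, then summarize earnings by institution and major. Every name, identifier, and dollar figure below is hypothetical; a real system would rely on protected identifiers and far richer records.

```python
from collections import defaultdict
from statistics import median

# Hypothetical graduation records: (student_id, institution, major)
graduates = [
    ("s001", "State U",       "Aerospace Engineering"),
    ("s002", "State U",       "English"),
    ("s003", "Metro College", "Aerospace Engineering"),
    ("s004", "Metro College", "Aerospace Engineering"),
]

# Hypothetical unemployment-insurance wage records: student_id -> (industry, annual wages)
wages = {
    "s001": ("Aerospace",  52000),
    "s002": ("Publishing", 29000),
    "s003": ("Aerospace",  48000),
    "s004": ("Aerospace",  61000),
}

# Join the two sources and group earnings by (institution, major).
earnings_by_program = defaultdict(list)
for student_id, institution, major in graduates:
    if student_id in wages:  # only graduates found working in state wage files
        industry, pay = wages[student_id]
        earnings_by_program[(institution, major)].append(pay)

# Report median first-year earnings per program: the "bottom line" a
# prospective student could compare across schools.
for (institution, major), pays in sorted(earnings_by_program.items()):
    print(f"{institution:14s} {major:24s} median ${median(pays):,.0f}  (n={len(pays)})")
```

The hard part, in other words, is not the computation; it is getting institutions and agencies to agree to connect records they already hold.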
A handful of states have already made the connection. The Florida Department of Education publishes an annual list of what graduates of the state’s nine public universities earn the fall after graduation, for those who stay in state to work. The results aren’t what one might expect. The 2004 graduates of the University of Florida, the state’s most prestigious and selective public university and a top-tier institution according to U.S. News, earned $25,773 per year on average. Graduates of Florida International University, which U.S. News puts in the bottom tier, earned $34,756, the highest in the state. Once again, some low-ranked universities appear to be doing better than the conventional wisdom gives them credit for.
But these numbers don’t appear to have had much of an impact on college choices in Florida–partly because they only apply to students who attend public schools and stay in-state to work, and partly because the Florida Department of Education doesn’t even list them on its Web page for students trying to choose a college. Students need ready access to detailed employment outcomes for every college and university in the country. That would take action by the federal government. Unfortunately, the private college lobby is bound and determined to prevent that from ever occurring.
The U.S. Department of Education recently asked colleges and universities to submit enrollment, graduation, and financial records for every student, in order to better calculate things like institutional graduation rates and student costs. Yet despite the fact that strict federal privacy laws would prohibit the names of individual students from ever being released, lobbyists for private colleges put on a full-court press to block the proposal, loudly denouncing it as “Orwellian” and “an assault on Americans’ privacy and security in the shadow of the Fourth of July.”
Keeping it hush-hush
Higher education’s reluctance to be held publicly accountable for student results is particularly frustrating because years of solid research show there is much that colleges could be doing, but aren’t, to improve the way they teach. From 1999 to 2004, Dr. Carol Twigg of the National Center for Academic Transformation at Rensselaer Polytechnic Institute worked with 30 colleges and universities to improve their large introductory classes (50 percent of all enrollments at community colleges and 35 percent of enrollments at four-year schools fall within just 25 introductory courses in foundational subjects like English and biology). Instead of passively absorbing information in a cavernous lecture hall, students worked in active learning environments where they had online access to tutorials, student discussion groups, and real-time, on-demand feedback and support. The technology also reduced the amount of time instructors needed to prepare lectures, introduce content, and grade homework, lowering staff costs per student taught.
The result: more learning at a lower cost to the university. Scores in a redesigned biology course at the University of Massachusetts increased by 20 percent, while the cost to the university per student dropped by nearly 40 percent.
But while Twigg’s efforts are widely known in higher education circles, there has been no great rush to replicate them nationwide. That’s because college administrators don’t feel much pressure, for the sake of their careers or of the bottom line, to copy educational best practices. Indeed, universities are notorious for basing their hiring and tenure decisions on publishing and prestige, hardly indicators of the quality of teaching. The amount of time a professor devotes to publishing may be inversely related to the quality of undergraduate instruction. Improving educational quality is a fundamentally optional goal for colleges–and that’s unlikely to change without external pressure.
The refusal of colleges and universities to be held accountable is beginning to create a backlash in Washington, though not necessarily a helpful one. Earlier this year, the GOP-controlled House of Representatives voted to require all colleges and universities that increase tuition and fees by more than double the inflation rate to justify their actions to the federal government. While it stems from an understandable concern about college affordability, the move smacks of Nixon-era price controls (ironic coming from a Republican Congress). But the bigger problem is that lawmakers are mandating new reporting in the part of the market that’s already most transparent: tuition and fees. Instead of trying to regulate the price of college, Congress should require colleges to disclose what students and parents don’t already know: the quality of teaching in the classroom. Then students can decide for themselves if a particular institution is worth the money.
That kind of federally mandated disclosure is anathema to colleges that have long enjoyed a comfortable cocoon of privacy. But it would certainly be familiar terrain for the thousands of publicly traded companies that file detailed financial reports each quarter with the Securities and Exchange Commission. Despite the fact that these companies don’t enjoy higher education’s tax-exempt status, they are far more transparent and accountable to the public.
The higher-education sector is ultimately driven by the market. Colleges and universities will strive and compete on whatever terms the market provides. As long as status and success are predicated on building endowments and recruiting more students with high SAT scores, college leaders will continue to focus on fundraising, marketing, and little else. If, on the other hand, success meant teaching students effectively and helping them do well in their lives and careers, universities would change their priorities. They’d recruit better teachers and seek out the Carol Twiggs of the world. Institutions that had focused on their mission–teaching students–and been great at it would finally get the recognition they deserve. But success can’t be defined by teaching and learning until public information is available about teaching and learning. We live in a time when that kind of information is finally within reach. Too bad most of it is locked behind ivy-covered walls.