
The University of South Florida looks a lot like the surrounding city of Tampa, which is to say that it looks like most of the American places that have filled up with people over the last 100 years: sunny and sprawling and composed of newish brick, glass, and concrete buildings between five and ten stories high. You may not have heard of USF, or at least not in a way that made a lasting impression, because it isn’t a synecdoche for the allegedly meritocratic class system, nor the home of a championship football team. It is merely the kind of university where most young adults actually get a college education.

Like many of the public institutions that make up the backbone of the American higher education system, USF faces an uncertain future. Four decades of tax-revolt politics and mismanaged finances have led state legislatures to slash public funding for higher learning, forcing public universities to raise tuition and compete on the open market against schools with comparative advantages. Community college degrees are cheap, short, and job focused. The most prestigious universities provide the stamp of selectivity. For-profit schools aggressively market convenient online programs. Small liberal arts colleges offer a personalized experience where everybody knows your name.


While USF is trying to compete in each of these arenas (the football team was 10–2 last year), the main point of being a sizable public university is to serve a large and diverse population of students. Not just the top 1 percent, but people who represent the whole population of Florida, including immigrants, first-generation students, low-income families, and people of color. Educating 51,000 students at three campuses spread across Tampa, Sarasota, and St. Petersburg can lead to the standard rap on big state schools, namely that students are anonymous, unnoticed, and disconnected—“just a number.”

Which is why the most interesting thing about USF is how it is turning the whole idea of students as numbers on its head. Numbers, after all, can be analyzed. They can be acted upon and changed. The real tragedy of modern higher education is when students aren’t even seen as numbers—when, in other words, they aren’t seen at all.

USF and a small but growing number of colleges and universities are at the forefront of using information technology and advanced statistical analysis to see students in whole new ways. By sifting through vast stores of information that have accumulated in various administrative and educational data systems, they are discovering patterns about students that they never knew about before—why some succeed while others fail, and what can be done to help them. As a result, they’re starting to crack the stubborn, widespread problem of high college dropout rates, and point toward a future where besieged public institutions can continue to thrive.

“Predictive analytics” is the name for this data-dependent approach, and you can be forgiven for wondering if it’s just another overhyped high-tech fad. The techno-optimism of the late 1990s and early 2000s has given way to a world in which the tendency of technological innovation to improve our lives is much more in doubt. The exploitation of our personal data can border on the dystopian, whether it’s Cambridge Analytica mining our Facebook profiles or health insurance companies raising our rates based on our online shopping history.

But there are reasons to believe that predictive analytics in higher ed could end up being the real deal. First, institutions like USF have already shown that it can work to improve student outcomes. At Georgia State University, for example, analytics was a key part of a push that saw a 30 percent jump in bachelor’s degrees conferred in a five-year span. Second, there is an ecosystem of vendors who already know how to gather and interpret the data—schools don’t have to figure it out on their own. Finally, there are powerful economic incentives for other schools to hop on the bandwagon. As more states shift to performance-funding mechanisms that reward colleges for rates of persistence (students sticking around year to year) and graduation, there’s more pressure to figure out not just how to get students to apply and attend, but how to keep them in once they arrive. And with a flattening or even declining population of college-age students, all but the most selective schools will have to start considering a broader pool of applicants, including ones who might be less academically promising. Figuring out how to get more of them to graduate means more years of tuition dollars coming in.

If more institutions start behaving like USF, the results could be enormous. Even boosting graduation rates by just 5 percent nationwide would mean another hundred thousand people earning a bachelor’s degree every year.
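A rough back-of-the-envelope check, reading “5 percent” as 5 percent more graduates and assuming roughly two million bachelor’s degrees conferred in the United States each year (an approximation, not a figure from this article):

\[
0.05 \times 2{,}000{,}000 \ \text{degrees per year} \approx 100{,}000 \ \text{additional graduates per year}
\]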

But there are headwinds pushing against the successful spread of predictive analytics. Universities may be politically liberal, but they’re institutionally conservative. Administration and academics tend to be walled off from each other. Making productive use of all the data that now exists on students will require them to shed some deeply ingrained bad habits. Collecting the data is the easy part. Using it the right way is the challenge.

The state of Florida began the twentieth century with three public universities: Florida State and Florida A&M in Tallahassee, and the University of Florida in Gainesville. By the mid-1950s, it still had only three, for the simple reason that most of the state below Gainesville was a hot, humid, barely habitable swamp for half the year. But then affordable air conditioning became available, so the legislature established a new public university in what was then, practically speaking, south Florida: USF, in Tampa.

As the state grew in population over the next half century, USF grew with it. The state legislature encouraged expansion by providing financial rewards to universities that got larger. It didn’t matter how many students graduated, just how many walked in the front door.

Then 2008 happened. Florida’s economy had been buoyed by the housing bubble and suffered immensely during the crash. To manage a shrinking budget, the legislature made a fundamental change. In 2013, it passed a law establishing “performance funding” for Florida’s public universities. A pool of state funding would be directly tied to a school’s score on certain criteria, including persistence and graduation rates. The three worst-performing schools would miss out on this funding entirely.

This was a challenge for Florida university administrations, in large part because of the peculiar way modern universities are organized. Traditionally, the administration is in charge of facilities and finances and athletics. It also runs the admissions and financial aid offices and so decides which and how many undergraduates to enroll. But once students arrive on campus, they’re turned over to the faculty, who advise and teach however they see fit. The relationship between professors and administrators at the typical research university is less an ongoing partnership than a fragile detente between warring tribes.

But USF had a head start. It had already launched an initiative aimed at improving student outcomes under the leadership of its well-respected president, Judy Genshaft, who had been on the job for ten years and had built up credibility with congenitally suspicious faculty. In 2010, the administration appointed a respected historian of Latin America named Paul Dosal to be the university’s vice provost for student success. It was a new job title, one that existed at few if any other universities at the time. The fact that “student success” is not a goal around which most colleges are administratively organized says a lot about why so many college students are unsuccessful.


USF began by turning college advising over to a cadre of trained professional full-time advisers. Faculty were happy to give up the responsibility, since it meant more time for teaching and research. The advisers noticed—and USF data confirmed—that many students were slamming into a relatively small number of required courses with high rates of withdrawal and failure. That created frustration and squandered time and money—two things USF’s most vulnerable students didn’t have to spare. The courses were reengineered without, the university notes, compromising academic rigor.

The university also took a hard look at its financial aid system. Historically, colleges have spent the most time and energy examining students before they actually enroll. Drawing on methods honed in the cauldron of airline ticket pricing, expensive consultants help colleges analyze the troves of financial information students and parents are required to disgorge during the admissions process, along with standardized tests, high school transcripts, and indicators of student interest (campus visits, legacy status, interviews), to estimate how likely a given student is to apply for admission, the odds of them accepting an offer, and the price it will take to close the sale.
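As a rough illustration of the kind of model those consultants build (a minimal sketch, not any vendor’s or college’s actual method; the data file and field names below are invented), an admitted-student “yield” model can be fit to historical outcomes and then used to test how an aid offer shifts the estimated odds of enrolling:

```
# Illustrative sketch only -- not any consultant's or college's actual model.
# Assumes a hypothetical CSV of past admitted students with an 'enrolled' label.
import pandas as pd
from sklearn.linear_model import LogisticRegression

INPUTS = ["sat_score", "hs_gpa", "family_income", "campus_visit", "legacy", "aid_offer"]

history = pd.read_csv("admitted_students.csv")  # hypothetical historical file
yield_model = LogisticRegression(max_iter=1000).fit(history[INPUTS], history["enrolled"])

# Estimate how larger aid offers change one admitted student's odds of enrolling.
applicant = history[INPUTS].iloc[[0]].copy()
for offer in (5_000, 10_000, 15_000):
    applicant["aid_offer"] = offer
    prob = yield_model.predict_proba(applicant)[0, 1]
    print(f"aid offer ${offer:,}: estimated enrollment probability {prob:.2f}")
```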

USF realized that it had been misallocating financial aid. Some students were getting more than they needed, others not enough. The financial aid office redirected aid to students when it made the difference between them enrolling part time or full time, or between studying during the day and holding down a full-time job. Research shows that, as you’d expect, full-time students with room in their schedule for schoolwork are far more likely to graduate on time.

The reforms worked. The percentage of freshmen who graduated in four years increased from 43 percent for those who enrolled in 2009 to nearly 60 percent for those who enrolled in 2013. Impressively, the gains were spread evenly across demographic groups. Slightly more than half of USF’s undergraduates are nonwhite, and nearly 40 percent have incomes low enough to qualify for a federal Pell Grant. USF’s graduation rates for minority and low-income students are as good as or better than those of their white and more affluent peers—one of the reasons USF’s Washington Monthly ranking is substantially higher than its ranking from U.S. News & World Report.

But by 2015, the improvement had started to flatten out. Meanwhile, the performance funding bill passed by the legislature had dangled another incentive: if USF could meet a series of goals, including a 90 percent rate of freshmen returning for their sophomore year and 70 percent graduating in six years, it would be designated the state’s third “preeminent” public research university, alongside the University of Florida and Florida State, which would make it the first addition to that club since the nineteenth century.

Dosal and his peers knew that the only way to make more progress was to delve even deeper into the sacrosanct realms of teaching and learning, by taking advantage of information the university had never had before.

At USF, as with nearly all universities, student services like health care and residential life had been kept on one side of an administration/faculty demilitarized zone, with academic affairs on the other. So Dosal—recently and very deliberately promoted to vice provost for student affairs and student success—created a new body called the Persistence Committee. Every Thursday morning during the academic year, representatives from fifteen different sections of the university convene in a nondescript conference room, around a set of four beige tables pushed together into a rectangle, and look at pie graphs on a computer screen.

The numbers on the screen are generated by a private Texas-based company called Civitas Learning. Colleges hire Civitas to analyze the troves of digital information that are now routinely gathered about anyone enrolled in higher education. Some data comes from the admissions office, as well as bureaucratic databases maintained by the bursar and registrar. More information is pulled from the university’s online learning management systems (LMS), where students watch videos, join discussion groups, collaborate on writing assignments, and conduct virtual experiments. Civitas even looks at data from the all-purpose smart cards that keep track of students’ meal balances, store money to spend at the bookstore, and unlock the doors to the library, gym, and dorm. College students circa 2018 spend their days in a cocoon of digital information. Who they went to high school with, what they buy, where they go, when they show up for class, and how they learn—it’s all recorded in a university-controlled data system.

Civitas sifted through ten years of USF data and examined over 300 different variables to identify warning signs for students at risk of dropping out. They found that student behavior is much more predictive than simple demographics. Students who log on to the LMS, download materials, click on lectures, and contribute to discussions are much less likely to drop out than students who don’t. Student engagement can now be measured in real time.
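As a minimal sketch of the underlying technique (illustrative only; the file and feature names are hypothetical, and the real models draw on hundreds of variables), a behavior-based persistence model can be fit to historical engagement data and then scored against currently enrolled students:

```
# Illustrative sketch only -- not Civitas's actual system. Assumes a
# hypothetical CSV of past students' weekly LMS activity plus a
# 'persisted' label indicating whether they returned the next term.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

FEATURES = ["logins", "downloads", "lecture_views", "discussion_posts"]

df = pd.read_csv("lms_engagement.csv")  # hypothetical historical file
X_train, X_test, y_train, y_test = train_test_split(
    df[FEATURES], df["persisted"], test_size=0.2, random_state=0
)

# A simple logistic regression stands in for the far richer models
# (300-plus variables over ten years of records) described above.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Predicted probability of persisting for each held-out student -- the kind
# of score that can be refreshed as new engagement data arrives.
persistence_scores = model.predict_proba(X_test)[:, 1]
print(persistence_scores[:5])
```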

These insights are used to analyze USF’s recently enrolled students, flagging risk factors in their prior academic records and, more importantly, in their patterns of behavior once they start taking classes. This data is what the Persistence Committee looks at every Thursday morning.

Usually, most of the students are coded blue for a “Very High” likelihood of persisting. (USF is more selective than most public universities, with a 45 percent acceptance rate and an average SAT score of over 1,200.) But there are also hundreds of students assigned to increasingly alarming colors, all the way to red for “Very Low.” These students are in crisis, right now. Some of them were in the blue category until very recently, when something set off alarm bells inside the system. The job of the Persistence Committee is to figure out what went wrong, and to help before it’s too late.
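The color bands amount to a mapping from a predicted persistence probability to a review tier. A sketch along these lines (the cutoffs here are invented for illustration, not USF’s or Civitas’s actual thresholds):

```
def persistence_tier(prob: float) -> str:
    """Map a predicted persistence probability to a color-coded tier.
    The cutoffs are hypothetical; the real thresholds aren't public."""
    if prob >= 0.90:
        return "Very High (blue)"
    if prob >= 0.75:
        return "High"
    if prob >= 0.50:
        return "Moderate"
    if prob >= 0.25:
        return "Low"
    return "Very Low (red): refer to the Persistence Committee"

print(persistence_tier(0.18))  # -> "Very Low (red): refer to the Persistence Committee"
```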

One morning last fall, the system flagged a young man named Christian. His persistence score had been oscillating back and forth for several weeks, suggesting that he was going days without engaging online with any of his classes. His high school record and aid application didn’t raise any obvious red flags—he was financially and academically prepared to succeed. But something was going wrong.

So the committee dispatched an “academic advocate” to find out more. USF’s Academic Advocacy office is new, created in 2013 and separate from academic advising. It’s another title that doesn’t exist at most universities. Advocates work full time, using a case-management approach, to help students navigate past barriers to graduation wherever they arise. Christian, it turned out, was far from his home in New York and away from his family for the first time. His longtime girlfriend lived in North Carolina, adding to his sense of isolation. He and his advocate began to meet regularly, developing a friendship. He was interested in business, so she steered him toward a friend in the business school for advice, which led to his declaring an interest in majoring in business analytics and intelligence. He managed to keep afloat academically in the first semester, and he raised his GPA to 3.78 in the spring as he moved into the courses that excited him. Now he’s settled in and on track toward graduation.

Another student, Nathan, wasn’t flagged by the system until his first-semester midterm grades came in. He was trying hard, engaging with classes, but not succeeding. Another academic advocate took the case. Nathan didn’t have just one problem: he needed help with course material, time management, and making connections to other students. Meanwhile, his bad midterms were threatening a catastrophic first semester and academic probation. The advocate did triage, helping Nathan withdraw from two classes and focus on the rest. The spring semester went better, and he’s getting ready to start an IT major soon.

Sometimes the problems are financial, and the bursar’s office can arrange a small grant or loan. Sometimes the vagaries of dormitory living are the issue, and residential life intervenes. Socially isolated students might benefit from coaching from upperclassmen. Substance abuse and mental health problems aren’t uncommon among today’s college students, which is where the health services people come in.

Advanced analytics can yield better, faster insights into which students are faltering. But the toughest cases almost always require human judgment to understand what kinds of help students need. In some ways, the most radical innovation at USF isn’t the sophisticated statistical analysis on the computer screen. It’s the table all of the different university offices are sitting around at the same time.

So far, the Persistence Committee appears to be working. The numbers are starting to move up again. Last year, the university received $84.6 million from a state funding formula that rewards persistence, graduation, and the percent of students eligible for Pell Grants, more than any university in the state other than the flagship University of Florida. And in June, thanks to its freshman persistence rate cracking the 90 percent threshold, USF was officially granted “preeminent” status.

Now other universities are coming to USF for advice. The question is whether it’s just an example of an unusually well-run university finding new ways to be better, or the harbinger of a revolution in the way colleges and universities help students succeed.

As a sector, higher education is relatively late to using predictive analytics to improve student performance. Since the 1990s, municipal police departments have been identifying geographic areas where crimes are likely to occur. In the private sector, car companies crunch the numbers to predict whether customers will buy a car at the end of a lease or switch to a new lease with a competitor. Elsewhere in Florida, Hillsborough County has pioneered statistical analysis to identify who in the child welfare system is most at risk.

The use of advanced statistical analysis to optimize the admissions and financial aid process dates back to the 1970s, proving that colleges are very capable of devoting their significant intellectual and financial resources to problems that affect their economic well-being. But industry surveys suggest that USF, while no longer unique, is still among the minority of colleges and universities analytically focused on student success. A 2016 KPMG report found that colleges are still most likely to use data analytics for budgeting, enrollment, fund-raising, and “supply chain optimization,” as opposed to academics. Another report found that, even at universities that are using analytics on behalf of students, less than 10 percent have created the kind of institution-wide student success dashboards used by USF. In part, that’s because boosting front-end enrollment and leaving success to the faculty remains a tempting short-term method of fattening the bottom line.

But the numbers are definitely growing. By the mid-2000s, it had become clear that universities were sitting on large databases that could be used for more than just keeping track of course credits and parking tickets. In the late 2000s, just as USF was starting on its quest to improve student success, the Bill & Melinda Gates Foundation (which, full disclosure, provides financial support to New America, where I work, and to the Washington Monthly) convened a group of researchers and practitioners who were interested in analyzing data about student success. Mark Milliron, then a Gates program officer, went on to cofound Civitas. Those meetings were a nexus of parties who have since spread out to form a burgeoning field. Georgia State University has gained national prominence by using predictive analytics to improve graduation rates for black students in Atlanta, helped by the consulting firm EAB. Other colleges have built home-grown systems for similar ends.

Cost is a factor slowing the spread of analytics. In addition to paying consultants, it’s expensive to hire a whole new staff of academic advocates and advisers. Less-pricey alternatives include automated solutions like sending out text message reminders to students whom the data flags as only mildly at risk. USF has been experimenting with these strategies, too.


But automation creates potential pitfalls. Algorithms don’t have ethics. A system that blindly works to optimize the odds of student persistence might end up advising low-income and minority students to avoid courses where low-income and minority students have historically fallen short. That course could be the gateway into careers in science and medicine. It might have been historically taught in a way that was indifferent to the particular, addressable challenges that students from different backgrounds bring to college.

This new, active approach to managing college education is arguably returning colleges to a form of the in loco parentis role that they abandoned after the sexual revolution. Instead of policing student morals and protecting undergraduate virtue, colleges have become data-fueled fairy godmothers, invisibly looking over each student’s real and virtual shoulders, ready to intervene when a crisis comes. Of course, that’s arguably another way of saying “surveillance.” By matriculating, students are opting in to the kind of constant, multi-valent algorithmic scrutiny that increasingly characterizes modern life. But the flip side of scrutiny is indifference. When colleges ignore at-risk students, those students usually fail.

There’s also the problem of changing organizations that generally maneuver like an ocean liner whose captain requires unanimous approval from three committees before turning the wheel. Unusually, USF has had the same president since 2000 and the same provost since 2008. That kind of continuity buys time to build confidence and buy-in from the faculty, who tend to be deeply conservative in their attitude toward reform.

Even then, you need the right leadership. Not someone like, say, former Mount St. Mary’s University president Simon Newman, who told his staff in 2015 to improve the freshman persistence rate by administering a survey to new undergrads and then using the results to kick out the most at-risk students before the date in late September when freshman persistence rates are calculated. “This is hard for you because you think of the students as cuddly bunnies,” he told his staff via email, “but you can’t. You just have to drown the bunnies . . . put a Glock to their heads.” (Newman resigned after the school newspaper reported the email.)

The true frontier of college data analytics is inside the classroom. As progressive and successful as USF has been, the big table where the Persistence Committee meets is still missing the faculty themselves. The Civitas process can see course grades and test scores and the intensity and frequency of how students engage with their courses. But neither the analysis nor the solutions are truly centered on core acts of learning. That remains the province of professors, students, and the shrouded relationships between them.

One industry survey asked colleges what their analytic data was used for. For enrollment and financial aid, about two-thirds use data to make projections and more than a quarter expect a proactive response from the people charged with aid and admissions decisions. For “faculty teaching performance,” less than 3 percent expect a proactive response. Some professors, particularly the younger digital natives, embrace the possibilities of teaching with digital information. But it remains, as always, up to them.

This will be the future organizational battlefield of higher learning. Established colleges and universities retain many advantages: brand names, regulatory protection, public subsidies, tax preferences, credential-to-employment path dependency, sheer cultural capital. But the same tools that make innovations like the Persistence Committee possible can also be used by new organizations unburdened by old administrative traditions. The most effective colleges will empower teachers with yet-to-be-invented methods of analyzing data about how individual students learn. Teaching that way won’t be voluntary. It will be the creed and mission of colleges that live and thrive.


Kevin Carey directs the education policy program at New America.