DALLAS — Stephanie Dupaul jokingly consults her collection of Magic 8 Balls — those novelty toys that tell your fortune through a little window at the base — when her students ask her things like, “Will I get an A in that class?”
Now she can answer that question with a great deal more accuracy.
Dupaul, the associate provost for enrollment management at Southern Methodist University, is one of a growing number of university administrators quietly consulting years of data — covering millions of grades earned by thousands of former students — to predict how current ones will do, and to catch them before they fall through the cracks.
It’s the same kind of process Amazon and Google employ to predict the buying behavior of consumers. And many of the universities and colleges that are applying it have seen impressive declines in the number of students who drop out, and increases in the proportion who graduate, at a time of intensive pressure to improve their own performance in those areas. At a summit of college and university presidents convened by the White House on Dec. 4, the Obama administration pushed the use of data to improve graduation rates nationwide.
Tracking data in this way also keeps tuition coming in from students who stay, and avoids the cost of recruiting replacements, which the enrollment consulting firm Noel-Levitz estimates at $2,433 per undergraduate at private universities and $457 at four-year public ones.
“It’s a resource issue, it’s a reputational issue, it does impact — I’ll say it — the rankings” by improving graduation rates, Dupaul said.
At SMU, for instance, the data showed that students who applied early in the admissions process were more likely to ultimately earn degrees. So were those who visited the campus before enrolling, joined a fraternity or sorority, or registered for a higher-than-average number of classes.
From this and other knowledge, the university has built a predictive algorithm that can gauge the probability that a student will finish, and prop up those who might not.
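A model of this kind can be sketched in a few lines. The weights below are invented for illustration, not SMU's actual model; a real system would learn them from years of historical records, but the shape — combine signals like early application and campus visits, then convert the score to a probability — is the same.

```python
import math

# Hypothetical weights; a production model would estimate these from
# historical student records (e.g. via logistic regression).
WEIGHTS = {
    "applied_early": 0.6,    # applied early in the admissions cycle
    "visited_campus": 0.4,   # visited campus before enrolling
    "greek_life": 0.3,       # joined a fraternity or sorority
    "extra_classes": 0.5,    # registered for more classes than average
}
BIAS = -0.2

def graduation_probability(student: dict) -> float:
    """Estimate the chance a student finishes, logistic-regression style."""
    z = BIAS + sum(WEIGHTS[k] * float(student.get(k, 0)) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

at_risk = graduation_probability({})  # no positive signals
strong = graduation_probability({"applied_early": 1, "visited_campus": 1,
                                 "greek_life": 1, "extra_classes": 1})
```

Students whose predicted probability falls below some threshold would be the ones the university "props up" with extra attention.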
Other universities use extraordinarily detailed data to make sure students stay on track once they’ve arrived. Georgia State, for instance, has analyzed 2.5 million grades of former students to learn what may trip up current ones. That early-warning system, begun in 2012 to address a graduation rate below the national average, triggered 34,000 alerts last year about students who may have been in trouble but didn’t know it yet.
For example, the data show that students’ grades in the first course in their majors can predict whether they will graduate. Eighty-five percent of political science majors who get an A or B will earn degrees, but only 25 percent of those who score a C or lower will.
“What we used to do, and what other universities do, is let the C student go along until it was too late to help them,” said Timothy Renick, Georgia State’s vice president for enrollment management and student success. “Now we have a flag that goes off as soon as we spot a C in the first course.”
That student is invited to meet with an advisor and given the option of switching majors before wasting further time and money on a losing proposition.
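The flag Renick describes amounts to a simple rule over the registrar's data. The sketch below is illustrative, not Georgia State's actual system; the grade bands and graduation rates echo the political science figures cited above, and the treatment of plus/minus grades is an assumption.

```python
# Hypothetical historical graduation rates by first-course grade band,
# mirroring the political-science numbers quoted in the article.
GRAD_RATE_BY_BAND = {"A/B": 0.85, "C or lower": 0.25}

# Grades treated as "on track" (plus/minus handling is an assumption).
ON_TRACK_GRADES = {"A", "A-", "B+", "B", "B-"}

def first_course_alert(grade: str) -> bool:
    """Flag a C or lower in the first course of a student's major."""
    return grade not in ON_TRACK_GRADES

# Students and their grades in the first course of their major.
students = {"avery": "B", "blake": "C", "casey": "D"}
flagged = [name for name, grade in students.items()
           if first_course_alert(grade)]
# Each flagged student would be invited to meet with an advisor.
```

The point Renick makes is that the rule fires on the first C, rather than waiting for a transcript full of them.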
The university also uses its predictive algorithm to channel incoming freshmen with wobbly prospects — say, those from high schools whose graduates in previous years have proven poorly prepared — into a seven-week summer session. Nine out of 10 of these students make it to the end of the first year, a higher rate than among classmates who are not judged to be at risk.
It’s not just freshmen who are being monitored in this way. Another 2,000 upperclassmen were called in for one-on-one sessions with an advisor last year after signing up for courses that didn’t satisfy requirements for their majors — choices the data showed would probably derail them — and were moved into classes that did.
“Most students, when they take classes that don’t apply to their program, it’s not because they’ve always wanted to take a course in Greek philosophy,” said Renick. “It’s because they don’t understand the maze of rules that big institutions like Georgia State have created. And when they go off course, it’s a difference between graduating and not graduating.”
The university also uses 12 years of data from former students to nudge current ones toward the majors to which the data show their academic strengths are most closely matched, and in which they’re most likely to succeed.
“It’s a really simple process,” Renick said, “but it’s the kind of thing that higher education hasn’t been doing.”
And still largely isn’t. The number of universities using data in this way is growing fast, but it remains only about 125, according to a rough estimate by higher-education consultants who follow the field — a small fraction of the nation’s colleges and universities.
The reason more are signing on has as much to do with the bottom line as with the goal of helping students. For every 1 percentage point improvement in the proportion of students it keeps from dropping out, Renick says, Georgia State holds onto $3 million in tuition and fees that would otherwise have been lost. That retention rate has increased by five percentage points since the university started tapping this data two years ago, meaning it has more than recouped the $100,000-a-year cost of running the system and the $1.7 million per year it takes to pay the 42 extra advisors hired to help the students it predicts might fall through the cracks.
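The arithmetic behind those figures checks out, assuming the five-point gain and the quoted costs both span the two years since the program began (a reading the article implies but doesn't spell out):

```python
# Back-of-envelope check of the Georgia State figures in the article.
value_per_point = 3_000_000        # tuition/fees kept per 1-point gain
points_gained = 5                  # retention improvement over two years
system_cost_per_year = 100_000     # running the early-warning system
advisor_cost_per_year = 1_700_000  # 42 extra advisors
years = 2

retained = value_per_point * points_gained                  # $15 million
spent = (system_cost_per_year + advisor_cost_per_year) * years  # $3.6 million
surplus = retained - spent
```

On those assumptions the program returns roughly four dollars for every one spent, which is the "financial imperative" described below.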
“It’s no longer just a moral imperative. It’s a financial imperative,” said Ed Venit, a senior director at the Education Advisory Board, a research and technology company that Georgia State and other universities have hired to help with this work, and one of several private firms in the field. “The students who are on their campuses now, they have to keep them around, hopefully till graduation.”
Yet graduation rates overall are down, not up, since 2008, according to the National Student Clearinghouse, which tracks this. Only 55 percent of students earn their two- or four-year degrees within even six years, the Clearinghouse reports, as they switch majors, flounder through required courses, and take classes they don’t need.
Information universities and colleges already collect, but haven’t used in this way, can help avert such stumbles, Venit said.
“The data is so accurate that we can see the problems coming a mile away,” he said. Yet, with no profit motives and no conventional shareholders, “higher education is lagging behind other industries in the use of this.”
That’s begun to change as students, parents, and policymakers press universities to provide a better return on their investments, and as universities themselves — especially public universities, whose revenues are under strain — are forced to become more efficient.
At Georgia State — 80 percent of whose students are racial minorities, low income, the first in their families to go to college, or from other groups that struggle with college costs and bureaucracies — the six-year graduation rate had fallen to a dismal 32 percent before the university began to look at data. It’s since increased to 53 percent.
Georgia State’s president, Mark Becker, a first-generation student himself with a PhD in statistics, likens students to motorists who need occasional help steering.
“Think of going through college as driving a car and the destination of the car is graduation,” he said. “If you start drifting off the road, we want to straighten you out and keep you driving forward.”
Said Becker: “Students have complicated lives. We can’t make their lives perfect, but we can use data to help them avoid obstacles.”
That’s especially important as the students arriving on all campuses start looking more like the ones at Georgia State: poor, nonwhite, and first generation.
“A lot of these are students who are just barely able to afford college,” Renick said. “Taking the wrong course, getting a couple of Fs, losing a scholarship, wasting credit hours all can stop them from getting a degree.”
Now the university is poring over its data to determine how to predict when financial problems might force students to drop out, and offering “micro grants,” with stringent conditions, to keep them enrolled. Nine out of 10 freshmen who were offered the grants last year stayed in school.
Purdue University Calumet, whose six-year graduation rate is 31 percent, has also started using data, and improved the proportion of its students who returned this fall by 5 percentage points over the proportion who returned the fall before, up to 74 percent. That saved the university nearly half a million dollars in tuition that would have otherwise been lost, plus the cost of recruiting new students to fill those empty seats — almost five times what it paid to analyze and act on the data, the university says. Among other things, it moved up the deadline for declaring a major, having found that students who waited to do this were more likely to drop out.
“A lot of these students don’t need a big change,” said Venit. “They just might need a single conversation to square them away. But they weren’t getting a lot of guidance.”
Southern Illinois University increased its return rate by an even larger 8.3 percentage points, to 68 percent, and its revenue by more than $2 million, according to John Nicklow, who was provost when the process began last year. Those gains came after the university used data to identify a much larger proportion of students who needed help than was previously thought. The cost was about $100,000, part of it paid for by a grant from the Bill & Melinda Gates Foundation.
“I can’t believe it’s taken us this long to dig into this data,” said Nicklow, an engineer by training. “More of us need to do it.”
Back at SMU, Stephanie Dupaul’s collection of Magic 8 Balls has grown to 30, including a Buddha Ball, a Yoda Ball, an Excuse Ball — even an automotive diagnostic ball.
She said more universities are sure to adopt the use of data — “It’s one of those waves that’s coming; a lot of schools just haven’t caught the wave yet” — but cautioned that data can no more guarantee a very human student’s future than her 8 Balls can.
“We still have to remember that data alone is not always a predictor of individual destiny,” she said, “even when ‘Signs Point to Yes.’”
[Cross-posted at The Hechinger Report]