For years, Stephanie Dupaul would jokingly consult her collection of Magic 8 Balls when students asked her questions such as, “Will I get an A in that class?” Now, she can give them an answer far more accurate than anything predicted by a toy fortune-teller.
Dupaul, the associate provost for enrollment management at Southern Methodist University, is one of a growing number of university administrators consulting the performance data of former students to predict the outcomes of current ones. The little-known effort is being quietly employed by about 125 schools around the U.S., and often includes combing years of data covering millions of grades earned by thousands of former students.
It’s the same kind of process tech behemoths like Amazon and Google employ to predict the buying behavior of consumers. And many of the universities and colleges that are applying it have seen impressive declines in the number of students who drop out, and increases in the proportion who graduate. The early returns are promising enough that it has caught the attention of the Obama Administration, which pushed for schools to make heavier use of data to improve graduation rates at a White House higher education summit last week.
The payoff for schools goes beyond graduation rates: tracking data in this way keeps tuition coming in from students who stay, and avoids the cost of recruiting new ones, which the enrollment consulting firm Noel-Levitz estimates at $2,433 per undergraduate at private universities and $457 at four-year public ones.
“It’s a resource issue, it’s a reputational issue, it does impact — I’ll say it — the rankings” by improving graduation rates, Dupaul says.
At SMU, for instance, data analysis showed that students who applied early in the admissions process were more likely to ultimately earn degrees. So were those who visited the campus before enrolling, joined a fraternity or sorority, or registered for a higher-than-average number of classes.
From this and other findings, the university has built a predictive algorithm that gauges the probability that a student will finish school, and flags those who might not so that academic advisors or deans can intervene.
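The article doesn't describe SMU's actual model, but the kind of probability estimate it attributes to the algorithm can be sketched with a simple logistic model over the signals mentioned above. Everything here is illustrative: the feature names and weights are hypothetical stand-ins for coefficients a school would fit to its own historical records.

```python
import math

# Illustrative only: hand-set weights standing in for coefficients a
# school would actually fit to years of historical student data.
WEIGHTS = {
    "applied_early": 0.8,     # applied early in the admissions process
    "visited_campus": 0.5,    # visited campus before enrolling
    "joined_greek_life": 0.4, # joined a fraternity or sorority
    "high_course_load": 0.6,  # registered for more classes than average
}
BIAS = -0.7

def graduation_probability(student: dict) -> float:
    """Map a student's signals (0 or 1 each) to a probability of graduating."""
    score = BIAS + sum(WEIGHTS[k] * student.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-score))  # logistic link squashes score into (0, 1)

# A student with several positive signals scores higher than one with none.
print(graduation_probability({"applied_early": 1, "visited_campus": 1}))
print(graduation_probability({}))
```

In a real deployment the weights would come from fitting against outcomes of former students, and a low probability would trigger the advisor outreach the article describes.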
Other universities also use detailed data to make sure students stay on track once they’ve arrived. Georgia State, for instance, has analyzed 2.5 million grades of former students to learn what may trip up current ones. That early-warning system, begun in 2012 to address a lower-than-the-national-average graduation rate, triggered 34,000 alerts last year about students who may have been in trouble, but didn’t know it yet.
It works by identifying risk patterns that can help catch students before they fall. For example, Georgia State’s data shows that students’ grades in the first course in their majors can predict whether or not they will graduate. Eighty-five percent of political science majors who get an A or B will earn degrees, but only 25% of those who score a C or lower will.
“What we used to do, and what other universities do, is let the C student go along until it was too late to help them,” says Timothy Renick, Georgia State’s vice president for enrollment management and student success. “Now we have a flag that goes off as soon as we spot a C in the first course.”
That student is invited to meet with an advisor and given the option of switching majors before spending more time and money on a losing proposition.
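The early-warning rule Renick describes is, at its core, a simple threshold check. A minimal sketch, using the grade cutoffs from the article (the function name and interface are hypothetical):

```python
# Per the article: 85% of political science majors earning an A or B in
# the first course in their major go on to graduate, versus 25% of those
# scoring a C or lower, so a C triggers an advisor alert.
PASSING_FIRST_COURSE_GRADES = {"A", "B"}

def flag_first_major_course(grade: str) -> bool:
    """Return True if an advisor alert should fire (grade of C or below)."""
    return grade.upper() not in PASSING_FIRST_COURSE_GRADES

print(flag_first_major_course("C"))  # True: flag for advisor outreach
print(flag_first_major_course("A"))  # False: on track
```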
The university also uses its predictive algorithm to channel incoming freshmen with higher risk factors — like those who come from high schools where earlier graduates have been poorly prepared — into a seven-week summer session. Nine out of 10 of these students make it to the end of the first year, a higher rate than their classmates who entered without red flags.
And the analysis isn’t limited to first-year students. Last year, some 2,000 Georgia State upperclassmen who signed up for courses that didn’t satisfy requirements for their majors — which the data showed would probably derail them — were called in for one-on-one sessions with an advisor and moved to classes that did.
“Most students, when they take classes that don’t apply to their program, it’s not because they’ve always wanted to take a course in Greek philosophy,” says Renick. “It’s because they don’t understand the maze of rules that big institutions like Georgia State have created. And when they go off course, it’s a difference between graduating and not graduating.”
The university also uses 12 years of data from former students to nudge current ones toward majors that track more closely with their academic strengths, thereby increasing their chances of graduating.
“It’s a really simple process,” Renick says, “but it’s the kind of thing that higher education hasn’t been doing.”
Despite the promising early returns, most institutions have not embraced predictive data. Only about 125 of the more than 4,000 degree-granting postsecondary institutions are using data in this way, according to the Education Advisory Board, a firm that helps Georgia State and other schools run such programs.
More will sign on, experts say, because it can do as much for the bottom line as it does for students. For every 1 percentage point improvement in the proportion of students data tracking keeps from dropping out, Renick says, Georgia State keeps $3 million in tuition and fees that would have otherwise been lost. So far, that rate has increased by five percentage points since the university started tapping this data two years ago, meaning it has more than recouped the $100,000-a-year cost of running the system and the $1.7 million per year it takes to pay an extra 42 advisors hired to help the students it predicts might fall between the cracks.
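Renick's figures can be sanity-checked with simple arithmetic. A back-of-the-envelope sketch in Python, treating the gain and costs as a single-year comparison (the five-point gain actually accrued over two years, so this is illustrative only):

```python
# Back-of-the-envelope check of Georgia State's retention economics,
# using the figures quoted in the article (all amounts in dollars).
TUITION_KEPT_PER_POINT = 3_000_000  # tuition and fees kept per 1-point retention gain
RETENTION_GAIN_POINTS = 5           # percentage-point improvement since the program began
SYSTEM_COST_PER_YEAR = 100_000      # annual cost of running the early-warning system
ADVISOR_COST_PER_YEAR = 1_700_000   # annual cost of the 42 additional advisors

revenue_kept = RETENTION_GAIN_POINTS * TUITION_KEPT_PER_POINT
annual_cost = SYSTEM_COST_PER_YEAR + ADVISOR_COST_PER_YEAR
net = revenue_kept - annual_cost

print(f"Revenue kept: ${revenue_kept:,}")  # $15,000,000
print(f"Annual cost:  ${annual_cost:,}")   # $1,800,000
print(f"Net:          ${net:,}")           # $13,200,000
```

Even charging two full years of costs against the gain, the program comes out well ahead, which is the point Renick is making.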
“It’s no longer just a moral imperative. It’s a financial imperative,” says Ed Venit, a senior director at the Education Advisory Board. “The students who are on their campuses now, they have to keep them around, hopefully ’til graduation.”
Yet graduation rates overall are down, not up, since 2008, according to the National Student Clearinghouse. Only 55% of students earn their two- or four-year degrees even within six years, as they switch majors, flounder through required courses, and take classes they don’t need.
To Venit, analyzing that information — which schools already collect — can help avert such stumbles. “The data is so accurate that we can see the problems coming a mile away,” he says. “Higher education is lagging behind other industries in the use of this.”
That’s begun to change as students, parents, and policymakers press universities to provide a better return on their investments, and as universities themselves — especially public schools, whose revenues are under strain — are forced to become more efficient.
At Georgia State — where 80% of students are racial minorities, low-income, the first in their families to go to college, or from other groups that often struggle to graduate — the six-year graduation rate had fallen to a dismal 32% before the university began to look at data. It’s since increased to 53%.
“Think of going through college as driving a car and the destination of the car is graduation,” says Mark Becker, Georgia State’s president, a first-generation college student who went on to earn a PhD in statistics. “If you start drifting off the road, we want to straighten you out and keep you driving forward.”
Such aid is becoming increasingly important as the students arriving on campuses look more like the ones at Georgia State: less affluent, nonwhite, and often the first in their families to attend college.
“A lot of these are students who are just barely able to afford college,” Renick says. “Taking the wrong course, getting a couple of Fs, losing a scholarship, wasting credit hours all can stop them from getting a degree.”
Now the university is poring over its data to determine how to predict when financial problems might force students to drop out, and offering “micro grants,” with stringent conditions, to keep them enrolled. Nine out of 10 freshmen who were offered the grants last year stayed in school.
At Purdue University Calumet, where only 31% of students graduate in six years, 74% of students returned this fall — an improvement of five percentage points over the year before. The gain preserved nearly $500,000 in tuition, and saved the school the expense of recruiting new students to fill those empty seats — an amount worth almost five times what the university says it paid to analyze and act on the data.
Southern Illinois University increased its return rate by an even larger 8.3 percentage points, to 68%, and its revenue by more than $2 million, according to John Nicklow, who was provost when the process was begun last year. Those gains came after the university used data to identify a much larger proportion of students who needed help than was previously thought. The cost was about $100,000, part of it paid for by a grant from the Bill & Melinda Gates Foundation.
“I can’t believe it’s taken us this long to dig into this data,” says Nicklow, an engineer by training. “More of us need to do it.”
Sitting amid her collection of 30 Magic 8 Balls at SMU, Stephanie Dupaul calls predictive data “one of those waves that’s coming. A lot of schools just haven’t caught the wave yet.” But she cautions that even the best algorithms can sometimes be about as precise as the toys that line her desk.
“We still have to remember that data alone is not always a predictor of individual destiny,” she says, “even when ‘Signs Point to Yes.’”
This story was produced by The Hechinger Report, a nonprofit, independent news website focused on inequality and innovation in education.