College advisors want to help students stay on track to finish their programs. But on many campuses, the high student-to-advisor ratios make it difficult, if not impossible, to identify and support every student who needs extra guidance.
Over the years, higher ed institutions have deployed data analytics and technology tools to help advisors in their work. One of the more high-profile efforts has been supported by the Bill & Melinda Gates Foundation, which in 2013 began providing grant funding to colleges and companies for its Integrated Planning and Advising for Student Success initiative, or iPASS for short. The ultimate goal is to help more students persist and finish college.
But when it comes to achieving that goal, iPASS efforts have yet to pass muster, according to a new report examining these efforts at three institutions. As the authors wrote: iPASS has “not yet produced discernible positive effects on students’ academic performance.” At the root of the problem, the report found, flawed data, skeptical advisors and other implementation hiccups kept iPASS efforts from moving the needle for students.
Conducted by the Community College Research Center at Teachers College, Columbia University and published by MDRC, an education research nonprofit, the study explored the effects of iPASS interventions at Fresno State in California, the University of North Carolina at Charlotte, and Montgomery County Community College in Pennsylvania.
Across these three schools, the study looked at how 8,011 students, split roughly evenly between an iPASS group and a control group, fared across two semesters.
At Fresno State and UNCC, “the iPASS enhancements produced no statistically significant effects on students’ short-term educational outcomes,” wrote the authors. Perhaps more deflating: “Across the three institutions, large proportions of students who were identified as being at high risk still earn Ds or Fs, or do not persist into subsequent semesters of college.”
The study, also funded by the Gates Foundation, details how the three institutions used iPASS grant money to implement new tools and strategies to increase the frequency of communication between students and advisors, and leverage data to make those meetings more effective. Often, this meant deploying tools used by school officials to help students plan their course load and schedule regular check-ins with tutoring, mental health and other support services.
All three institutions also placed a hold on students’ ability to register for classes until they met with an advisor. With such measures in place, more students did get in touch with their advisors.
But some of these tactics could also backfire.
At Montgomery County Community College, the report suggested that the course registration hold may have deterred students who would normally have signed up. In other words, instead of meeting with an advisor, some students simply didn’t register for classes at all. Montgomery students in the iPASS group earned fewer credits than their peers in the control group, and the authors wrote that “the mechanics of the registration hold may have negatively affected enrollment in seven-week courses that began midsemester.”
Another set of tools involved the use of predictive analytics, in the hopes of creating early-warning systems that could alert advisors if a student showed signs of struggling based on academic or attendance data.
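(To illustrate the general approach rather than any vendor’s actual product: a minimal sketch of such an early-warning score might train a simple model on historical GPA and attendance records, then flag current students whose predicted probability of earning a D or F crosses a cutoff. The features, data and threshold below are hypothetical.)

```python
# Illustrative sketch only: a simple early-warning risk score trained on
# historical student records. Features, data and threshold are hypothetical
# and do not reflect Civitas Learning's, Hobsons' or any other vendor's model.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Historical records: [term GPA, share of class sessions attended]
past_features = np.array([
    [3.6, 0.95],
    [2.1, 0.60],
    [3.2, 0.85],
    [1.8, 0.50],
    [2.9, 0.75],
    [3.8, 0.98],
])
# 1 = student later earned a D/F or did not persist; 0 = stayed on track
past_outcomes = np.array([0, 1, 0, 1, 1, 0])

model = LogisticRegression().fit(past_features, past_outcomes)

# Score current students the same way and flag anyone above the cutoff
current_features = np.array([
    [2.0, 0.55],
    [3.5, 0.90],
])
risk_scores = model.predict_proba(current_features)[:, 1]
flagged = risk_scores >= 0.5  # hypothetical alert threshold

for score, flag in zip(risk_scores, flagged):
    print(f"risk={score:.2f}  alert={'yes' if flag else 'no'}")
```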
But this technology, which often uses historical student data to predict how current students will fare, presented challenges as well. At UNCC, risk scores assigned to students were inaccurate due to errors introduced by the software. As a result, advisors were reluctant to rely on these predictions to inform their work.
At Montgomery County Community College, which used tools built by Civitas Learning, Hobsons and other companies, advisors “disagreed with the risk assessment of the predictive analytics tool and reported concerns that some students who seemed to be performing well had been determined to be at risk,” the authors wrote. Specifically, students who had high GPAs and were close to graduating were flagged as being at risk of dropping out.
The findings from this report are unfortunately not uncommon as others have also tried—and struggled—to effectively implement early-alert systems. Tallahassee Community College in Florida is now trying its third system after faculty said they “hated” previous efforts. Elsewhere, programs based around “nudging,” or touching base with students more frequently to keep them engaged, can have the opposite effect and push students to leave school. These challenges have raised questions about the feasibility of using purely data-driven approaches to quantify whether a student is likely to succeed or fail.
A broader challenge surfaced in the report is the disconnect between introducing and implementing new tools and getting students and faculty to adjust their habits to make the most of them. It’s one thing for students to receive more notifications about how they’re doing. It’s another for them to adjust their schedules to make time for additional meetings. Advisors also said they needed advising sessions to be longer so that they could fully make use of the tools and data at their disposal.
Despite these challenges, “each of the institutions made progress integrating technology and data with advising, getting more students in to see advisers, and expanding the content of advising sessions,” the authors wrote.
A follow-up report is expected next year, which will go into greater detail about longer-term student outcomes. In all, 26 higher ed institutions received grants, each worth up to $225,000, from the Gates Foundation and The Leona M. and Harry B. Helmsley Charitable Trust to implement iPASS programs. Other education nonprofits, including Achieving the Dream and EDUCAUSE, provided technical assistance and support.