Enrollment Algorithms Raise Equity Concerns in Higher Ed

While designed to help colleges and universities boost revenue and enrollment, algorithms that decide how to apportion financial aid could be unfairly filtering out applicants and reducing the amount of available aid.

Colleges and universities have used algorithmic programs for decades to streamline recruitment and admissions, often drawing on historical data to predict what amount of financial aid would most likely persuade a prospective student to enroll. While helpful for budgeting an upcoming school year, enrollment management algorithms are drawing criticism that their unregulated use may be filtering out low-income, female and non-white applicants, who are more likely to need larger aid packages.
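
To make the mechanism concrete, here is a minimal sketch, in Python, of the kind of aid-leveraging model the article describes: fit a model on historical admits, then search for the cheapest offer predicted to clear an enrollment-probability target. Everything here is assumed for illustration, including the feature names (gpa, family_income, aid_offer), the synthetic data and the coefficients; it is not any vendor's actual method.

```python
# Hypothetical sketch of "financial aid leveraging": predict enrollment
# probability from an applicant's profile plus a candidate aid offer,
# then pick the cheapest offer that clears a target yield.
# Synthetic data and made-up coefficients; not any vendor's real model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

# Invented historical records: GPA, family income, the aid offered,
# and whether the admitted student ultimately enrolled.
gpa = rng.normal(3.3, 0.4, n).clip(0.0, 4.0)
family_income = rng.lognormal(11.0, 0.6, n)          # dollars per year
aid_offer = rng.uniform(0, 30_000, n)                # dollars
# Toy ground truth: aid and family income both raise enrollment odds.
logit = -3.0 + 0.00015 * aid_offer + 0.00002 * family_income + 0.5 * (gpa - 3.3)
enrolled = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([gpa, family_income, aid_offer])
model = LogisticRegression(max_iter=5_000).fit(X, enrolled)

def cheapest_offer(gpa_i, income_i, target=0.5, step=500):
    """Smallest aid offer predicted to reach the target enrollment
    probability, or None if no offer in range gets there."""
    for aid in range(0, 30_001, step):
        p = model.predict_proba([[gpa_i, income_i, aid]])[0, 1]
        if p >= target:
            return aid, round(p, 2)
    return None

print(cheapest_offer(3.8, income_i=40_000))    # lower-income applicant
print(cheapest_offer(3.8, income_i=150_000))   # higher-income applicant
```

A revenue-focused variant would weight each admit by expected net tuition rather than a flat probability target; that objective is what critics say tilts offers toward applicants who can pay most of the sticker price.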

There is already precedent for such concerns, dating back to some of the first uses of enrollment algorithms in higher education, according to a recent report from the Brookings Institution, a policy think tank. The report cited an example from as far back as the 1970s, when an algorithm at St. George’s Hospital Medical School in London, intended to make admissions fairer, instead weeded out women and students of color before they were ever interviewed.

Still, the use of predictive data programs for higher ed enrollment management has risen steadily in recent years, with the aim of helping admissions offices recruit students and meet enrollment goals. According to the report, schools run the risk of replicating similar scenarios and access hurdles, despite their general success in using these programs to meet overall enrollment goals and determine course placement for new students.

According to a study from the tech-focused nonprofit Educause, cited in the Brookings report, about 75 percent of higher ed institutions were already using predictive data analytics to choose among student applicants by 2015, a 15 percent jump over the previous decade and enough to make it the most common form of data analytics used by universities. The growing popularity of enrollment algorithms has built a market for outside tech vendors that create and implement them, such as EAB and Ruffalo Noel Levitz, which have served more than 100 and 300 institutions, respectively.

The Brookings report noted that implementing such systems to boost overall enrollment and improve fiscal planning could prove counterproductive to diversity goals without proper oversight in place. Companies that make these systems often advertise them as a way to increase tuition revenue, the report said, and it works: a 2019 study from the University of Washington found that one unnamed university’s use of algorithms to disburse financial aid increased enrollment yields for out-of-state students by more than 23 percent.

“Higher education is already suffering from low graduation rates, high student debt, and stagnant inequality for racial minorities—crises that enrollment algorithms may be making worse,” wrote Alex Engler, policy analyst and author of the Brookings Institution report. “Unfortunately, the widespread use of enrollment management algorithms may also be hurting students, especially due to their narrow focus on enrollment.”

While predictive data analytics are not considered inherently harmful in and of themselves, using them to identify a student’s ability to pay fees could open doors to “subtle channels for algorithmic discrimination” against low-income students and students of color, who already have less access to universities, Engler wrote. The problem lies mainly in how the data could be used to weed them out. Given many universities’ reliance on tuition fees and growing enrollment, Engler saw little incentive for schools to rein in the data analytics that determine tuition yields and that feed the “enormous pressure on financial aid offices” to recruit students without the need to award a high number of scholarships. What’s more, Engler found that algorithms and their applications have had little to no human oversight at either the institutional or governmental level.
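
One way to see how such a subtle channel can operate: even a model that never sees income directly can sort applicants by it through a correlated proxy. The sketch below is a hypothetical demonstration with synthetic data; the proxy (a ZIP-code wealth index), the labels and all numbers are assumptions for illustration, not material from the report.

```python
# Hypothetical demo of proxy discrimination: the sensitive attribute
# (low_income) is withheld from training, but a correlated proxy
# (a ZIP-code wealth index) lets the model sort applicants by it anyway.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 10_000

low_income = rng.random(n) < 0.4                 # hidden sensitive attribute
zip_wealth = rng.normal(0, 1, n) + np.where(low_income, -1.2, 1.2)
# Toy historical label shaped by income: "enrolled as a full payer."
full_pay = rng.random(n) < np.where(low_income, 0.15, 0.60)

# Train on the proxy alone; the income flag never enters the model.
model = LogisticRegression().fit(zip_wealth.reshape(-1, 1), full_pay)
scores = model.predict_proba(zip_wealth.reshape(-1, 1))[:, 1]

print(f"mean 'full-pay' score, low-income applicants:    {scores[low_income].mean():.2f}")
print(f"mean 'full-pay' score, higher-income applicants: {scores[~low_income].mean():.2f}")
```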

Engler recommended that schools hire personnel to evaluate the quality of algorithms, as well as internal data scientists who can check algorithmic specifications that could favor white and high-income students. He said institutions should look closely at any historical enrollment data sets used to build algorithms, as well as at the context of a student’s ability to pay fees.
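
As one concrete form such an internal check could take, the sketch below compares selection rates across applicant groups and flags large gaps using the “four-fifths” rule of thumb borrowed from employment law. The 0.8 threshold, the group labels and the toy decisions are illustrative assumptions, not a standard the Brookings report prescribes.

```python
# Illustrative audit helper: compare selection rates across applicant
# groups and flag a gap using the EEOC-style "four-fifths" heuristic.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, selected: bool) pairs."""
    totals, picked = defaultdict(int), defaultdict(int)
    for group, selected in decisions:
        totals[group] += 1
        picked[group] += selected
    return {g: picked[g] / totals[g] for g in totals}

def disparate_impact_ratio(rates):
    """Min group rate over max group rate; below 0.8 warrants review."""
    return min(rates.values()) / max(rates.values())

# Hypothetical records of (group, offered_large_award) decisions.
rates = selection_rates([
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
])
ratio = disparate_impact_ratio(rates)
print(rates, f"ratio={ratio:.2f}", "flag for review" if ratio < 0.8 else "ok")
```

A real audit would run the same comparison on a live model’s actual aid and admit decisions, broken out by income, race and gender.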

“State policymakers should consider the expanding role of these algorithms too, and should try to create more transparency about their use in public institutions,” he wrote. “More broadly, policymakers should consider enrollment management algorithms as a concerning symptom of pre-existing trends towards higher tuition, more debt and reduced accessibility.”

A report on the pros and cons of algorithms by the Pew Research Center describes no shortage of examples of how biases and functional errors can undermine predictive data tools in general. It mentions, for example, a chatbot on Twitter that repeated racist and misogynistic epithets it “learned” from users, and the inability of Facebook algorithms to tell real news from fake news.

Neil Heffernan, a computer science professor at Worcester Polytechnic Institute and lead developer of the AI-based ed-tech program ASSISTments, said the problem with such analytics lies in the intent behind them, and in the fact that many students are unaware of how their data is used to determine aid offers. Heffernan believes machine learning is still unreliable for complex decisions that can have far-reaching effects, and he said the use of such tools should be more closely regulated to ensure they increase access to education rather than impede it.

“There are ways to use [algorithms] to do something good … to learn what we can do to get people to take advantage of scholarships,” he said, pointing to programs that streamline the Free Application for Federal Student Aid (FAFSA) process.

Matt Artz, a design anthropologist and instructor at Drew University in New Jersey who studies how algorithms like Spotify’s prioritize content, said higher-ed algorithms can work similarly by favoring people with more social and financial capital. He said algorithms often mirror human prejudices, and designing helpful algorithms must involve substantial input from stakeholders — students, universities, ed-tech developers and others — to mitigate biases before they produce real-world consequences, rather than after.

“No technology is neutral. Algorithms can be helpful, or they can be harmful,” he said. “If we move the process to the start and involve stakeholders in crafting the research agenda and involve them in a participatory model, we will produce outcomes that are naturally more equitable.”
Brandon Paykamian is a former staff writer for the Center for Digital Education.