People v. Predictive Analytics

In the coming months, I’ll be inviting higher education professionals to participate in an online assessment designed to answer one question: Can people predict student withdrawal behavior more accurately than predictive analytics algorithms?

For almost two years now, I have been working with small groups of people in my research into the efficacy of higher education analytics. Specifically, I have been trying to determine whether the return on investment is worth the time, energy, and tax dollars required to properly implement a predictive analytics system for college students. My research has yielded some pretty striking results: in 13 of the past 15 informal tests I have conducted, people outperformed predictive analytics algorithms in accuracy by a median margin of 10-15%.

The algorithms ranged from least-squares linear regression to decision trees; I used R to develop the models whose predictions were then compared against those of the human participants. It was, for all intents and purposes, a human vs. AI scenario, and so far my tentative results have favored the humans.
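For the curious, here is a minimal sketch of the kind of model-building and comparison I'm describing. It is not my actual study code: the data frame `students`, its columns (`withdrew`, `gpa`, `credits_attempted`, `age`), and the vector of participant guesses `human_pred` are all hypothetical placeholders standing in for the real data.

```r
# Minimal illustrative sketch, not the actual study code. Assumes a
# hypothetical data frame `students` with a 0/1 `withdrew` outcome and a
# few made-up predictors, plus `human_pred`, a vector of 0/1 human guesses.
library(rpart)

# Least-squares linear regression on the 0/1 withdrawal indicator
lm_fit  <- lm(withdrew ~ gpa + credits_attempted + age, data = students)
lm_pred <- as.integer(predict(lm_fit, newdata = students) > 0.5)

# Decision tree treating withdrawal as a two-class outcome
tree_fit  <- rpart(factor(withdrew) ~ gpa + credits_attempted + age,
                   data = students, method = "class")
tree_pred <- as.integer(as.character(predict(tree_fit, newdata = students,
                                             type = "class")))

# Share of correct predictions, computed the same way for models and people
accuracy <- function(pred, actual) mean(pred == actual)
accuracy(lm_pred,    students$withdrew)
accuracy(tree_pred,  students$withdrew)
accuracy(human_pred, students$withdrew)
```

In a real test the models would of course be scored on students they had not seen during fitting; the sketch predicts on the same data only for brevity.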

I don’t have to tell you about the implications. If it turns out that people are generally as accurate as, or more accurate than, the expensive software that institutions so often cite in their accreditation reports as the solution to enrollment and management crises, how should we frame future college planning and analytics expenditures?

In the coming months, please be on the lookout for an email from me with an invitation to participate. I’ll also be posting on Twitter and Facebook (and of course on my website) inviting any and all to take part.