“Teachers and AI need to be co-creators,” Sharad Sundararajan, co-founder and chief software officer at New York-based AI-focused tech company Merlyn Mind, said in the discussion. “They need to be co-thought leaders for continuous dialog to see what works and what doesn’t.”
The webinar, “How Can AI Improve Educational Outcomes in the United States?,” was hosted by the Center for Data Innovation and moderated by CDI Policy Analyst Gillian Diebold, who co-authored the nonprofit’s report last month. In addition to Sundararajan, panelists included Jeremy Roschelle, executive director of learning sciences research at the nonprofit Digital Promise in Washington, D.C.; Michelle Zhou, co-founder and CEO of California-based AI chatbot company Juji; and Lynda Martin, director of learning strategy for strategic solutions at the educational publishing company McGraw Hill.
While she is intrigued by AI’s potential, Martin stressed that teachers still need to be the ones making decisions, and that education shouldn’t rely entirely on the technology’s automation.
“There has to be validation to what the AI is telling you and what you can see with your own eyes,” Martin said. “Nothing should be (fully) automated. There should be involvement with a human. As they work with the system more and more, that can be more automated, but there should always be a human touch to it. If not, there will be problems.”
The CDI report said that AI can bring a more personalized learning experience into the classroom, allowing kids who are behind to catch up and students who are ahead to receive more challenging coursework. The potential of AI, the report said, lies in increased engagement, closed learning gaps and a reduced teacher workload. That said, Jeremy Roschelle of Digital Promise indicated he isn’t entirely sold and that more challenges may arise.
“I’m suspicious of individualized learning,” he said while noting that his nonprofit aims to create conversations about bringing AI into the classroom. “If you apply this on a massive scale, there would be a larger learning gap. It brings up lots of ethical problems.”
Michelle Zhou compared the risk of unequal access to AI with the digital divide, in which many underserved communities suffer from a learning gap that has been exacerbated by the COVID-19 pandemic.
“If we don’t democratize AI for everyone to use it, we create a huge AI divide, putting many at a disadvantage,” she said.
Roschelle and Sundararajan stressed the importance of getting everyone connected and of working with each district, based on its capabilities, to ensure it gets the professional development needed to use AI in the classroom. Roschelle said organizations and policymakers need to think differently about how to work with low-income schools and help them achieve their goals. Sundararajan said that his company, Merlyn Mind, is building software that can be tailored to a specific classroom’s needs and configurations. He said working with educators is what makes the most sense when incorporating AI into a classroom setting.
“You still need a teacher to direct the class with the use of the AI, and not automate the system,” Sundararajan said. “We are in no way replacing the teacher.”
Zhou concurred, saying the addition of AI to existing school processes can yield new insights about students and help teachers do their jobs, as long as the user is knowledgeable and involved.
“It should be a tool to help decision-making, but not make the decision for you,” she said.