
CITE23: AI Tools Raise New Legal Questions for K-12

Some legal questions around generative AI in schools have yet to be resolved, but in general, schools must vet their vendor contracts carefully and get parental permission for students to use the technology.

SACRAMENTO — Get consent. Never enter personally identifiable information. Know the latest terms and conditions of software tools. Get consent. Train staff and students. Get consent. And just in case it wasn't clear: Get consent.

Education lawyer Gretchen Shipley had more ground to cover and questions to answer than she had time for on Tuesday, giving a 50-minute talk on the legal implications of generative AI (GenAI) for education to a room full of school technologists at the annual California IT in Education (CITE) Conference in Sacramento. But for all the looming questions in these unregulated Wild West days of GenAI, she did not think it was too soon to say that safety reviews of school technology tools, regular vetting of terms and conditions and, above all, student and parent consent agreements are essential places to start.

To illustrate legal problems posed by emerging AI tools, Shipley took an example from her own professional experience: note-taking apps.

“In two or three weeks’ time, it has exploded. I can’t believe how cool and efficient it is. However, my conversations with you all are protected by attorney-client privilege,” she said of a hypothetical scenario in which her education law firm, Fagen Friedman & Fulfrost, might represent attendees. “And when you turn on [an audio recording], there are pop-ups, so I notice and I say, ‘Hey, this is actually a confidential conversation, I don’t think we can have Zoom recording it without [permission].’ The person who holds that privilege is the school district, but then, are you the person to waive that privilege? I don’t know. It’s a little dicey … Yes, the note-taking apps are cool, but notify the person. Get their consent.”

Shipley said a wide range of technology tools now being adopted by schools have privacy implications, from license plate readers to security systems with facial recognition, but she believes the biggest emerging shift involves parental consent for open AI tools. She said legal exceptions in the Family Educational Rights and Privacy Act (FERPA) allow teachers to use educational software without parental permission as long as the use serves a legitimate educational interest and the software limits the resharing of information. However, GenAI tools may exceed these exceptions, particularly for students under 13, depending on whether the tools operate in an open or closed environment and where their data goes.

Shipley used the example of the state of California piloting, through a handful of districts, an emotionally intelligent chatbot that high schoolers could access on their phones. Who is on the receiving end of personal messages sent by students? Who is responsible if a student sends messages about self-harm? Who is liable for how the chatbot responds? It is a district’s responsibility to sort out these questions before exposing students and families to such tools.

“The district’s responsibility will come into play if the district made it available to them or made it a requirement … If the school assignment was ‘Use ChatGPT’ and it’s a room of fifth graders, that’s a problem. And even if they’re over 13, you still can’t require it, but you can say, ‘Well, it has a pop-up where it makes you consent … So do that, students,’” Shipley said. “I do think that if they’re using it on their own for their homework and the district is not requiring it, that is [legally OK] for the district, but I think public perception is, ‘You made this tool available to our students, but it turns out it completely violates their privacy.’ You’re going to get not just one parent coming after you, but more like a class action, which will be costly.”

Shipley said the same applies to safety and security systems that use license plate readers or facial recognition, which are legal in California but may provoke public blowback, forcing a district to back out of a contract early at considerable cost.

She added that the particular need for parental consent with GenAI means districts can't allow vendors to slip GenAI into existing tools through an unannounced software update, or without revisiting and possibly renegotiating terms and conditions with the district.

She summarized best practices for districts using AI tools in four points:

  • Have a system in place to vet terms and conditions of technology tools.
  • Ensure the district will be notified if a vendor updates its terms and conditions, and know whether new conditions will trigger parental consent requirements.
  • Train staff and students not to enter any personally identifiable information into a GenAI tool.
  • Determine whether the software monitors or responds to threats of harm to self or others, and consider adding a provision to the contract limiting district liability and specifying procedures for response.

Shipley added that districts should also have clear expectations for staff members who want to introduce AI tools to their class or school.

“[P]arental consent will never have 100 percent participation, and it will be administratively challenging,” she said. “Hopefully there will be legislation that will clean this up and put more parameters on our software companies to help us navigate this, but I think right now, that’s the most protective step for your school districts.”
Andrew Westrope is managing editor of the Center for Digital Education. Before that, he was a staff writer for Government Technology, and previously was a reporter and editor at community newspapers. He has a bachelor’s degree in physiology from Michigan State University and lives in Northern California.