On March 18, the No Tech Criminalization in Education (NOTICE) Coalition, on behalf of 42 organizations, emailed a letter to the U.S. Department of Education and its Office for Civil Rights calling for a ban on “federal grantmaking activities that allow schools to purchase or use artificial intelligence and big data technologies to violate students’ fundamental rights, including their civil and human rights, data privacy rights, and other relevant federal legal protections.” Signatories include advocacy organizations from across the U.S., such as the American Civil Liberties Union of Minnesota, Autistic Self Advocacy Network, Dignity in Schools Campaign and NAACP Florida State Conference.
NOTICE also asked the U.S. Department of Education to provide guidance to local K-12 districts on scientific validation methods and the legal implications of AI and big data technologies.
“We are alarmed by the growing use of surveillance technologies to expand police presence in schools and expose students to greater police contact, exclusionary discipline, and school pushout,” the letter said. “We view these developments as a dangerous new chapter in the school-to-prison pipeline and mass criminalization of Black, brown, and Indigenous youth and other marginalized young people.”
Marika Pfefferkorn, who co-founded NOTICE with Clarence Okoh, said she anticipates a response from the Department of Education within a few weeks.
She said the fundamental problem with these AI-powered technologies is that school leaders have a limited understanding of how the algorithms work, and sometimes they can’t grasp how mistakes are made or the impact those mistakes could have on young people who are wrongly accused of breaking school rules.
“People get really excited about silver-bullet solutions,” Pfefferkorn said. “School districts themselves do not have an expertise, they [technologies] are not well vetted, and they don’t understand the range of harms. They are moving ahead of the uncertainties of technology.”
According to the NOTICE page on the website of its parent organization, the regional entrepreneur group Twin Cities Innovation Alliance, NOTICE’s larger goal is to “build a national movement to end the use of data and technology that surveils, polices, and/or criminalizes young people and their communities.”
“These technologies include predictive policing, data-sharing agreements, facial recognition, social media monitoring, automated license plate readers, gang databases, school hardening, biometric surveillance, and student device monitoring, and so-called ‘early warning systems.’ We recognize that these technologies intensify harm in Black, Brown, Indigenous, disabled, LGBTQIA+, and otherwise marginalized communities,” the webpage says.
The latest AI-powered school security systems vary by function and can be tailored to a district’s needs. One made by Iveda, set to be installed on three Navajo tribal school campuses in Arizona, monitors for unauthorized intruders as well as smoke and fire, and can also recognize faces and vehicle license plates. The AI platform Iterate.ai recently started offering schools a free version of its threat awareness software, which can identify weapons but does not use facial recognition and doesn’t collect personally identifiable information, according to company officials.
Early warning systems, which NOTICE opposes, are used by schools to track students who are at risk of not graduating from high school on time due to chronic absenteeism and other issues. The Indiana Department of Education is planning to make its early warning dashboard available to all districts by the start of the 2024-2025 academic year. That initiative was launched after 84 districts in the state last year reported chronic absenteeism rates of 50 percent or higher.
Pfefferkorn, who is also a solutions and sustainability officer for the Twin Cities Innovation Alliance, said NOTICE members pushed back on police and government agencies in two notable cases involving school surveillance and student data. The first was in Pasco County, Fla., and the second was in the St. Paul area of Minnesota, where Pfefferkorn said a coalition of parents and community members stopped local police, governments and schools from establishing an information-sharing network that would have harmed students.
“It’s better to have something in place based on trust of adults and students, instead of dirty data and misinformation that fuels the tool,” she said.