Despite a growing need for K-12 mental health resources, a report from Student Privacy Compass, a project of the nonprofit think tank Future of Privacy Forum, suggests that the increased use of monitoring programs that scan students' browsing activity for keywords related to self-harm could do more harm than good.
According to the report, more than 15,000 schools use the monitoring program Securly and 10,000 schools use GoGuardian, the latter of which saw a 60 percent increase in users during the pandemic. In a separate June 2021 survey from the Center for Democracy and Technology, 81 percent of K-12 teachers reported using monitoring tools in their schools.
Anisha Reddy, policy counsel with the Future of Privacy Forum and one of the report’s authors, said she was unable to find “any independent research” into the efficacy of K-12 monitoring programs, which raises the question of whether schools themselves are the testing grounds for these programs.
The report noted these programs sometimes flag keywords related to sexual orientation or gender identity, such as "gay" and "lesbian," as possible signs of bullying, but those flags have the potential to expose LGBT students in ways that could be "harmful, discriminatory or disparate," or otherwise invasive. Programs have also flagged students' discussions of books and music deemed suspicious, including the novel To Kill A Mockingbird, as well as keywords used in the context of research assignments.
Reddy said investments in these programs may be funneling money away from resources proven to help students, such as additional counselors trained in children’s mental health, while cultivating an atmosphere of distrust that could prove counterproductive.
“Introducing this kind of technology without that kind of support can only serve to potentially harm students,” she said. “Technology might not be the answer if you don’t have support for the students you identify.”
According to the report, most monitoring services employ algorithms that detect keywords based on simple natural language processing, as well as other modes of artificial intelligence that examine context.
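Vendors do not publish the details of their detection systems, but a rough sense of why simple keyword matching produces both useful flags and false positives can be conveyed with a minimal sketch. Everything below is invented for illustration: the term lists, the function name and the "academic context" heuristic are assumptions, not drawn from any vendor's product.

```python
# Minimal, hypothetical sketch of keyword-based flagging.
# Term lists, thresholds and logic are illustrative only, not any vendor's method.

FLAG_TERMS = {
    "self_harm": ["hurt myself", "end my life", "self-harm"],
    "violence": ["bring a gun", "hurt someone"],
}

# Terms that often signal benign, academic context (e.g., a research assignment).
CONTEXT_TERMS = ["essay", "assignment", "research", "book report", "novel"]


def flag_text(text: str) -> dict:
    """Return matched categories and a crude 'needs human review' signal."""
    lowered = text.lower()
    matches = {
        category: [t for t in terms if t in lowered]
        for category, terms in FLAG_TERMS.items()
    }
    matches = {c: ts for c, ts in matches.items() if ts}
    return {
        "matches": matches,
        "needs_human_review": bool(matches),
        # Academic context might lower urgency, but this is exactly where
        # false positives (flagged schoolwork) and missed cases creep in.
        "likely_academic_context": any(t in lowered for t in CONTEXT_TERMS),
    }


if __name__ == "__main__":
    # A research assignment can trip the same keywords as a genuine cry for help.
    print(flag_text("My essay on the novel asks why a character wants to hurt himself"))
    print(flag_text("I want to hurt myself"))
```

Even in this toy version, the same phrase triggers a flag whether it appears in a book report or in a message about a student's own state of mind, which is why vendors such as Gaggle pair the automated pass with human reviewers who judge context before contacting a district.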
Jeff Patterson, CEO and founder of the student monitoring program Gaggle, said the company’s monitoring technology has saved "more than 1,400" student lives. He said the software only flags “very specific indicators that a child is at risk of suicide, self-harm or violence.”
“If our technology flags something concerning, we have a team of human reviewers who then analyze the content to determine context and decide whether or not the content merits outreach to the school district’s emergency contact,” he said in an email to Government Technology. “At the end of the day, we believe the value of saving a child’s life should outweigh any concerns about a child’s schoolwork being monitored for threats of violence or self-harm.”
The report said companies like Gaggle tend to market themselves as ways to prevent school violence, and Reddy argued that talk of safety and violence in the context of mental health issues can lead to unwanted police involvement — a key concern among mental health and disability rights communities.
Erica Darragh, a campaigner with the digital privacy organization Fight for the Future, shared these concerns and said schools should ban the use of monitoring programs, whether for self-harm or academic proctoring.
Citing a 2016 report by the American Civil Liberties Union indicating that 14 million students attended schools with police but no counselor, nurse, psychologist or social worker, Darragh said such programs could serve to criminalize students struggling with mental health issues and feed them into the "school-to-prison pipeline."
“The potential for exacerbating harm, not only on the individual level but also the institutional level, is much greater than perceived positive outcomes,” she noted.
Linnette Attai, director of privacy initiatives at the Consortium for School Networking (CoSN), suggested that, with these concerns in mind, schools and policymakers should clarify their intervention policies and how these tools may be used.
"Before engaging with any technology company, establish the goal. What problem are you trying to solve? How will you solve for it while protecting your students from harm or other unintended consequences?” she said. “There's a line between threat monitoring and profiling or targeting. Document the policies and procedures that will keep you on the right side of that before moving forward.”