But in one instance, an elementary school student searched “how to die.”
In that case, Meghan Feby, an elementary school counselor in the district, got a phone call through a platform called GoGuardian Beacon, whose algorithm flagged the phrase. The system sold by educational software company GoGuardian allows schools to monitor and analyze what students are doing on school-issued devices and flag any activities that signal a risk of self-harm or threats to others.
The student who had searched “how to die” did not want to die and showed no indicators of distress, Feby said — the student was looking for information but in no danger. Still, she values the program.
“I’ve gotten into some situations with GoGuardian where I’m really happy that they came to us and we were able to intervene,” Feby said.
School districts across the country have widely adopted such computer monitoring platforms. With the youth mental health crisis worsened by the COVID-19 pandemic and school violence affecting more K-12 students nationwide, teachers are desperate for a solution, experts say.
But critics worry about the lack of transparency from companies that have the power to monitor students and choose when to alert school personnel. Constant student surveillance also raises concerns regarding student data, privacy and free speech.
While available for more than a decade, the programs saw a surge in use during the pandemic as students transitioned to online learning from home, said Jennifer Jones, a staff attorney at the Knight First Amendment Institute.
“I think because there are all kinds of issues that school districts have to contend with — like student mental health issues and the dangers of school shootings — I think they [school districts] just view these as cheap, quick ways to address the problem without interrogating the free speech and privacy implications in a more thoughtful way,” Jones said.
According to the most recent youth risk behavior survey from the federal Centers for Disease Control and Prevention, nearly all indicators of poor mental health, suicidal thoughts and suicidal behaviors increased from 2013 to 2023. During the same period, the percentage of high school students who were threatened or injured at school, missed school because of safety concerns or experienced forced sex increased, according to the CDC report.
And the threat of school shootings remains on many educators’ minds. Since the Columbine High School shooting in 1999, more than 383,000 students have experienced gun violence at school, according to The Washington Post’s count.
GoGuardian CEO Rich Preece told Stateline that about half of the K-12 public schools in the United States have installed the company’s platforms.
As her school’s designee, Feby gets an alert when a student uses certain search terms or combinations of words on their school-issued laptops. “It will either come to me as an email, or, if it is very high risk, it comes as a phone call,” she said.
Once she’s notified, Feby will decide whether to meet with the student or call the child’s home. If the system flags troubling activity outside of school hours, GoGuardian Beacon contacts another person in the county — including law enforcement, in some school districts.
Feby said she’s had some false alarms. One student was flagged because of song lyrics she had looked up. Another had searched for something related to anime.
About a third of the students in Feby’s school come from homes where English isn’t the first language, so students often use worrisome English terms inadvertently. Kids can also be curious, she said.
Still, having GoGuardian in the classroom is important, Feby said. Before she became a counselor 10 years ago, she was a schoolteacher. And after the 2012 mass shooting at Sandy Hook Elementary School, she realized school safety was more important than ever.
DATA AND PRIVACY
Teddy Hartman, GoGuardian’s head of privacy, taught high school English literature in East Los Angeles and was a school administrator before joining the technology company about four years ago.
Hartman was brought to GoGuardian to help with creating a robust privacy program, he said, including guardrails on its use of artificial intelligence.
“We thought, ‘How can we co-create with educators, the best of the data scientists, the best of the technologists, while also remembering that students and our educators are first and foremost?’” Hartman said.
GoGuardian isn’t using any student data outside of the agreements that school districts have allowed, and that data isn’t used to train the company’s AI, Hartman said. Companies that monitor what children do online are also required to adhere to federal laws regarding the safety and privacy of minors, including the Family Educational Rights and Privacy Act and the Children’s Online Privacy Protection Act.
But privacy experts are still concerned about just how much access these types of companies should have to student data.
School districts across the country are spending hundreds of thousands of dollars on contracts with some of the leading computer monitoring vendors — including GoGuardian, Gaggle and others — without fully assessing the privacy and civil rights implications, said Clarence Okoh, a senior attorney at the Center on Privacy and Technology at the Georgetown University Law Center.
In 2021, while many schools were just beginning to see the effects of online learning, The 74, a nonprofit news outlet covering education, published an investigation into how Gaggle was operating in Minneapolis schools. Hundreds of documents revealed how students in one school system were subject to constant digital surveillance long after the school day was over, including at home, the outlet reported.
That level of pervasive surveillance can have far-reaching implications, Okoh said. For one, in jurisdictions where legislators have expanded censorship of “divisive concepts” in schools, including critical race theory and LGBTQ+ themes, the ability for schools to monitor conversations including those terms is concerning, he said.
A report by the Electronic Frontier Foundation, a nonprofit digital rights group based in San Francisco, illustrates what kinds of keyword triggers are blocked or flagged for administrators. In one example, GoGuardian had flagged a student for visiting the text of a Bible verse including the word “naked,” the report said. In another instance, a Texas House of Representatives site with information regarding “cannabis” bills was flagged.
GoGuardian and Gaggle both also dropped LGBTQ+ terms from their keyword lists after the foundation’s initial records request, the group said.
But getting a full understanding of the way these companies monitor students is challenging because of a lack of transparency, Jones said. It’s difficult to get information from private tech companies, and the majority of their data isn’t made public, she said.
DO THEY WORK?
Years before the 2022 shooting at Robb Elementary School in Uvalde, Texas, the school district purchased a technology service to monitor what students were doing on social media, according to The Dallas Morning News. The district sent two payments to the Social Sentinel company totaling more than $9,900, according to the paper.
While the cost varies, some school districts are spending hundreds of thousands of dollars on online monitoring programs. Muscogee County School District in Georgia paid $137,829 in initial costs to install GoGuardian on the district’s Chromebooks, according to the Columbus Ledger-Enquirer. In Maryland, Montgomery County Public Schools eliminated GoGuardian from its budget for the 2024-2025 school year after spending $230,000 annually on it, later switching to Lightspeed, according to the Wootton Common Sense.
Despite the spending, there’s no way to prove that these technologies work, said Chad Marlow, a senior policy counsel at the American Civil Liberties Union who authored a report on education surveillance programs.
In a 2019 blog post describing its Bark for Schools program, content monitoring platform Bark claimed to have helped prevent 16 school shootings. Gaggle’s company website says it saved 5,790 lives between 2018 and 2023.
These figures are based on the number of alerts the systems generate indicating a student may be very close to harming themselves or others. But there is little evidence that this kind of school safety technology is effective, according to the ACLU report.
“You cannot use data to say that, if there wasn’t an intervention, something would have happened,” Marlow said.
Computer monitoring programs are just one example of an overall increase in school surveillance nationwide, including cameras, facial recognition technology and more. And increased surveillance does not necessarily deter harmful conduct, Marlow said.
“A lot of schools are saying, ‘You know what, we’ve got $50,000 to spend, I’m going to spend it on a student surveillance product that doesn’t work, instead of a door that locks or a mental health counselor,’” Marlow said.
Some experts are advocating for more mental health resources, including hiring more guidance counselors, and school policies that support mental health, which could prevent violence or suicide, Jones said. Community engagement programs, including volunteer work or community events, also can contribute to emotional and mental well-being.
But that’s in an ideal world, GoGuardian’s Hartman said. Computer monitoring platforms aren’t a complete solution to the youth mental health and violence crisis, but they aim to help, he said.
“We were founded by engineers,” Hartman said. “So, in our slice of this world, is there something we can do, from a school technology perspective that can help by being a tool in the toolbox? It’s not an end-all, be-all.”
Stateline is part of States Newsroom, a national nonprofit news organization focused on state policy.
©2024 States Newsroom. Distributed by Tribune Content Agency, LLC.