Since March, nine students have been identified through GoGuardian’s Beacon software as experiencing a severe mental health crisis and were taken to an emergency room, according to Stacey Davis, the city schools coordinator of media and instructional technology. In at least two of those cases, the students had never received any mental health care.
Two reports released in the past month question the use of such technology to track students around the country, warning that it might be used for disciplinary purposes, unintentionally out LGBTQ students or quash student expression. The studies also point out that economically disadvantaged students, whose school-owned laptop may be their only device, are likely to have their behavior tracked more often than wealthier students who own personal laptops.
“Privacy and equity was not being considered as much as it needs to be,” said Elizabeth Laird, director of equity in civic technology at the Center for Democracy and Technology in Washington and co-author of one of the reports. “Student activity monitoring is quite widespread.”
Baltimore County uses GoGuardian software, although not the company’s Beacon product, to monitor for possible self-harm. School officials said their approach to suicide prevention focuses on building relationships between students and school staff.
Harford, Howard and Carroll counties said they do not monitor student devices for warning signs of self-injury. Anne Arundel County did not respond to questions about any monitoring.
In Baltimore City, on weekends and at night when school psychologists or social workers aren’t available, school police officers have been sent to students’ homes to check on them after alerts from the software, as first reported by The Real News Network.
GoGuardian did not respond to questions about what keywords its software uses to identify students who might be planning suicide.
Davis said when a message comes in to school police, the agency’s dispatcher first tries to call the family. If there is no answer, a school police officer is sent to the home to talk to the family, a visit known as a “wellness check.”
School police Chief Akil Hamm said his officers go to the door of a home, show the parents a copy of the alert and what their child typed. Then, he said, “We ask the parents if we can lay eyes on the student.”
School police are trained in trauma-informed care and behavioral crisis response, and in how to recognize and respond to signs of a mental health crisis.
Hamm said parents are usually grateful to have been alerted.
“As they talk, they work with the guardian to determine if the alert is serious enough to require the student be taken for a mental health evaluation at Hopkins. If not, they will leave information about MD 211’s crisis line or recommend a visit with the family doctor or a walk-in clinic,” Davis said.
Police have only once taken a student to the emergency room, and it was at the request of the parents, Hamm said. Students are not handcuffed, Hamm said, and officers don’t demand to enter the homes.
The information is passed on to the principal of the student’s school, but school police don’t keep a record of it and it isn’t entered into the student’s file.
Having school police involved concerns Larry Simmons, president of the system’s Parent Community Advisory Board. Having a school police officer arrive at the door may look punitive rather than supportive, and officers carry firearms when they are off school grounds.
“School police are not social workers,” he said.
In general, Simmons said, “I would say that this is really disturbing. You have not only monitored the kid, but the family, as well.”
The monitoring grew out of school systems’ need to protect students from inappropriate material on the Internet, such as pornography. Inside schools, districts have firewalls that prevent students from using sites such as Twitter, Facebook and YouTube.
When the coronavirus pandemic hit, many school systems supplied laptops so students could study at home. To make sure students focused on their work, districts purchased monitoring software. In some cases, including Baltimore City, the systems didn’t set policies on how much of a student’s work could be seen and monitored.
Davis said the city purchased GoGuardian, which allowed teachers to watch what students were doing on their laptops while teaching remotely.
GoGuardian and Gaggle, another software company, sell an additional service that uses artificial intelligence to monitor student searches and, in some cases, their writing for evidence that they may harm themselves. Gaggle also provides students with teletherapy from counselors.
Since March, the city schools have received 786 alerts from Beacon. Of those, clinicians responded 401 times, while school police went to homes 12 times. In addition to the nine students referred to an emergency room, 12 students were referred to a crisis response center. The races and ages of the affected children were unavailable.
“Through the expansion of virtual learning, a lot of things have to be rolled out very quickly, some have unintended consequences,” said Zach Taylor, a representative of the Baltimore Teachers Union. “Good intentions and policies can have adverse effects.”
He said the school system should have an open discussion about the use of the technology.
Holly Wilcox, a professor at the Johns Hopkins Bloomberg School of Public Health, whose research focuses on suicide, said Hopkins emergency room doctors became interested in the tool after three students needing mental health care arrived in the emergency room within a short period of time.
Wilcox said the doctors contacted her and she began checking whether the use of Beacon is finding children who might not have been treated otherwise. She said she is in the early stages of looking into it and has contacted other hospitals in the region.
“I see the risks and the potential privacy concerns people have,” she said. But, “if it is going to save someone’s life and get them the help they need,” it is important to have in place.
Wilcox said she wants to determine whether students get the help they need, whether their problems could have been spotted before they reached a crisis, and which professionals will follow up on their care.
Deborah Demery, president of the Baltimore City Council of PTAs, said she is not concerned by the monitoring.
“As far as being concerned personally, I feel much better they are monitoring and they are able to get those kids the help. It is a safeguard and it is working,” Demery said.
Sharon A. Hoover, co-director of the National Center for School Mental Health at the University of Maryland, agrees, but she believes there should be guardrails on the use of the technology.
“There is some good intentions behind the technology and, at the same time, they are raising questions and concerns around privacy,” said Hoover, who is also a professor of child and adolescent psychiatry. She said society must weigh the risks of invading privacy against the benefits to the common good.
“Do I think there is some positive potential in protecting students from suicide? Yes, I do.”
©2021 Baltimore Sun. Distributed by Tribune Content Agency, LLC.