Opinion: Conflicting Views on AI Surveillance Cameras in Schools

Three Boulder, Colo., residents share their thoughts on the prospect of putting artificial intelligence-powered cameras in K-12 schools, weighing the pros of security and the cons of surveillance differently.

(TNS) — Members of our Community Editorial Board, a group of community residents who are engaged with and passionate about local issues, respond to the following question: AI surveillance cameras with facial recognition are already in use in some Colorado schools. Many more could adopt the technology next year. Your take?

John Wooden, the legendary UCLA basketball coach, once said, “The true test of a man’s character is what he does when no one is watching.” Generally, people behave better when they know someone is watching. They are less likely to commit a crime or bully a fellow student if there is evidence of their malfeasance. One point for AI cameras.

Mark Twain preached, “Sing like no one is listening, love like you’ve never been hurt, dance like no one is watching, and live like it is heaven on earth.” A great dictum for us to strive for, but few of us succeed. My private dance moves are something to behold, but I’d be mortified to have them captured on video.

When my son Derek was 8 years old, he played tackle football. He was the fastest kid on the team and therefore played tailback. After being tackled in one of his first practices, he skipped back to the huddle. He was a joyful, unselfconscious kid. The coach yelled, “No skipping in football!” Derek was a star all season long, but he didn’t skip anymore and didn’t play the next year. Will being watched all the time reduce our joy?

The ACLU is against such cameras, and I read their report. Or at least I tried to read it. The nonsense flowed early and often. They claimed no data existed as to the efficacy of these systems but then went on to spout unsubstantiated, data-less drivel about how cameras affect the LGBTQ+ community more than others. I had to stop reading as it was making me stupider by the word. The fact that the ACLU is against these cameras is a valid reason to be for the cameras. A once ethical, reasonable interest group has de-evolved into woke insanity.

Do the pros outweigh the cons? Yes, they do, since the cons are largely the bogeyman in the closet and the monster under the bed. The relevant question is then: Is the bang worth the buck, and should the state pay for it? It’s been tested satisfactorily in private schools in Colorado Springs, but it might not be for every school, and it doesn’t have to be. Let individual school systems decide for themselves, and if they opt in, fund it out of their current budget. Let the people on the ground, closest to the problem, decide.

Bill Wright, bill@wwwright.com

------------

Cameras are ubiquitous across the American landscape. It’s been decades since I’ve been pulled over by a policeman and issued a ticket for a traffic infraction, but occasionally in the mail I receive a parking violation notice. I pay it and vow to myself to be more careful.

Now with artificial intelligence on the horizon, facial recognition technology has brought us one step closer to the world our teenagers read about in Orwell’s “1984.” Kids did not have to negotiate this world in 1999. At that time, the political activist group Student Worker at Boulder High School was threatened with academic probation and suspension if members were caught demonstrating against C-SAP on school grounds. Hundreds walked out. That same group staged a sit-in and refused to leave the library after George Bush was elected, fearing an uptick in military recruitment at school. They were ordered to leave the building. They would not leave until they could speak about their concerns with elected representatives. The next morning, Mark Udall complied. I was the faculty sponsor of Student Worker, and I believed their “misbehavior” was an exercise of their First Amendment rights. There were no cameras, and no draconian measures were meted out to students.

A positive school culture is built on relationships, not on surveillance data. As a teacher, I knew the names of all 150 students I saw daily in the classroom. I sat with them at lunch, went to their football and volleyball games, high-fived them in the hallways, applauded them on stage and took their pictures at Prom. The majority of teachers did this. We did not need electronic eyes; we were the eyes. Some falsely believe that facial recognition technology is just another tech tool to ensure school safety. Watchful administrators can efficiently scan video feeds and spot those students who inflict property damage or worse. What’s wrong with that?

School climate is degraded when students are treated like inmates. Kids want to know that adults and administrators are watching out for them, not watching them. If you really want to create school safety, establish more mental health protocols, support sensible gun laws, decrease student anxiety by limiting high-stakes testing, and give students an actual voice in the governance of their school through student-initiated clubs, peer counseling and restorative justice programs.

Hopefully, after Nov. 5 students will not have to take to the streets because no one wants a Trump administration retaliating against the “enemy from within” assisted by advanced facial recognition technology.

Jim Vacca, jamespvacca1@gmail.com

------------

At first glance, this AI approach to school safety might seem to offer a way to secure these facilities without making society-wide structural changes. If pursued, it would offer a possible hiatus in the endless debates about school shootings, the inadequacy of the mental health services available to young people, and, of course, gun control. However, this comforting respite would likely be only a temporary prelude to the even worse problems to come.

Let me present just a few of the reasons that the implementation of this technology is a dangerous idea. First, this AI technique clearly constitutes a threat to every student’s (and teacher’s) privacy. Consider the possibility of surveillance equipment in bathrooms. Second, this technology does not accurately identify people of color; this alone should be a deal-breaker. Third, the technology can be misused to target particular groups of students for reasons other than school safety. For example, it could be used to identify students who fit the stereotypes associated with immigrants without documentation. The possibility of ICE raids at Boulder High should be enough for our legislators to have serious doubts about this approach.

Fourth — and this should also be a deal-breaker — there is insufficient independent research on either the effectiveness of this “safety measure” or its potential negative consequences. As far as I can tell, the study conducted by the New York State Office of Technology Services that concluded that the risks of this AI approach outweighed its benefits has been challenged only by research sponsored by the very industry that stands to profit from it. In fact, New York State banned the use of AI technology in schools following this study. Fifth, the deployment of surveillance equipment, even if used “appropriately,” can have a chilling effect on a school environment. One can easily imagine the inhibitory impact of cameras on the willingness of students to put their critical thinking into words, which should be a main educational goal.

More generally, the availability of this technology poses a massive threat to us all in the current political climate. That it is even being seriously considered for immediate use should be viewed as a warning about several dangerous trends in contemporary society. For example, the unfortunate tendency to denigrate science and the scientific method seems to underlie the widespread willingness to implement this project without sufficient evidence. Indeed, data collected by stakeholders should not be considered equivalent to those collected by neutral parties using evidence-based methods. Additionally, despite rhetoric to the contrary, we seem to have acquiesced to the widespread availability of guns and the neglect of mental health needs. Moreover, the apparent eagerness of some officials to adopt this AI technology reflects a more general willingness to accept policies that confine themselves to the symptoms of social problems rather than dealing with the causes.

The threat posed by this technology takes on greater urgency as the election approaches. Indeed, it would be a great gift to those wishing to undermine our democracy. In this context, we must ensure that those in power can be trusted to use it ethically and in the public interest. It is at the nexus between political power and the capabilities of AI that the most immediate clear and present danger may lie.

Elyse Morgan, emorgan2975@gmail.com

©2024 the Daily Camera (Boulder, Colo.). Distributed by Tribune Content Agency, LLC.