Against this backdrop, policymakers must decide how to handle competing perspectives and priorities, and where to draw the lines on if, when and how police may use the tool.
Today’s cities and states are split over police FRT. By March 2023, Vermont and 21 local governments in other states had at some point passed bans preventing law enforcement use, per digital rights advocacy group Fight for the Future. (At least one city later eased its ban.) Fight for the Future also charts 16 states where at least one state police agency is using FRT, and many more municipalities where local police do the same.
This isn’t always an all-or-nothing decision, or a fully settled one. California, New Orleans and Virginia had fully or partially banned police use of FRT, then let those prohibitions expire or replaced them with more permissive policies.
What happened in these jurisdictions, and what lessons do they offer about regulating emerging technologies?
New Orleans
New Orleans banned police use of FRT, then lifted that ban in 2022 amid rising violent crime. The city now permits narrowly defined use, with a state fusion center running searches on behalf of local police, as detailed in the sections below.
Virginia
A 2021 Virginia law blocked local and campus police from directly using FRT. Lashrecse Aird, delegate from the state’s 63rd district, was reportedly inspired to introduce the bill leading to this law after news broke that Norfolk, Va., police had been using FRT without the knowledge of most city councilors or the mayor.
Sen. Ryan McDougle strengthened Aird’s bill via amendments. He told GovTech he worried about inaccurate matches and hoped a ban would prevent harms while giving policymakers time to create a more nuanced, comprehensive approach.
Later, Sen. Scott Surovell sponsored a 2022 law that replaced that partial ban. He told GovTech new rules were needed because McDougle’s amendments weren’t sufficiently discussed and vetted before passing. Some local police accustomed to FRT were surprised by McDougle’s stronger limits, and a loophole let state police still use the tech on behalf of local departments.
The 2022 law let police use FRT for specific use cases and with certain safeguards. What safeguards should look like, however, is an ongoing discussion, with McDougle and others saying more guardrails are needed.
California
California Assemblymember Phil Ting feared FRT could enable constant mass surveillance of people going about their lives and that accuracy issues could lead to wrongful arrests. He co-sponsored a 2019 statewide restriction blocking police use, implementation or activation of “any biometric surveillance system in connection with” an officer’s camera or the data collected by such a camera. Rather than a full ban, the policy was a three-year moratorium.
“Given the interest in the technology by a number of my colleagues, we really couldn’t get anything more than three years,” Ting told GovTech.
A bid to renew the ban failed by two votes. The ban’s expiration meant “[now] we actually have no safeguards,” Ting said in March 2023. “Body cameras and facial recognition software can be used for any purpose. There are no regulations.”
Assemblymember Lori Wilson sponsored a bill to reinstate the ban on using biometric surveillance with body cameras. Ting, meanwhile, believes a moratorium or ban would struggle to clear the Legislature and instead sponsored a bill permitting police use of FRT under certain restrictions.
Common Safeguards
New Orleans’ and Virginia’s new laws, and Ting’s proposed bill, share common elements: barring police from using FRT as the sole evidence justifying an arrest, permitting FRT only for specific purposes (such as investigating certain crimes) and requiring officers to document each use of the tool.
The Information Technology and Innovation Foundation (ITIF) recommends regulating police data collection and being transparent about what data is gathered and why, how it’s protected and how long it’s stored. Officers should also be trained on new technologies before engaging them and clearly document the tools’ uses.
ITIF, which advocates for safeguards over bans, also says that police technologies deemed unfit for one department at a certain point in time could be beneficial to others or worth revisiting later. Policies permitting use also warrant reconsideration; Virginia will analyze the effectiveness, and any discriminatory impacts, of its FRT use and re-examine its law in 2025, Surovell said.
RISKS
Misidentification
FRT has often performed worse when attempting to match nonwhite people.
Between 2020 and 2022, five Black men were falsely arrested after facial recognition misidentified them, Electronic Frontier Foundation (EFF) Senior Staff Attorney Adam Schwartz told GovTech. And in 2021, a Black girl was kicked out of a roller rink after the technology mistook her for someone who’d been banned for fighting.
Police may also be more likely to draw weapons on someone whom FRT mistakenly identifies as a violent criminal, escalating tensions and raising the “risk of police violence from which people don’t come back,” said Carmen-Nicole Cox, director of government affairs at the ACLU of California, during recent testimony.
Privacy Invasion
Detractors aren’t only concerned about faulty FRT matches. Some also fear highly accurate FRT. Unless regulations forbid it, police might use FRT to track residents whenever they’re in public. Mass tracking could be abused to chill free speech and assembly, for example.
Some also worry such mass surveillance could lead to police pinpointing and pursuing people based on the anticipation that they might commit a crime. New Orleans acknowledged such predictive policing concerns with a policy limiting FRT use to investigating crimes that have already happened.
Mission Creep?
New Orleans permits using FRT to investigate any of 47 “significant” crimes, including murder, rape, “disarming of a peace officer” and purse snatching.
City councilors Lesli Harris and JP Morrell opposed lifting the ban and worried the city’s definition of significant crimes could change over time, leading FRT to be applied more broadly and invasively.
Schwartz said FRT’s potentially beneficial uses are dwarfed by the dangers the tech poses. Giving any police agency access to FRT opens the door to its increasingly expansive use, Schwartz said, pointing to precedent: “DNA was initially only being taken from people convicted of heinous crimes, but now it’s being taken from people arrested for misdemeanors.”
New Orleans Police Department (NOPD) Public Information Officer Karen Boudrie told GovTech the department does hope to grow its FRT use, should the technology become more thoroughly tested and prove helpful.
“In the future, once the technology has been better vetted and meets the standards of the NOPD, the use of other sources outside of the Louisiana State Analytical and Fusion Exchange would be beneficial,” Boudrie said. “Eventually, once the program has proven to be beneficial, expanding the authorized use of the technology to other major crimes (such as identity theft) would also be beneficial.”
Misuse and Abuse
Those wary of FRT also fear officers might misuse the tools, deliberately or accidentally. The Voice of San Diego reported that, per 2019 researcher testimony, the city’s police had used FRT up to 2.5 times more often on people who are Black.
BENEFITS
Apprehending Suspects
When violent crime rose in New Orleans, Councilmember Eugene Green and Mayor LaToya Cantrell pushed to lift the city’s FRT ban. Neither was available for comment, but Cantrell reportedly called FRT a “valuable, force-multiplying tool that will help take dangerous criminals off our streets” and called the ban lift “a tremendous stride towards greater public safety.”
Under New Orleans’ policy, the Louisiana State Analytical and Fusion Exchange (LA-SAFE) runs FRT checks on behalf of city police. Police may not use FRT matches to justify an arrest or warrant, however; they may use them only to generate leads.
The tool “provides a lead for investigators to further their investigation, helping to mitigate the possibility of investigations going ‘cold,’” Boudrie said.
Since 2022, NOPD has received approval on 10 requests to use the service, with LA-SAFE returning potential matches in four cases. In two, police were able to identify the subject through “further investigation or a completely separate investigation of a separate, but related crime” and obtain an arrest or warrant, Boudrie said. The other eight cases remain under investigation.
The technology has also seen other notable uses, including helping the FBI investigate and ultimately arrest an MS-13 gang member in 2017.
Finding Victims, Clearing Charges
Police FRT use isn’t only about seeking suspects.
Virginia’s law lets police use FRT to identify victims (including those seen in online sexual abuse material) as well as missing persons, dead bodies and people whose mental or physical condition renders them unable to identify themselves. Matches can also be used as exculpatory evidence and to find witnesses.
BEGINNING WITH A BAN
Groups like ITIF say emerging tech bans are rarely necessary, because restrictions can mitigate concerns. Others like EFF say FRT is too harmful for anything short of a ban.
Somerville, Mass., became the second U.S. city to ban FRT. It was a pre-emptive move: the technology was spreading among municipalities with little public process even as researchers reported on its biases, bill sponsor and City Councilmember Ben Ewen-Campen told GovTech. For Ewen-Campen, it was important to block use quickly rather than wait for councilors to craft more nuanced policies.
For other jurisdictions, full or partial bans were a starting point, intended to pause police use and give lawmakers time to understand the tech and the rules needed. Virginia Del. James Leftwich referred to the state’s 2021 restrictions as a “timeout.”
(WHEN) SHOULD BANS LIFT?
Not everyone sees bans as a long-term fix.
Surovell said many technologies that are now accepted as commonplace faced initial public backlash. He crafted parts of Virginia’s 2022 law by looking to earlier policies around lidar and radar procurement.
“I think a lot of the discomfort for this issue comes from people’s discomfort with new technologies,” Surovell said. “I suspect the kinds of reactions people are having, it’s probably the same thing people felt when radar was first deployed to measure the speed of cars … ‘How could some beam tell you what the speed of my car is?’ and ‘That’s an invasion of my privacy,’ and ‘Government shouldn’t know how fast my car is traveling.’”
Opinions vary over when policymakers can say an emerging technology is well enough understood to allow its use.
New Orleans considered requiring police to secure a judge’s approval before using FRT. But judges rejected the measure, saying they didn’t sufficiently understand the technology.
Policymakers also have a role to play in determining when a technology is sufficiently mature.
Surovell said many misidentification-based arrests involved “older versions of the technology” and policing errors. Virginia looks to curtail false matches by limiting police to tools that the National Institute of Standards and Technology (NIST) rates at 98 percent or higher accuracy across races and genders.
However, others say this research fails to capture how law enforcement actually uses the tools in the real world.
NIST’s accuracy ratings only assess the tools’ abilities to compare high-quality, well-lit photographs, rather than the more haphazardly shot images officers are likely to glean from social media or surveillance cameras, according to a 2023 Richmond Public Interest Law Review article from authors with public defense and privacy backgrounds. For example, when an algorithm was used to match a mugshot against a database of 12 million mugshots, it showed a 0.18 percent false positive rate; the same algorithm yielded a 7.1 percent false positive rate when comparing lower-quality surveillance or ATM camera-style photos against a smaller database of high-quality photos. California, similarly, had homed in on FRT use with body camera footage, which can be “blurry, skewed and in near-constant motion,” per the bill that sought to renew its ban.
Plus, humans play a role in selecting the photos to submit to the FRT system and in evaluating the potential matches it produces — both areas prone to human error, per the article.
NOPD, however, says safeguards can protect against this: “Officers are trained to use high-quality, forward-facing photos to ensure the best possibility of a valid match,” and are trained about “potential dangers associated” with FRT use that might impact investigations.
Do Rules Against Misuse Work?
Restrictions, like those barring arrests based solely on FRT identifications, aim to prevent misuse and mistakes.
The NOPD says it is aware of one false-positive match since it began using FRT, and that safeguards prevented a harmful result.
“Because of the safeguards in place, investigators were able to recognize the misidentification and exonerate the identified individual prior to any law enforcement action being taken,” Boudrie said.
Schwartz said that while the right regulations could safeguard police use of surveillance technologies like drones, FRT’s potential harms are too great to risk: “[With] any set of rules, there’s a danger that police are going to break the rule or find clever ways to evade the rule.” And even when a violation is caught, remedies after the fact may not undo its impact.
BIG PICTURE: BALANCING VALUES
ITIF Senior Policy Analyst Ashley Johnson believes technology bans are rarely necessary. Instead, she says use-case-based restrictions and data privacy and security rules are usually enough to safeguard even sensitive technologies’ use.
“Banning technologies outright is not the right approach,” Johnson said. “[But] there could potentially be certain use cases that certain jurisdictions might not want [or] might want to heavily restrict or create stronger rules for.”
With policymaking on FRT, “what we’re trying to solve here is the balance between privacy and public safety,” Johnson told GovTech.
She recommends looking to how governments have resolved that debate for other police technologies. For example, “we collect DNA from criminals or suspects under certain conditions. We’re not just collecting DNA from random people out on the street. … There are processes that they go through and rules for how to use that evidence.”
Sen. McDougle said that when balancing priorities, the scales should tip in favor of privacy.
“When you look at policies, try to make sure that you are putting in protections for privacy and individual rights first, and then the use [by] law enforcement second,” McDougle said. “Could law enforcement listen to everybody’s phone conversations and read everybody’s text messages, and would they find some sort of criminal activity? Likely. But we feel like those conversations are private and should be protected.”
Acknowledging some “legitimate reasons” for police FRT use, McDougle said officers should first have to obtain search warrants. That would require them to articulate to a judge the reasonable suspicion justifying the tool’s use, similar to existing processes for collecting DNA and fingerprints.
FUTURE-PROOFED FRAMEWORK
Johnson recommends policymakers prepare for unexpected technology developments by establishing data security and privacy policies whose safeguards apply broadly across technologies, including those yet to emerge. Such a foundation might let governments safely permit a tool’s use while they craft any additional, tool-specific restrictions, and could keep policies relevant as regulated technologies change and evolve.
Many cybersecurity best practices and privacy regulations already exist that could guide such efforts, Johnson said.
“Having set-in-stone regulations or policies ahead of time on exactly what data is allowed to be collected, under what circumstances, how long that data can be retained if it’s not related to an ongoing investigation, what the cybersecurity practices should look like in order to prevent that data from falling into the wrong hands — I think those are various different things that governments and police departments should be taking into account from the get-go,” Johnson said.
Schwartz, however, says that while such a framework or other limitations could help safeguard use of some emerging technologies, FRT is uniquely dangerous: “There’s not a set of protocols and safeguards that can eliminate the danger to the public,” he said.
TRANSPARENCY AND ENGAGEMENT
Transparency and community engagement are essential for government agencies considering emerging technologies.
“The decision of whether to roll out a surveillance technology is one that should not be made by the deputy police chief or assistant director of the Department of Transportation with the vendors,” but rather through a process that involves the public and their elected officials, Schwartz said.
Somerville, Mass., has since adopted a surveillance technology oversight ordinance. Under it, police seeking to acquire or use surveillance technology must provide a written explanation to the City Council, and the tools are then discussed at public meetings with expert testimony and opportunities for public comment. Afterward, the council votes on whether to approve the request. This process has seen the city approve a school’s request to install a gunshot detection system, for example.
“What’s important is that these conversations are happening in public and that elected officials and members of the public can weigh in on them,” Ewen-Campen said.
This story originally appeared in the June issue of Government Technology magazine.