Commissioners voted 5-0 after an hour of public discussion, during which the police chief addressed the accuracy of the technology and privacy concerns.
Police used a free trial of the software earlier this year and recently updated the department’s policy to add more restrictions, including limiting access to the software to command staff, permitting its use only for designated law enforcement reasons and requiring regular sample audits to ensure the policy is followed.
Police Chief Stuart VanHoozer and other police officials said they hope to mitigate the community’s concerns through their updated policy and additional outreach efforts.
“I don’t want to live in a police state,” VanHoozer said. “We’re not going to get in a situation where we think that people’s civil liberties should be trampled upon so we can solve every case that we have.”
Face recognition uses artificial intelligence to identify an individual based on unique facial features.
With Clearview AI, police can enter an image of a face into the software, which combs through billions of publicly available images on the internet and social media networks to find potential matches. The officer then evaluates the results to determine which face is a likely match and traces it back to the web source to find the individual’s name.
“This is not scraping images that you’ve posted simply on social media,” said Darin Hull, captain of the real-time crime center and the crime analysis unit. “The only way we have access to any social media images is if they choose to share their images publicly.”
The department used Clearview AI in a seven-month trial that began in January. The chief originally submitted the contract request in June, but Chairwoman Lisa Cupid, along with members of the Cobb Coalition for Public Safety, delayed the vote until questions about the technology could be answered.
One of the main concerns expressed at Tuesday’s board meeting was accuracy. VanHoozer said a report from the National Institute of Standards and Technology rates Clearview AI’s technology as more than 99% accurate across all races.
The police chief said he is strongly opposed to surveillance of the general public, and the updated policy specifies that the software will not be used to evaluate video footage or surveil the public.
“It is not where we put a face into a computer, and that computer tells us, ‘Here’s your bad guy,’ and we go get a warrant. Nothing even close to that,” VanHoozer said.
Michael McNeely, with the Cobb Coalition for Public Safety, said he believes the technology will help police solve more crimes and protect victims. He also emphasized the need for “evaluation of how the technology is being implemented to address any trends that may be taking the department in a direction that would not be in the best interest of the community.”
Fallon McClure, the ACLU of Georgia’s deputy director of policy and advocacy, said she thinks the department is on the right track with its policy.
“We would rather them just not use these technologies at all,” McClure said. “But if they are going to use them, we want these guardrails, and actually would like them to exhaust other investigative options first.”
The updated policy is still in draft form and has not yet been officially approved. It allows five to 10 people in the department access to the software and database, said police Lt. David Thorp, who oversees the day-to-day operations of the crime analysis unit and the real-time crime center.
“If we can keep it in a tight-knit group, it’s much easier to monitor and audit,” he added.
When detectives need to identify a suspect or victim and have an image, they can submit a request to the face recognition team. They must include the case number to ensure all searches involve official cases, and they also must cite a “legitimate law enforcement reason.”
An officer trained in the software then enters the image, receives the potential matches, and conducts a manual evaluation to determine which face could be a match. A different officer then conducts a second, blind search to determine whether there is a match.
If both officers have the same match, the lead can be provided to the detective, who can then use it to find additional evidence and continue with the case.
“We don’t solve a case by using facial recognition. What it does is it provides a lead,” Thorp said. “The facial identification aspect is a starting point.”
After enough evidence has been compiled, cases that use face recognition technology undergo a special supervisory review before officers seek an arrest warrant.
ACLU Attorney Nate Freed Wessler said in an interview earlier this year that the software should be “highly constrained to only the most serious types of crimes, only with a search warrant issued by a judge based on probable cause.”
But the chief said he chose not to put that restriction on his officers because all crime victims deserve justice.
“I’ve never met a rape victim who said, ‘I want you to catch him, as long as you don’t have to look at his Twitter page,’” VanHoozer said.
© 2022 The Atlanta Journal-Constitution. Distributed by Tribune Content Agency, LLC.