ACLU Calls for Safeguards in Maryland Facial Recognition Policy

(TNS) — The ACLU of Maryland is calling for safeguards to be incorporated in a statewide policy governing the use of facial recognition technology by law enforcement, saying that police use poses “significant risks” to the public.

The technology, a tool used to identify people through artificial intelligence, has led to false matches and wrongful arrests, and poses amplified risks for people of color and women, the rights organization warned in a letter sent last week. The letter further warned that use of the tool with video footage could enable “mass surveillance.”

The best solution to those risks is an outright ban, said Nate Freed Wessler, the deputy director of the ACLU’s Speech, Privacy and Technology Project, who co-authored the ACLU of Maryland’s letter. But in places like Maryland, where lawmakers are trying to regulate the technology rather than outlaw it, there are “real, serious steps” police can take to minimize risk.

“The Maryland legislature engaged in a good faith effort to address some of the harms of this technology,” Wessler said. “The state police have a chance now to become one of the leaders in the country on serious protections against abuse. … The ball is in their court.”

Earlier this year, state legislators enacted restrictions on law enforcement’s use of the technology: facial recognition results cannot serve as the sole basis for probable cause, its use is barred for lesser offenses, and the state must disclose during discovery in a criminal case if the technology was used.

The law requires that the Maryland State Police adopt and publish a model policy regarding facial recognition. That policy then would be the basis for whether local agencies could use the technology for criminal investigations.

Maryland State Police spokeswoman Elena Russo said in an email that the agency’s model policy would be completed by Oct. 1.

“The Department is committed to ensuring the policy will reflect the values and expectation of our communities while protecting the constitutional rights of the citizens we serve and protect,” Russo’s statement said.

The ACLU of Maryland wrote in its letter Thursday to the state police that it hopes to see three “minimum protections” included in the state policy:

  • Clarification that a “lineup” generated using facial recognition technology may not be the independent evidence required for probable cause or positive identification;
  • A prohibition on the use of facial recognition for identifying or tracking people captured in recorded video footage; and
  • A prohibition on contracting with companies that use facial recognition databases with images collected without parties’ consent.

According to the ACLU, the state policy needs to go further in laying out what additional evidence is required to reach the probable cause standard because a false facial recognition match could “bias” subsequent identifications, such as someone reviewing a photographic lineup.

“When face recognition algorithms get it wrong, they still are spitting out images that look like the suspect. Doppelgangers, basically. Lookalikes,” Wessler said. “So now, you are presenting a witness with an image of someone who looks like the suspect, often a lot like the suspect, but isn’t actually the suspect. And unsurprisingly, people get it wrong.”

That was the case for Porcha Woodruff, a Detroit woman arrested for a carjacking despite being eight months pregnant, after a victim identified her photo in a lineup. Charges against her have since been dismissed and she sued last year. The city has since adopted new rules for how its police department can use the technology, as part of a settlement agreement reached in a separate lawsuit.

One of Detroit’s new regulations: Police cannot show images of someone identified through facial recognition technology unless there is other evidence linking them to the crime.

The ACLU of Maryland pointed to that policy in its letter, writing that it wants Maryland’s policy to specify that an arrest or arrest warrant must include “additional, independently obtained evidence” outside of a lineup and a lead from facial recognition technology. It further wants the policy to require a supervisor to check that there is an “independent basis” for suspecting someone was involved in a crime before a photographic lineup or other identification.

Wessler, who worked on the Detroit lawsuit that led to the city’s policy change, called the suggestion one of the most important things that police can do to ensure they don’t get led astray by false matches.

Additionally, the ACLU wants to see Maryland further restrict the use of facial recognition technology on recorded video. The state’s newly passed legislation bars the use of facial recognition technology for “live or real-time identification.” But the organization hopes to prevent video footage, including from police body cameras, from being analyzed, which it wrote could lead to “automated tracking or identification of individuals’ movements, activities or associations over time.”

Finally, it wants Maryland to ensure law enforcement agencies don’t contract with or purchase facial recognition technologies that collected images in violation of federal or state law, or without consent.

State law allows police to use the state driver’s license database, law enforcement mugshot databases and other matching databases, so long as the law enforcement agencies’ agreements with those databases include provisions “governing the methods by which the images in the database are collected.”

Maryland already has seen at least one wrongful arrest attributed in part to facial recognition technology, which was first reported by Wired and The New Yorker. Baltimore County State’s Attorney Scott Shellenberger said last year that facial recognition technology was used to identify a suspect in an assault. A photo of one of the potential matches was shown to that man’s probation officer, who misidentified him as the suspect. He was taken into custody until his wife convinced police he was not the perpetrator.

Shellenberger also helped develop the legislation in a working group where people hashed out their differences and reached a compromise. On Monday, he called the legislation a “comprehensive statute” that strikes a balance between the needs of law enforcement and the concerns of people worried about privacy. He also noted that there are checks on its use, and a requirement that it can’t serve as the only evidence in a case.

For police, he said, the tool represents a modern way of identifying people suspected of dangerous crimes.

“Twenty or thirty years ago, you sat down at a police station and looked at a mugshot book. Now, we have a computer that can do that — and look at driver’s licenses and not be limited to the state of Maryland.”

Shellenberger, who said he hadn’t seen the ACLU’s letter, argued further changes shouldn’t be necessary since everyone worked closely on the final product.

“I don’t necessarily think we have to do any add-ons at this point,” he said. “All the limitations are already in the statute. There’s not a need to add to that. [The model policy] should follow the letter of the law.”

Experts in privacy, surveillance and technology policy praised the safeguards suggested by the ACLU of Maryland.

Jake Laperruque, deputy director of the Security and Surveillance Project at the Center for Democracy and Technology in Washington, D.C., said the group’s suggestions would “promote responsible use and better protect public safety.” He called facial recognition a double-edged sword — dangerous when it makes mistakes, which can lead to wrongful arrests, and dangerous when it accurately identifies individuals and is used to track and retaliate against people.

Laperruque suggested a further step could be limiting police use of facial recognition unless they have a warrant, as other states have required.

Jeramie Scott, the director of the project on surveillance oversight at the Electronic Privacy Information Center, or EPIC, said the ACLU’s points appear designed to reduce the chances of “worst-case scenarios.”

Scott said he would advocate for a full ban, but that to the extent the technology is used, such safeguards are important. He called, too, for “extensive training” for the people who analyze the results of a search.

© 2024 Baltimore Sun. Distributed by Tribune Content Agency, LLC.