Amid global protests sparked by the police killings of George Floyd, Breonna Taylor and other Black people, companies have come under increasing scrutiny for selling technologies to law enforcement agencies, with a particular focus on facial recognition, which has been shown to misidentify people of color more frequently than white people and has been used to identify protesters.
“It took two years for Amazon to get to this point, but we’re glad the company is finally recognizing the dangers face recognition poses to Black and Brown communities and civil rights more broadly,” said Nicole Ozer of the American Civil Liberties Union (ACLU) of Northern California in a statement.
Amazon’s decision followed IBM’s move earlier this week to stop selling its general-purpose facial-recognition and analysis software, and to discontinue research on it. Google, too, stepped back from the technology, pending additional regulation. Microsoft, which also has sold the technology to police and has advocated for government regulation, did not return a request for comment on Wednesday.
Facial-recognition technology attempts to establish identity by matching the facial features of an individual captured in a digital image, such as a frame of surveillance video, against those stored in a database, which could be populated with police booking photos or social media posts.
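At its core, that matching step is a search over stored feature vectors. The sketch below is purely illustrative and is not Amazon's implementation: it assumes a hypothetical feature-extraction model has already converted each photo into a numeric vector, and it compares a probe vector against a gallery using cosine similarity, keeping only candidates above a chosen threshold.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face-feature vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(probe: np.ndarray, gallery: dict, threshold: float = 0.99):
    """Return gallery identities whose stored vectors resemble the probe.

    `probe` is a feature vector extracted from the query image (for example,
    a surveillance still). `gallery` maps an identity label -- a booking photo
    or social-media profile in the scenarios described above -- to its stored
    vector. Only candidates scoring at or above `threshold` are kept, best
    match first.
    """
    scored = [(identity, cosine_similarity(probe, vec)) for identity, vec in gallery.items()]
    return sorted([pair for pair in scored if pair[1] >= threshold],
                  key=lambda pair: pair[1], reverse=True)
```

Real systems differ in how features are extracted and how scores are calibrated; the threshold here stands in for the kind of confidence setting discussed later in this article.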
ACLU researchers in 2018 discovered that Amazon was selling its version of the technology, known as Rekognition, to police, though law enforcement agencies have had similar technologies at their disposal for much longer. Last week, as Amazon and its executives expressed support for Black Lives Matter, the ACLU tweeted, “Will you commit to stop selling face recognition surveillance technology that supercharges police abuse?”
Researchers at the Massachusetts Institute of Technology have demonstrated disparate error rates in facial-recognition systems, including Rekognition. Amazon asserted the researchers did not use its service properly. A federal government research laboratory also found facial-recognition systems tend to misidentify Black women more frequently than white men, among many other “demographic differentials.” Native Americans were most frequently misidentified, researchers at the National Institute of Standards and Technology reported late last year. Amazon did not submit its technology for testing.
Facial-recognition technology was used by Baltimore police in 2015 to identify protesters after the death of Freddie Gray in police custody.
Amazon said Rekognition will continue to be made available to organizations that use it “to help rescue human-trafficking victims and reunite missing children with their families.” The Seattle-based company has said more than 100 children have been reunited with their families with help from its technology since its introduction in 2016.
Amazon made no statement about home-surveillance cameras and other technologies sold through its Ring subsidiary, which works closely with police departments and has also faced criticism.
“We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge,” the company said in a blog post Wednesday afternoon. “We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested.”
An Amazon Web Services spokesperson did not respond to questions about how many police departments use Rekognition, and whether the one-year pause on police use of it includes other government agencies such as Immigration and Customs Enforcement – a target of Amazon employee protests.
Ozer said the ACLU wants Amazon to go further, noting that threats posed to civil rights “will not disappear in a year.
“Amazon must fully commit to a blanket moratorium on law enforcement use of face recognition until the dangers can be fully addressed, and it must press Congress and legislatures across the country to do the same,” she said. “They should also commit to stop selling surveillance systems like Ring that fuel the over-policing of communities of color.”
Cities including Oakland and San Francisco have banned the use of facial-recognition technology by governments; Boston is considering such a ban. Earlier this year, Washington state passed a law, taking effect next July, that requires training on and testing of facial-recognition technology used by government agencies and places some limitations on what they can do with it. For example, law enforcement agencies can’t use the technology as the sole basis to establish probable cause in a criminal investigation.
Last month, Amazon shareholders voted down a proposal seeking a report “to determine whether customers’ use of its surveillance and computer vision products or cloud-based services contributes to human rights violations.”
The shareholder proposal said the use of Amazon’s technologies in law enforcement and immigration contexts “that have existing systemic inequities may replicate, exacerbate, and mask these inequities. It may also compromise public oversight and contribute to widespread government surveillance.”
The company’s board of directors opposed the resolution, noting that all technologies have the potential to be misused, and that potential “should not prevent us from making that technology available.”
In early 2019, Amazon published guidelines for lawmakers seeking to regulate the technology, including recommendations that law enforcement configure the technology to return only matches at a 99% confidence threshold or higher; that agencies be transparent in how they use the technology; and that notice be given when the technology is used in public or business settings.
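For illustration only, here is how such a floor might be expressed when querying Rekognition through Amazon's public boto3 SDK, where the FaceMatchThreshold parameter sets the minimum similarity a candidate must reach to be returned. The collection, bucket, and file names are placeholders; this sketch does not depict any agency's actual configuration.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

# Hypothetical collection and image names, used only to show where a 99%
# threshold is applied when searching a face collection.
response = rekognition.search_faces_by_image(
    CollectionId="example-face-collection",
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "probe.jpg"}},
    MaxFaces=5,
    FaceMatchThreshold=99,  # drop any candidate scoring below 99
)

for match in response["FaceMatches"]:
    print(match["Face"]["FaceId"], match["Similarity"])
```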
The company’s board said its proposed national legislative framework would “help protect individual civil rights and ensure that customers are transparent in their application of the technology.”
The Washington County Sheriff’s Office in Oregon was one of the first users of Rekognition and is one of dozens of customers from a variety of industries featured by Amazon. The law enforcement agency serving suburban Portland reportedly paid Amazon about $700 to upload a database of photos and about $7 a month to conduct searches.
A spokesman for the office said Wednesday afternoon the agency had just learned of Amazon’s decision. “During the moratorium, we’re going to suspend its use,” Sgt. Danny DiPietro said.
The Seattle Police Department stopped using facial-recognition technology sometime in 2018, a spokesman for the department told The Seattle Times last year.
©2020 The Seattle Times. Distributed by Tribune Content Agency, LLC.