
FTC Blocks Retailer from Using Facial Recognition in Stores

Federal authorities said Tuesday that Rite Aid will be banned from using facial recognition software for the next five years to settle claims that its anti-shoplifting technology unfairly and illegally tagged women and people of color.

A security guard at a Rite Aid in Manhattan opens the automatic door manually to limit the number of patrons in the store during COVID-19 protocols.
Shutterstock/Joe Tabacca
(TNS) — Federal authorities said Tuesday that pharmacy chain Rite Aid will be banned from using AI-powered facial recognition software for the next five years to settle claims that its technology — used to prevent shoplifting in stores — unfairly and illegally tagged women and people of color.

Most of the stores with the facial recognition systems installed between 2012 and 2020 were located in the San Francisco, Los Angeles and Sacramento areas, according to a Federal Trade Commission complaint, along with other East and West Coast cities.

There are more than a dozen Rite Aid locations in Alameda, Contra Costa, Marin and San Mateo counties, according to Google Maps, although none currently in San Francisco.

The chain filed for bankruptcy earlier this year, closing 31 stores across California. Rite Aid said in court filings earlier this year it planned to shutter 154 of its roughly 2,000 stores nationwide, with most of the closures coming in California, New York and Pennsylvania.

The company allegedly used the technology from two unnamed vendors to identify customers who may have shoplifted from the store in the past or engaged in other problematic behavior.

The FTC said that instead, non-white customers were routinely flagged by the systems, which matched photos and security camera footage to a database of often grainy images linked with names, dates of birth, and other personal information the company maintained.

"As a result of Rite Aid's failures, Black, Asian, Latino, and women consumers were especially likely to be harmed by Rite Aid's use of facial recognition technology," the agency said.

According to the FTC's statement, the technology performed so poorly that "Rite Aid's facial recognition technology told employees that just one pictured person had entered more than 130 Rite Aid locations from coast to coast more than 900 times in less than a week."

When a person entering the store supposedly matched an image in Rite Aid's database, an employee would get an alert on their company cell phone and would ask the person to leave or call the authorities, federal regulators said.

Rite Aid largely denied the allegations, even as it agreed to stop using the technology.

"We respect the FTC's inquiry and are aligned with the agency's mission to protect consumer privacy. However, we fundamentally disagree with the facial recognition allegations in the agency's complaint," the company said in a statement posted on its website.

The company said the allegations relate to a facial recognition pilot program it deployed in only a limited number of stores, and which it stopped using more than three years prior to the federal investigation.

A report from nonprofit Fight for the Future found that retailers including Home Depot, Macy's, and Albertson's also use some form of facial recognition technology, mostly in an effort to control shoplifting.

Facial recognition systems have long struggled to correctly identify people of color or have returned false matches, frequently failing to differentiate among non-white people.

An ACLU letter protesting the decision to use a form of the technology to proctor the state bar exam during the pandemic pointed to its many supposed flaws.

"Facial recognition has been repeatedly demonstrated to be less accurate when used to identify Black people, people of Asian descent, and women," the civil rights group wrote at the time.

The FTC said that Rite Aid ignored how false positives might negatively affect its customers, and failed to test the system for accuracy, uploading low-quality images and neglecting to train its staff on how to properly use the technology.

The company also did not tell customers that it was using the technology in its stores — which New York City now requires businesses to do — and discouraged employees from informing customers about it, the FTC said.

The identities of the two vendors Rite Aid contracted with for its facial recognition system were not disclosed in the FTC finding, but numerous firms in the Bay Area and elsewhere make similar technology.

Companies including HyperVerge in Palo Alto and Sensory in Santa Clara offer tools that can recognize people based on their unique facial characteristics.

©2023 the San Francisco Chronicle, Distributed by Tribune Content Agency, LLC.