The House Homeland Security Committee will hear testimony from the National Institute of Standards and Technology, whose recent report found that, depending on the algorithm used, facial recognition software can produce "false positives," in which the software wrongly matches photos of two different people as the same person, particularly for Native Americans, Asians and African Americans, including women of color.
"The Department of Homeland Security's increasing use of facial recognition technology is quickly turning U.S. airports into surveillance hubs," said Sen. Edward Markey, D-Mass. "I'm working on legislation to force DHS to put a moratorium on its use of this technology until it enacts enforceable rules governing biometric data collection because the risks to our civil rights are just too high."
DHS referred requests for comment to U.S. Customs and Border Protection, which did not respond. But Massachusetts State Police spokesman David Procopio said that facial recognition can be an important tool for law enforcement.
"It is applied to images of a suspect obtained through prior investigative steps," Procopio said, "and any potential suspect identifications made with the technology are subsequently confirmed or rejected by other investigative steps."
But technology consultant Kate O'Neill, founder of KO Insights, said that society has "neither the regulations nor the infrastructure to require that people first opt in to facial recognition, let alone the public education necessary to allow people to make informed choices about the trade-offs of security versus privacy that would go into a wide-scale deployment."
"It's easy for law enforcement and government agencies to claim that tools like facial recognition make people safer," O'Neill said. "But that's not true if the technology misidentifies people...What's more, it is a clear expansion of a police surveillance state, which inherently limits civil liberties."
©2020 the Boston Herald. Distributed by Tribune Content Agency, LLC.