A news release from Axon, formerly known as Taser, said the decision was based on a 42-page report from the Axon AI and Policing Technology Ethics Board, an independent body of 11 experts in artificial intelligence, computer science, privacy, law enforcement, civil liberties and public policy that advises the company on ethical issues.
Addressing a potential source of confusion, the report distinguishes three types of face-recognition technology applied to photos or videos: face matching, which identifies a person by matching their face to a photo in a database; face detection, which identifies the presence of a face in an image; and face re-identification, which identifies every time the same face reappears in a video.
The board cited “serious concerns” about face-recognition technology in general, repeatedly stressing that, at present, the algorithms it relies on are neither accurate nor equitable enough to be harmless. It pointed to problems of false identification and racial disparities in accuracy, but warned that even if those improve, the technology could still record where people are, where their kids go to school, what groceries they buy or what meetings they attend.
This could invite government surveillance and intrusion into people’s lives, or worse abuses under foreign governments not bound by U.S. law.
According to the report, Axon has had different levels of interest in face matching, detection and re-identification. The company acquired a couple of AI startups in 2017 that had done work on identifying objects in videos. Axon told the board that it was not working on any face-matching products but keeping a close eye on the latest research.
As anyone on social media knows, basic face detection has been widely commercialized for years. By combining face detection with re-identification, Axon is developing products to help with the redaction of camera footage, blurring out all recurring faces in a video so it can be made public. The board endorsed that work.
In general, the board recommended caution and more public discussion wherever public agencies are considering face-recognition technology. For Axon, it offered a handful of key conclusions:
- Face-recognition technology isn’t reliable enough for use in body cameras.
- Face-recognition technology should not be highly customizable by users, because customization makes misuse easier and more severe.
- No one should adopt face-recognition technology without public input.
- Anyone developing new face-recognition products should have evidence of their benefits.
- Any discussion of benefits and use cases for face-recognition technology should take into account “the realities of policing in America” and existing technological limitations.