Face recognition CEO says tech not ready for policing
26 June 2018 11:46 GMT

Face recognition will be used to harm citizens if given to governments or police, writes Brian Brackeen, CEO of the face recognition and AI startup Kairos, in an op-ed published by TechCrunch today. Last week, news broke that bodycam maker Axon requested a partnership with Kairos to explore face recognition. Brackeen declined, and writes today that “using commercial facial recognition in law enforcement is irresponsible and dangerous.”

“As the Black chief executive of a software company developing facial recognition services, I have a personal connection to the technology both culturally, and socially,” Brackeen writes. Face recognition is one of the most contentious areas in privacy and surveillance studies, because of issues of both privacy and race. A study by MIT computer scientist Joy Buolamwini published earlier this year found face recognition is routinely less accurate on darker-skinned faces than it is on lighter-skinned faces. A serious problem, Brackeen reasons, is that as law enforcement relies more and more on face recognition, the racial disparity in accuracy will lead to consequences for people of color. 

“The more images of people of color it sees, the more likely it is to properly identify them,” he writes. “The problem is, existing software has not been exposed to enough images of people of color to be confidently relied upon to identify them. And misidentification could lead to wrongful conviction, or far worse.”

Law enforcement agencies in the U.S. have increasingly relied on face recognition, celebrating the tech as a public safety service. Just last week, Amazon employees rallied against the use of Rekognition, the company’s face recognition technology, by police. Face scans, once optional for U.S. citizens, are now mandatory at the Orlando Airport for all international travelers. And CBP has moved to institute face recognition at the Mexican border. In areas where identifying yourself is tied to physical safety, any inaccuracies or anomalies could lead to secondary searches and more interactions with law enforcement. If non-white faces are already more heavily scrutinized in high security spaces, face recognition will only add to that.

“Any company in this space that willingly hands this software over to a government, be it America or another nation’s, is willfully endangering people’s lives,” concludes Brackeen. “We need movement from the top of every single company in this space to put a stop to these kinds of sales.”
