IBM Stops Facial Recognition Unit Over Bias And Inequality

Whatever IBM’s motive for dumping its facial recognition business, it makes a bold statement to the entire industry, including Amazon and Google, that this technology has no place in law enforcement in the first place. ⁃ TN Editor

IBM CEO Arvind Krishna announced today that the company would no longer sell facial recognition services, calling for a “national dialogue” on whether it should be used at all. He also voiced support for a new bill aiming to reduce police violence and increase accountability.

In a letter reported by CNBC, written in support of the Justice in Policing Act introduced today, Krishna explains the company’s exit from the controversial business of facial identification as a service.

IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and Principles of Trust and Transparency. We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies.

This careful approach to developing and deploying the technology is not a new one: IBM last year emphasized it with a new database of face data that was more diverse than anything available at the time. After all, like any program, these systems are only as good as the information you feed into them.

However, facial recognition does not seem to have been making the company much money, if any. To be fair, the technology is really in its infancy, and there are few applications where an enterprise vendor like IBM makes sense. Amazon’s controversial Rekognition service, while it has been tested by quite a few law enforcement entities, is not well thought of in the field. It would not benefit IBM much to compete with a product that is similarly just barely good enough to use.

Krishna’s letter also says that “vendors and users of AI systems have a shared responsibility to ensure that AI is tested for bias, particularly when used in law enforcement, and that such bias testing is audited and reported.” This is something of a parting shot to those in the field, Amazon in particular, that have been called out for the poor quality of facial recognition systems but have not ceased to market them.

It’s unclear whether or how the company will continue to perform AI research along these lines.

Read full story here…
