An unusual consensus emerged recently among artificial intelligence researchers, activists, lawmakers and many of the largest technology companies: Facial recognition software breeds bias, risks fuelling mass surveillance and should be regulated. Deciding on effective controls and acting on them will be a lot harder.
The Algorithmic Justice League and the Center on Privacy & Technology at Georgetown University Law Center unveiled the Safe Face Pledge, which asks companies not to provide facial AI for autonomous weapons or sell to law enforcement unless explicit laws are debated and passed to allow it.
Microsoft Corp. said the software carries significant risks and proposed rules to combat the threat. Research group AI Now, which includes AI researchers from Google and other companies, issued a similar call.
“Principles are great — they are starting points. Beyond the principles we need to be able to see actions,” said Joy Buolamwini, founder of the Algorithmic Justice League. None of the biggest makers of the software — companies like Microsoft, Google, Amazon.com Inc., Facebook Inc. and IBM — has signed the Safe Face Pledge yet.
Large tech companies may be reluctant to commit to a pledge like this, even if they’re concerned about negative consequences of the software. That’s because it could mean walking away from lucrative contracts for the emerging technology. The market for video surveillance gear is worth $18.5 billion a year, and AI-powered equipment for new forms of video analysis is an important emerging category, according to researcher IHS Markit. Microsoft and Facebook said they’re reviewing the pledge. Google declined to comment.
“There are going to be some large vendors who refuse to sign or are reluctant to sign because they want these government contracts,” said Laura Moy, executive director of the Center on Privacy & Technology.
The use of facial recognition for surveillance, policing and immigration is being questioned because researchers, including Buolamwini, have shown the technology isn’t accurate enough for critical decisions and performs worse on women and people with darker skin.
Providers have responded differently to the scrutiny. Microsoft is defending government contracts generally, while asking for laws to regulate the space. Amazon took issue with research by the ACLU into the Rekognition programme it sells to police departments, but the company has also said it’s working to better educate police on how to use the software.
Companies including Microsoft, Facebook and Axon, a maker of police body cameras, have formed AI ethics boards and Google published a set of more-general AI principles in June.
The Safe Face Pledge asks companies to “show value for human life, dignity and rights, address harmful bias, facilitate transparency” and make these commitments part of their business practices. This includes not selling facial recognition software to identify targets where lethal force may be used. The pledge also commits companies to halt sales of face AI products that are not “subject to public scrutiny, inspection, and oversight.”
There are also commitments to internal bias reviews as well as checks by outside experts, along with a requirement to publish easy-to-understand information on how these technologies are used and by which customers. Startups Simprints Technology, Robbie AI Inc. and Yoti Ltd. were the inaugural signers of the pledge.
“It’s kind of the wild west when it comes to use of automated facial analysis technology, and it’s also an area that’s shrouded in secrecy,” Moy said. The Safe Face Pledge tries to address both areas, but Moy also believes new laws are needed.
That’s where Microsoft is focussing its attention. The company detailed the laws it would like to see passed. Microsoft President and Chief Legal Officer Brad Smith put the chances of federal legislation in 2019 at 50-50, most likely as part of a broader privacy bill. But he said there’s a far better shot at getting something passed in a state or even a city next year. If it’s an important enough region, say California, that would probably be enough to make software sellers change their products and practices overall, he said.
Microsoft said it will turn down AI contracts where it has concerns, and has already done so. Smith wouldn’t specify which deals it has rejected, and he has also said Microsoft will continue to be a key vendor to the US government.
“We’ve turned down business when we thought there was too much risk of discrimination, when we thought there was a risk to the human rights of individuals,” Smith said.
In contrast, Amazon thinks it’s too soon to regulate. “There are many positive and important uses of this technology that are being implemented today, to include preventing human trafficking, reuniting missing children with their parents, and improving security,” the company said.
Google’s cloud unit won’t sell a type of facial recognition tech
Google said its cloud business won’t sell a general type of facial recognition software until questions around the controversial technology have been answered.
“Facial recognition merits careful consideration to ensure its use is aligned with our principles and values, and avoids abuse and harmful outcomes,” said Kent Walker, Google’s head of global affairs. “Unlike some other companies, Google Cloud has chosen not to offer general-purpose facial recognition APIs before working through important technology and policy questions.”
Google’s pronouncement comes as a rare consensus forms among researchers, activists and lawmakers that facial recognition can be biased, support mass surveillance and should be regulated.
Alphabet Inc.’s Google is in an arms race with other tech giants to develop the best artificial intelligence and sell it in the form of services delivered over the internet.
But some of Google’s work has sparked protests by employees, activists and privacy experts. The company let a Pentagon AI contract lapse this year after an outcry.
Google’s blog post didn’t specify what questions it wants answered before it will proceed, or say whether certain customers will remain off-limits.