Earlier this year, Google CEO Sundar Pichai described artificial intelligence as more profound for humanity than fire. On Thursday, after protests from thousands of Google employees over a Pentagon project, Pichai offered guidelines for how Google will—and won’t—use the technology. One thing Pichai says Google won’t do: work on AI for weapons. But the guidelines leave much to the discretion of company executives and allow Google to continue working for the military.
The ground rules are a response to a letter, signed by more than 4,500 Googlers, protesting the company’s involvement in a Pentagon project called Maven, which uses machine learning to interpret drone surveillance video.
The dissenting employees asked Google to swear off all military work. Pichai’s response? We hear you, but you can trust us to do this responsibly. “We will continue our work with governments and the military in many other areas,” Pichai’s post says. “These collaborations are important and we’ll actively look for more ways to augment the critical work of these organizations and keep service members and civilians safe.”
That could be read as allowing continued work on Project Maven. But Google told employees last week that it would not renew the Maven contract when it expires next year. And a spokesperson said Thursday that if Maven came up again today, the company likely wouldn’t participate, because the project doesn’t align with the spirit of the new guidelines.
Although prompted by the protest over Maven, the guidelines posted Thursday address a much broader range of concerns. Google pledges to avoid creating systems that reinforce societal biases around gender, race, or sexual orientation, for example. The guidelines also say privacy safeguards should be built into AI technologies, which often derive their power from training on vast data sets like those Google holds on its billions of users.
“How AI is developed and used will have a significant impact on society for many years to come,” Pichai writes in an introduction to the guidelines. “As a leader in AI, we feel a special responsibility to get this right.” Google also released a set of “recommended practices” on topics such as fairness and testing AI systems to help other companies use and develop AI responsibly.
Project Maven is a broad effort at the Department of Defense to inject more artificial intelligence into operations. Its first project involved using AI to track objects such as buildings, vehicles, and people in drone video. Google’s precise role in the project is not clear, but the company has maintained that its work was limited to “non-offensive purposes.”
Google’s new guidelines build on that line, saying it won’t apply AI to weapons or “other technologies whose principal purpose or implementation is to cause or directly facilitate injury to people.” The document also states that Google won’t work on surveillance technology “violating internationally accepted norms of human rights.”
One Google employee, reached Thursday before the guidelines were released, said that any such rules would be hard to trust if interpreted and enforced only internally. External oversight would be needed to reduce the risk of business concerns skewing decision making, the person argued. Google is currently bidding for a multibillion-dollar Pentagon cloud computing contract called JEDI.
Peter Eckersley, chief computer scientist at the Electronic Frontier Foundation, agrees that Google should get outside help. “If any tech company is going to wade into a morally complex area like AI defense contracting, we'd recommend they form an independent ethics board to help guide their work,” he says. “Google has a real opportunity to be a leader on AI ethics here, and they shouldn't waste it.”
In a letter to Google cofounder Sergey Brin released Thursday, Faisal bin Ali Jaber, who says his brother-in-law Salem was killed by a US drone strike in Yemen, offered himself as an external ethics adviser to Google on future defense work. “Google can protect people like Salem rather than making it easier to kill them,” he wrote. “Let us discuss how Google can set ethical standards in this area for other companies to follow, rather than be co-opted by government.”