Google sold “emotion detection” AI to Israel
Jerusalem24 – The Intercept – Google is offering advanced artificial intelligence (AI) and machine-learning capabilities, including so-called “sentiment detection”, to the Israeli government through its “Project Nimbus” contract.
Google and Amazon landed a joint $1.2 billion contract to provide cloud computing services to the Israeli government in April 2021. A new tender for “Layer 2” of the project is currently open on the Israeli Government Procurement Administration website.
Google employees have condemned the contract and expressed concern that the technology they developed would be used to intensify surveillance of Palestinians and aid the ongoing Israeli occupation.
The Intercept analyzed training documents and videos intended for Nimbus users which show that Google is providing the Israeli government with the full suite of machine-learning and AI tools available through Google Cloud Platform.
Many of the capabilities outlined in the documents obtained by The Intercept could easily augment Israel’s ability to surveil people and process vast stores of data – already prominent features of the Israeli occupation.
While the documents provide no specifics as to how Nimbus will be used, they indicate that the new cloud would give Israel capabilities for facial detection, automated image categorization, object tracking, and even “sentiment analysis” that claims to assess the emotional content of pictures, speech, and writing.
[box type="shadow" align="" class="" width=""]What is “sentiment detection”?
Sentiment detection is an increasingly controversial and discredited form of machine learning. Google claims that its systems can discern inner feelings from a person’s face and statements, a technique widely rejected as invasive and pseudoscientific, regarded as little better than phrenology. In June, Microsoft announced that it would no longer offer emotion-detection features through its Azure cloud computing platform — a technology suite comparable to what Google provides with Nimbus — citing the lack of scientific basis.
Source: The Intercept[/box]
“A natural fit for military and security applications”
Google workers shared with The Intercept particular concerns over the Cloud Vision API technology being sold to Israel: “Vision API is a primary concern to me because it’s so useful for surveillance,” said one worker, who explained that the image analysis would be a natural fit for military and security applications. “An AI can comb through collected surveillance feeds in a way a human cannot to find specific people and to identify people, with some error, who look like someone. That’s why these systems are really dangerous.”
Google has placed some limits on Vision – for instance, restricting it to face detection, which determines only whether an image contains a face, rather than facial recognition, which would identify a specific person. The AutoML tool that Google offered to Israel through Project Nimbus, however, would allow Israel to leverage Google’s computing capacity to train new models on its own government data for virtually any purpose it wishes.
In one Nimbus webinar reviewed by The Intercept, the potential use and misuse of AutoML was illustrated during a Q&A session following a presentation. An unnamed member of the audience asked the Google Cloud engineers present on the call whether it would be possible to process data through Nimbus in order to determine if someone is lying. “I’m a bit scared to answer that question,” said the engineer conducting the seminar, in an apparent joke. “In principle: Yes. I will expand on it, but the short answer is yes.”
Spurious “ethical” concerns
“Google’s machine learning capabilities, along with the Israeli state’s surveillance infrastructure, pose a real threat to the human rights of Palestinians,” Damini Satija, who leads Amnesty International’s Algorithmic Accountability Lab, told The Intercept.
While companies like Microsoft have pulled certain AI technologies – including “emotion-detection” features – over purported ethical concerns, specific provisions in Israel’s contract with Google prevent the company from shutting down Israel’s use of its products even if Google wishes to do so in the future.
Jathan Sadowski, a scholar of automation technologies and research fellow at Monash University, told The Intercept: “The way that Israel is locking in their service providers through this tender and this contract, I do feel like that is a real innovation in technology procurement.”
Critics of Google’s contract with Israel have repeatedly pointed out that the innocuous-seeming tender for “cloud computing services” may foster other ambitions.
According to Liz O’Sullivan, CEO of the AI auditing startup Parity and a member of the US National Artificial Intelligence Advisory Committee, “Countries can absolutely use AutoML to deploy shoddy surveillance systems that only seem like they work. On edge, it’s even worse – think bodycams, traffic cameras, even a handheld device like a phone can become a surveillance machine and Google may not even know it’s happening.”
The Intercept concludes its investigation by quoting Sadowski: “These are not technologies that are just neutral intelligence systems, these are technologies that are ultimately about surveillance, analysis, and control.”