The impact of the occupation on Palestinians' lives is already horrible—forced segregation, daily humiliation at military checkpoints, and the constant fear of being displaced as illegal settlements encroach on people's homes.
So why is Microsoft enabling a brutal occupation by funding AnyVision—a facial recognition company which helps the Israeli army surveil Palestinians’ every move?
Microsoft engages in a lot of talk about practicing good tech ethics—but ever since it started shelling out millions to fund invasive tech targeting Palestinians, it has remained curiously silent.1
Microsoft has made a lot of noise recently about how they’re the one tech giant taking ethical concerns about facial recognition & other surveillance technology seriously. They widely publicized a set of six “moral standards” they’d apply to funding and developing any such tech.1
Microsoft’s multi-million dollar investment in AnyVision was made on the condition that AnyVision agree to those six principles. A month later, it came out that the Israeli army is using AnyVision’s facial recognition tech throughout the West Bank.2
One of Microsoft’s “moral standards” reads:
“We will not deploy our facial recognition technology in those surveillance scenarios where we believe there are inadequate safeguards to protect democratic freedoms and human rights.”3
But under Israeli occupation, there can never be adequate “safeguards” to protect the human rights of Palestinians—especially when facial recognition software is already used worldwide to surveil communities and further criminalize them.4
And there’s nothing ethical about enabling a system to expand and improve its surveillance of an imprisoned people.
Microsoft’s response to reporters asking if AnyVision’s enabling of Israel’s occupation machine violates their own stated principles? Essentially: No comment.5
AnyVision is profiting from Israel’s violations of Palestinian human rights while it exports its repressive surveillance technology abroad. This isn’t just about standing in solidarity with the Palestinian people—who have been living under brutal occupation for decades. It’s also about taking a stand against surveillance technology increasingly being used to repress communities worldwide.
The same surveillance tech & practices used against one people are often used globally against other frontline communities—in some cases the intersections are explicit, as with the thousands of U.S. police officers trained by the Israeli military, police, and the Israel Security Agency, or Shin Bet.6
Many of us use Microsoft products at work and at home. As consumers, we have the ability & responsibility to influence Microsoft’s brand. If they want us all to buy that they’re an ethical leader in Artificial Intelligence, they must immediately ditch AnyVision.
Microsoft is betting on consumers not paying attention. But we are—and you should be too.
Let Microsoft know this can’t stand.
Sources:
1. "Microsoft Slammed For Investment In Israeli Facial Recognition 'Spying On Palestinians'," Forbes, Aug. 1, 2019
2. "Human Rights Groups Slam Microsoft for Investing in Israeli Face-recognition Company," Haaretz, Aug. 4, 2019
3. "Six Principles to Guide Microsoft's Facial Recognition Work," Microsoft, Dec. 17, 2018
4. "Don't Regulate Facial Recognition. Ban It.," BuzzFeed, July 18, 2019
5. "Microsoft-backed Facial Recognition Firm Rethinks Its Role in Hong Kong," Fast Company, Aug. 30, 2019
6. "Deadly Exchange Report Reveals Extent of Massive Training Programs Between U.S. Law Enforcement and Israeli Police, Military and the Shin Bet," Jewish Voice For Peace, Sept. 12, 2018