
“G” is for Genocide: Google and Amazon’s Project Nimbus

Unravelling Google and Amazon’s $1.2 billion cloud deal with the Israeli government

ILLUSTRATION: Emma Nash / The Peak

By: Sofia Chassomeris, News Writer

Artificial intelligence (AI) isn’t a sci-fi pipe dream anymore. For nearly 75 years, computer scientists have researched and developed programs that can mimic and predict human thinking, and we currently stand witness to this new technological frontier. AI technology has undeniably furthered progress in many areas such as medical efficacy, agricultural optimization, and cybersecurity. The recent advancement of “deep-learning” AI models is especially promoted as revolutionary, as these models can “process extremely large and varied sets of unstructured data and perform more than one task.”

As an emerging phenomenon, AI is only beginning to be regulated, with Canada introducing its Artificial Intelligence and Data Act in 2022. In 2024, the United Nations (UN) General Assembly also adopted a resolution calling for AI to be regulated in accordance with international human rights law. However, this resolution isn’t legally binding for member states and stakeholders; it is merely a recommendation “urged” by the UN. Perhaps it would be an appropriate measure if it could actually be enforced. Maybe then it wouldn’t feel like too little, too late. But money and power are twin harbingers of injustice. They are both the end and the means of all kinds of destruction, and AI is only the newest vehicle for it.

The Nimbus Project is a $1.2 billion deal signed in 2021 that has since provided the Israeli government and military with its own secure and private cloud computing infrastructure and advanced AI technology. The project was a joint venture between Google and Amazon, companies that ranked fourth and fifth, respectively, on Forbes’ list of the 10 largest global companies by total value in 2024. As written in an anonymous letter from Google and Amazon employees in coalition with the advocacy organization No Tech for Apartheid, “This technology allows for further surveillance of and unlawful data collection on Palestinians, and facilitates expansion of Israel’s illegal settlements on Palestinian land.”

Surveillance is not new for Palestinians, and it has only gotten worse. For years, Israel has tested and used AI facial recognition software to identify and track individuals, monitored their computers and phone calls, and deployed Pegasus spyware against political adversaries. “For the Israeli government, this surveillance regime is both a tool of control and a money-making business,” writes Jalal Abukhater in an Al Jazeera article. He likens the Gaza Strip and West Bank to a lab where Israel can trial spyware and surveillance technology before putting it on the global market.

The Nimbus Project not only provides the Israeli military with technology for its current projects, but also allows it to increase surveillance and control of Palestinians. AI systems like Lavender and The Gospel are used to recommend targets suspected of association with Hamas or other political groups, as well as to locate them within their homes. However, Lavender is known to make errors, and targets are not thoroughly verified. The risks posed by AI inaccuracy due to data bias make the use of these systems all the more dangerous. If the dataset used for training is unreliable and non-representative, the AI will make biased decisions; every step in the process, from data collection and labelling to how the model is ultimately deployed, will influence its output.

An article from +972 Magazine and Local Call, independent publications made up of Palestinian and Israeli journalists, states that “a fundamental difference between the two systems is in the definition of the target.” They explained that Lavender generates a “kill list” of individuals, while The Gospel marks entire buildings the Israeli military suspects militants are operating from. This designation has often resulted in the annihilation of residential areas and the civilians in them. Additional AI software, such as Where’s Daddy?, specifically tracks alleged militants to their homes, which are then bombed with the objective of killing the target along with their entire family. These AI systems only generate lists of potential targets, and the soldiers who oversee them show little to no concern for their accuracy. Misidentified targets and timing discrepancies between tracking and carrying out attacks have had catastrophic consequences for Palestinian civilians, who are often disregarded as “collateral” damage.

Software engineers, data scientists, and many other employees of tech giants Google and Amazon spoke publicly against the Nimbus Project for its complicity in the Palestinian genocide. As stated on the No Tech for Apartheid website, over a thousand employees agree that “technology should be used to bring people together, not enable apartheid, ethnic cleansing, and settler-colonialism.” These companies have made their positions clear. When protests began concerning the Nimbus Project, Google fired the employees involved and doubled down on the deal.

Is it terrifying that the CEO of a trillion-dollar company has such a severe deficit of moral integrity? Definitely. Is it surprising? Absolutely not. Greed for money and power only deepens systemic injustice, which is why making support for the genocide financially unsustainable is the only real way of eradicating it.

Organizations like the BDS movement, which calls for boycott, divestment, and sanctions against those who support Israel’s apartheid, proudly follow the lead of the South African anti-apartheid movement. The international effort to boycott and divest from companies that supported South Africa’s apartheid, along with sanctions from the country’s major trading partners and widespread strikes and protests, was crucial to ending that violent regime.

When corporations like Google or Amazon profit from enabling colonial violence, it’s imperative we refuse their products and services, seek alternatives, raise awareness, and continue pushing for corporate accountability. While it might seem impossible to untangle our lives from companies like Google, using alternatives like Ecosia or Dropbox, or open-source software such as LibreOffice, is a meaningful form of boycott. Perhaps most importantly, petitions and initiatives that call for the regulation of tech giants and for ethical practices in the use of their products and services could be our strongest option. If bloodshed cannot change their minds, the bottom line will.
