The world has been trapped in a never-ending cycle of war and violence since the beginning of time. If the ultimate goal is for everyone to live in peace and prosperity, then the endless episodes of conflict in the echo chamber of humanity have left countless lessons to be learned. More caustically, we are heading toward an irreversible battlefield, with a growing possibility of the most sinister form of warfare that mankind has ever experienced. This brings to mind Geoffrey Hinton, the Godfather of AI, who famously said that we are entering a moment when, for the first time, we may have things more intelligent than us.
The computerized, AI-driven genocide conducted by right-wing fundamentalists sitting in Tel Aviv did not happen by coincidence. This latest iteration of war has reversed the primary principle of post-Westphalian society: institutions are no longer respected, and human lives are no longer valued, signalling the potential collapse of any hope for a peaceful world order.
Moreover, documents recently divulged by The Washington Post leave no doubt about how Israel inflicted a massive bloodbath in Gaza remotely with the help of advanced artificial intelligence models provided by Google. This fiasco followed the company's signing of the $1.2 billion Project Nimbus contract with the government of Israel, a deal fiercely opposed by dozens of Google employees, who were subsequently fired.
“No tech for Israel’s apartheid; we won’t work for genocide”: these were the words written on a banner held by workers of the tech giant during a protest in New York City. Nevertheless, it appears that Google hastily supplied cloud computing services to Israel’s defense ministry, which subsequently authorized their use to target the entire population of Gaza indiscriminately.
Assuredly, it would not be wrong to say that pro-Israelis on social media repeatedly celebrated the destruction of Gaza. Israeli settlers mocked Palestinians over the water shortage in TikTok reels, reflecting a harsh and unethical reality. Nor should it be overlooked how Israeli battalions planted explosives to demolish residential buildings and shared the footage on social media, showcasing their inhumane actions against the Palestinians to the rest of the world. All of these excruciating events paint a clear picture of a thanatomanic society, one in which death and destruction are openly celebrated and dehumanization is ingrained in its very fabric, perhaps in its DNA.
Now let us break down how Israel has used an array of software systems to target schools, hospitals, apartments, and even refugee camps since the outbreak of the war in Gaza. These programs function in different ways, but their purpose is singular: the ethnic cleansing of a population. The people of Gaza and the West Bank are at the mercy of a merciless monster that monitors their every move through a vast web of mass surveillance.
AI-powered systems such as the Gospel, Lavender, and Where’s Daddy? are the deadliest technological weapons used by the IDF for military purposes during the war. These sophisticated AI-driven systems play a pivotal role in processing data to identify targets based on racial profiling. Lavender, which Tel Aviv claimed was designed to identify Hamas operatives, instead led to the unprecedented bombing of Palestinians en masse. The output of these AI programs, in the form of assassination lists, was treated as though it were an accurate human decision during military operations. In effect, the machine itself determined and recommended that these Palestinians be killed, labeling them as the worst enemies from a militaristic point of view.
The duo of Lavender and the Gospel are barbarous weapons of war, not far behind in the race of cyber-criminality. A fundamental difference between the two systems lies in the definition of the target: the Gospel marks buildings and structures that the army claims militants operate from, whereas Lavender marks people and puts them on a kill list. Applied to the general population, this algorithm wrongly flags civilians as suspects based on communication patterns and facial expressions that somehow resemble those of operatives. Meanwhile, the rest of the world has come to apprehend that burning Palestinians is not a flaw in the technology; it is a feature.
Furthermore, the Israeli military systematically bombarded the marked individuals while they were at home with their entire families, usually at night. The automated system dubbed ‘Where’s Daddy?’ was specifically used to track suspects and carry out bombings once they had entered their family residences. This highlights the extent to which such systems can become horrific instruments of destruction for innocent lives. The only question that remains is how justice will prevail when the blood of Palestinians was shed by an invisible robotic hand.
In conclusion, the emergence of advanced mechanisms of warfare has raised serious concerns about the future trajectory of human conflict, particularly as these systems bypass legal and moral accountability. Now more than ever, it is crucial to establish international standards and regulations that uphold human dignity. As we move into an era of rapid technological innovation, we must chart a course that ensures these advancements serve the global good rather than contributing to mass destruction. If the lessons of two world wars and their immense human cost are not enough, one must reflect on Einstein’s famous words: “I do not know what weapons World War III will be fought with, but World War IV will be fought with sticks and stones.” Yet if we fail to learn from history, even the sticks and stones may cease to exist.