What Happens When AI Automates Killing 

Part 1 of 3 on AI and Societal Impacts 

The Dream

All nightmares begin as dreams. Imagine it’s the mid-1950s, and a group of researchers sets out to create a new kind of intelligence, one based on computer logic. At the 1956 Dartmouth Conference, the computer scientist John McCarthy coined the term ‘artificial intelligence’, or AI for short. Artificial intelligence is, in some ways, an unfortunate name, because this form of intelligence is as human as anything else our species has created. There is nothing artificial about it. The outcomes of that original conference have been nothing short of astounding. 

ChatGPT (Chat) was released by OpenAI on November 30, 2022. It is based on what are known as large language models, or LLMs. These models are, in turn, trained on the natural language that we use every day: the sentences, logic and reasoning of ordinary speech and writing. Generative AI refers to the fact that this form of AI can produce new text, answers and ideas without step-by-step human instruction. It’s like having a smart neighbor next door who comes up with better ideas or data than you. Instead of ‘garbage in, garbage out’, as old computer models dictated, it’s garbage in, and better, more useful information out. 

Aerial view of Microsoft’s new AI datacenter campus in Mt Pleasant, Wisconsin.

Chat, like all new technologies, amplifies our abilities to impact the world. It can process information, handle complex questions, make predictions and structure results. It’s not just a ‘tool’ like a hammer, for example, but a special tool with cognitive capabilities and complex feedback. 

Query On ‘Evil’

A few days ago, with the help of a friend, Stephan, I queried Chat to help me understand evil, and how Chat understood the concept and applied it to itself. My queries were:

Chat immediately took the query, thinking for a moment. Then, it began to carry on a conversation with me which stated, in an abbreviated form, the following:

Lavender or The Gospel AI

Israel’s war against Palestinians in Gaza is the first war in human history to use AI to kill masses of people, 80% of whom are civilians, including women and children. Israel’s AI systems, code-named Lavender and The Gospel, are structured for the rapid generation of targets and to promote the execution of anyone or anything identified. They are based on machine-learning algorithms, data probabilities and associations designed to destroy buildings or people. They are fundamentally different from Chat, but AI nonetheless. 

What makes them fall under the rubric of AI is that they can process large amounts of data from multiple sources, ‘reason’ over probabilities and almost instantaneously draw conclusions based on inference. They were created for war.

Palestinians transport the wounded and try to put out a fire after an Israeli airstrike on a house in the Shaboura refugee camp in the city of Rafah, southern Gaza Strip, November 17, 2023. (Abed Rahim Khatib/Flash90)

As ‘tools’, they are highly appealing for three reasons: a perceived lack of moral constraints, since decisions are based on data and technology; a low coefficient of resistance, since they enable action without friction; and outcomes that somehow belong to a third party, so nothing is personal and no moral consciousness is required.   

Lavender and The Gospel incorporate three components, which are interrelated as part of a system:

In one situation, for example, the man identified for killing walked out the back door before the strike arrived. The Israeli Defense Forces struck the identified location anyway, killing all 12 members of his family who lived there, while the suspect himself escaped. 

The Israeli Defense Forces (IDF), in order to make its AI effective, has loosened its rules of engagement and acceptable levels of collateral damage. For strikes on senior commanders, for example, it has allowed triple-digit civilian deaths per military target, and this assumes that the military targets are accurate in the first place. It has also led to a proliferation of non-military targets. 

The Israeli military has invented the category of “power targets”: targets with no military value that are designed to shock Palestinians and pressure them to turn against Hamas. Leveling high-rise buildings, shopping centers, hospitals, and schools can qualify as power targets. If children are part of the kill, that’s just too bad. According to Israeli military sources, all of this is known beforehand, including the members of a household.  

Chat Judging Lavender & The Gospel

According to my inquiry on evil with Chat, both Lavender and The Gospel are evil. They are geared for destruction and have participated in the killing of over 75,000 people, including 19,000 children, with thousands more under rubble. There is no kernel of truth in the claim that these killings were based on defense. The argument that Hamas was everywhere and anywhere, necessitating the killing of all civilians, including kids shot at close range, is beyond the rationality of Chat or basic human morality. 

Members of the IDF and the Israeli government should be charged with genocide or, minimally, crimes against humanity. American firms supporting AI in Israel should be charged as well: Palantir, Google, Microsoft and a little-known but powerful company, Ondas Holdings of West Palm Beach, Florida. Palantir, in spite of its support for genocide in Israel, supplies New York City hospitals with software to review patients’ records; it is currently the subject of protests calling for its removal. 

Smoke billows after an Israeli strike on north Gaza on November 22, 2023. Israel says it is using artificial intelligence to find targets. JOHN MACDOUGALL / AFP via Getty Images

Chat’s determination is similar to that of human groups working for justice. Israel is evil in its activities in Gaza against Palestinians, as defined by AI. It is also busy exporting evil to the U.S., like the Department of War’s contract to purchase $210 million of advanced cluster shells from Israel. Israel dropped cluster bombs in Lebanon in violation of UN law. As of 2025, 120 nations have signed a treaty that bans the production, transfer and use of cluster bombs. Cluster munitions remain live in fields for years, wounding or killing people long after conflicts end. 

Call and/or email any of these organizations and tell them to stop supporting genocide in Gaza. The more calls, the better.  
