WANA (Jul 01) – The Israeli regime uses artificial intelligence systems in operations that carry out assassinations based not on precise intelligence but on probability.


“Lavender” is an AI-based decision support system that the Israeli regime has used to identify and assassinate individuals in Gaza and possibly Iran.


The Lavender system employs AI algorithms to select assassination targets: it can analyse massive amounts of information on millions of individuals and, based on the connections between people and military organisations, flags individuals as targets.


This system collects the communication data of citizens and ordinary people, including phone calls, locations, internet usage, and even the frequency of SIM card changes.


After collecting this data, it maps individuals' social networks and the connections between them.


The key assumption of the system is this: “If someone is in contact with known members, such as Hamas members, behaves similarly to them, or has a pattern close to theirs, they are probably a Hamas member as well.”


For example, a person who frequently changes SIM cards, communicates only with a narrow circle of contacts, or lives close to suspected individuals is likely to be added to the assassination list as well.
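The behavioural heuristics described above can be illustrated with a minimal sketch. The feature names, weights, and threshold here are invented for illustration; none of them come from reporting on the actual system:

```python
# Hypothetical illustration of feature-based probability scoring.
# Feature names, weights, and the threshold are invented for this sketch.

def membership_score(person: dict) -> float:
    """Combine behavioural signals into a crude probability-like score."""
    weights = {
        "frequent_sim_changes": 0.4,    # changes SIM cards often
        "contacts_known_members": 0.35, # communicates with flagged individuals
        "lives_near_suspects": 0.25,    # residential proximity to suspects
    }
    score = sum(w for feature, w in weights.items() if person.get(feature))
    return min(score, 1.0)

THRESHOLD = 0.6  # invented cut-off

person = {
    "name": "example",
    "frequent_sim_changes": True,
    "contacts_known_members": True,
    "lives_near_suspects": False,
}

if membership_score(person) >= THRESHOLD:
    print("flagged as a probable target")  # score 0.75 exceeds the cut-off
```

The point of the sketch is that no single signal is decisive: a few weak behavioural correlations, added together, are enough to cross the threshold.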


Based on this analysis, Lavender generates a daily list of thousands of human targets. The system even prioritises these targets by threat level.


Finally, the list is passed on to air or drone units to carry out the strikes. According to various reports, on some days, Lavender has produced lists containing more than 10,000 people.
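The prioritisation step described above amounts to ranking flagged individuals by an assessed threat score before the list is handed off. A hypothetical sketch, with identifiers and scores invented for illustration:

```python
# Hypothetical ranking of flagged individuals by an assumed threat score.
flagged = [
    {"id": "A", "threat_score": 0.62},
    {"id": "B", "threat_score": 0.91},
    {"id": "C", "threat_score": 0.74},
]

# Highest assessed threat first.
priority_list = sorted(flagged, key=lambda p: p["threat_score"], reverse=True)
print([p["id"] for p in priority_list])  # ['B', 'C', 'A']
```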


Even Western sources criticise this terrorist system, stating that operators receiving the list have only 20 seconds to decide whether the person in question should be killed. The system's accuracy is said to be around 90%, but when applied to thousands of people, a 10% error rate means hundreds of civilian deaths.
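The arithmetic behind that claim is straightforward; the snippet below simply restates the figures already given in the reporting:

```python
# Restating the article's figures: a 90%-accurate system applied to a
# daily list of 10,000 people still misidentifies about 1,000 of them.
daily_targets = 10_000
accuracy = 0.90
expected_misidentified = daily_targets * (1 - accuracy)
print(f"expected misidentifications per day: {expected_misidentified:.0f}")
```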


AI systems like Lavender, by analysing big data, assign each person a probability of membership or a threat coefficient, and based on that, identify them as a target. In other words, assassination operations are carried out purely based on probability.


According to consecutive investigative reports from various media outlets, the system was developed by Unit 8200, Israel's signals-intelligence unit (a branch of the Israeli military similar to the US NSA).


It is worth noting that in 2023, the Israeli regime launched an operation called "Iron Swords" (Swords of Iron). The goal of this operation was to identify and assassinate members of the Islamic Resistance Movement (Hamas) and Palestinian Islamic Jihad, for which the regime used the Lavender system to identify individuals.