
Bombing in the Gaza Strip: “Just tools for analysts in the target identification process”?

Photo: PATRICK BAZ/AFP

Artificial intelligence is said to have “saved a lot of time” in selecting possible targets in the Gaza war, an Israeli officer says of a new, previously unknown program. “I didn't have to do anything as a human being except put my stamp of approval on it.” Another explained that a statistics program was better than a grieving soldier, because “everyone here, including me, lost people on October 7th. The machine did it coldly, and that made it easier.”

The statements were first published by the Guardian. They come from six unnamed Israeli intelligence officers, all of whom say they used AI systems to identify Hamas targets. According to the Guardian, journalist Yuval Abraham published a corresponding report in +972 Magazine.

The AI system is called Lavender. It is used to process large amounts of data quickly and to identify potential human targets, primarily low-ranking Hamas fighters. Four of the six Israelis said that Lavender had at that point identified 37,000 Palestinians as having suspected links to Hamas or the Palestinian Islamic Jihad (PIJ).

“Give us more targets”

The system was developed by Unit 8200, Israel's elite state hacking unit. It accelerates what used to be a far more labor-intensive process: the selection of legitimate targets. “Give us more targets,” was the directive from the military leadership, the sources said. To meet that demand, they increasingly relied on Lavender. According to Unit 8200, the system has a hit rate of 90 percent. Lavender was used alongside another AI system called The Gospel, which suggests buildings and other structures as targets.

How Lavender was trained and how it works in detail is not clear from the statements. It is also unclear whether the informants have enough technical knowledge to correctly assess Lavender.

According to the sources, the use of Lavender at the beginning of the war amounted to a kind of blank check for a certain number of civilian casualties that Israel's military was prepared to accept. Two of the sources said they were allowed to kill between 15 and 20 civilians in airstrikes against lower-ranking militants. Such attacks were carried out with so-called “dumb bombs”, unguided munitions that destroyed entire houses and killed everyone inside.

Israel's military partially contradicts the statements

According to the report, one source said: “When it comes to low-ranking militants, you don't want to invest human resources or time.” They were therefore prepared to accept the error rate of the artificial intelligence, that is, “collateral damage and dying civilians”, “and to live with it.”

The Israeli army partially contradicted the statements. “The IDF does not use an AI system that identifies terrorists or tries to predict whether a person is a terrorist.” The systems used are “merely tools for analysts in the target identification process.” Lavender, it said, is not a system at all but merely a database for cross-referencing intelligence sources “to produce up-to-date information on military operations of terrorist organizations.” These, it added, are not confirmed operatives cleared for attack.

The use of “dumb bombs”, the army added, takes place within the framework of international law and in a manner that ensures “a high degree of precision.”

pbe