A new report from +972 Magazine and Local Call, two Israeli news outlets, revealed that since October 7th, Israeli forces have been using a new AI tool called ‘Lavender’ to mark tens of thousands of Palestinians in Gaza as targets for assassination.

The outlets interviewed Israeli intelligence officers who spoke on condition of anonymity. “According to six Israeli intelligence officers, who have all served in the army during the current war on the Gaza Strip and had first-hand involvement with the use of AI to generate targets for assassination, Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war. In fact, according to the sources, its influence on the military’s operations was such that they essentially treated the outputs of the AI machine ‘as if it were a human decision.'”

“Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets. The sources told +972 and Local Call that, during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants — and their homes — for possible air strikes.

“During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based. One source stated that human personnel often served only as a “rubber stamp” for the machine’s decisions, adding that, normally, they would personally devote only about “20 seconds” to each target before authorizing a bombing — just to make sure the Lavender-marked target is male. This was despite knowing that the system makes what are regarded as “errors” in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.

“Moreover, the Israeli army systematically attacked the targeted individuals while they were in their homes — usually at night while their whole families were present — rather than during the course of military activity. According to the sources, this was because, from what they regarded as an intelligence standpoint, it was easier to locate the individuals in their private houses. Additional automated systems, including one called “Where’s Daddy?” also revealed here for the first time, were used specifically to track the targeted individuals and carry out bombings when they had entered their family’s residences.”

The report, compiled by journalist Yuval Abraham, details the Israeli military’s process of ‘artificial target generation’ in six parts. First, the Lavender machine itself, which used AI to mark tens of thousands of Palestinians. Second, the “Where’s Daddy?” system, which tracked these targets and signaled to the army when they entered their family homes. Third, how “dumb” bombs were chosen to strike these homes. Fourth, the Israeli military’s decision to loosen the permitted number of civilians who could be killed during the bombing of a target. Fifth, a description of how automated software inaccurately calculated the number of non-combatants in each household. And sixth, an account of how, on several occasions, when a home was struck, usually at night, the targeted individual was not inside at all, because military officers did not verify the information in real time.
