The Israeli military is aggressively disputing reports that it has relied on an artificial intelligence system for a targeted killing program that tolerates civilian deaths as acceptable collateral damage in its war against Hamas. Explosive allegations that Israel has a secret AI-powered killing machine called “Lavender” spread on Wednesday in a pair of news reports citing anonymous intelligence sources involved in the Hamas-Israel war.
The Israel Defense Forces said Wednesday evening that it does not use AI to designate people as targets for military strikes. “Contrary to claims, the IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist,” the IDF said in a statement posted to its website. “Information systems are merely tools for analysts in the target identification process.”
The statement came after +972 Magazine and The Guardian reported that Israel had allowed the AI system to steer human analysts' judgment as it rushed to strike back at Hamas soon after the Oct. 7 attack on Israel.
Full details on the use of AI in the Hamas-Israel war may not emerge anytime soon, but the IDF has acknowledged some of its AI capabilities on its website. In February, the online publication SpyTalk reported on the IDF website's mention of an AI targeting system dubbed "Gospel," which the publication claimed was proving lethal to Gaza civilians.
The developments come days after an Israeli airstrike in Gaza killed seven international aid workers, a strike Israeli officials said was a mistake. With three U.K. citizens among those killed, more than 600 British jurists, including three retired judges from the U.K. Supreme Court, called on the government Wednesday to suspend arms sales to Israel. Diplomatic friction is separately brewing between Poland and Israel over a Polish aid worker who was also killed in the strike.