Why human agency is still central to Israel’s AI-powered warfare

Following +972’s 'Lavender' exposé, international law and AI experts explain how Israel's top brass and global tech firms are implicated in the slaughter.

Israeli soldiers from the 8717 Battalion of the Givati Brigade operating in Beit Lahia, in the northern Gaza Strip, December 28, 2023. (Yonatan Sindel/Flash90)

The destruction that Israel has wreaked upon Gaza evokes an analog era of warfare. Craters swallow residential complexes, whole streets have been reduced to rubble, and clouds of dust block out the sun. Israel’s military has dropped more explosives on the 141-square-mile enclave than were contained within the atomic bombs that decimated Hiroshima and Nagasaki in World War II. The scale and density of the destruction rival the most devastating episodes of urban warfare in recent history, from the London Blitz to decades of counterinsurgency in Vietnam.

Yet in contrast to those 20th-century wars, Israel’s assault on Gaza is a fundamentally high-tech killing campaign. Earlier this month, investigative reporting in +972 and Local Call revealed AI’s pivotal role in the bloodshed. According to six Israeli intelligence officers, the military used an AI-based system nicknamed “Lavender” to generate tens of thousands of “human targets” for assassination, on the grounds that they were allegedly members of the armed wings of Hamas or Palestinian Islamic Jihad. These outputs were then fed into an automated tracking system known as “Where’s Daddy?”, enabling the army to kill each target inside their home, along with their whole family and often many of their neighbors.

These revelations follow an earlier investigation by +972 and Local Call, which shed light on another AI target-generating system known as “Habsora” (“The Gospel”). Whereas Lavender generates human targets, Habsora marks buildings and structures that allegedly serve a military function. One former intelligence officer told +972 that this technology enables the Israeli army to essentially operate a “mass assassination factory.”

The latest investigation made waves in the international press, where commentators conjured scenes of AI-powered weapons systems slipping beyond the control of their human operators and killing at will. But experts in international law and AI warfare underlined to +972 that the carnage in Gaza is the result of concerted human decisions. And alongside the top brass of Israel’s military and political establishments, entire swaths of the global civilian technology sector may be implicated in the slaughter.

Rapid generation, rapid authorization

With the daily death rate in Gaza higher than in any other 21st-century war, it appears that commitments to minimizing civilian casualties in targeted assassinations, insofar as they ever existed, simply went out the window. According to the sources, Israeli military officials significantly lowered the criteria used to determine which targets could be killed in their homes, while raising the threshold of civilian casualties permitted in each strike — in some cases authorizing the killing of hundreds of civilians in order to kill a single senior military target. The emphasis, as IDF Spokesperson Daniel Hagari put it in the early days of the war, was “on what causes maximum damage.”

An Israeli airstrike in Al-Bureij refugee camp in central Gaza, January 2, 2024. (Oren Ben Hakoon/Flash90)

To be clear, Israel is not relying on fully autonomous weapons in the current war on Gaza; rather, intelligence units use AI-powered targeting systems to rank civilians and civilian infrastructure according to their likelihood of being affiliated with militant organizations. This rapidly accelerates and expands the process by which the army chooses whom to kill, generating more targets in one day than human personnel can produce in an entire year.

With rapid target generation comes the need for rapid authorization: intelligence officers who spoke to +972 admitted to devoting a mere 20 seconds to sign off on individual strikes, despite knowing that Lavender misidentifies targets — even by its own lax criteria — in approximately 10 percent of cases. Many took to simply ensuring the person they were about to kill was a man, turning most of Gaza into a death trap.

“What struck me from the [+972] report is the degree of autonomy and reliability that the armed forces gave this technology,” Alonso Gurmendi Dunkelberg, a lecturer in international relations at King’s College London, told +972. “It allows the army to coldly sign off on the systematic targeting of a civilian population.”

Ben Saul, an international law professor and UN Special Rapporteur on Human Rights and Counterterrorism, said that overreliance on these systems lends a veneer of rationality to the devastation that Israel has wrought in Gaza. So-called “smart systems” may determine the target, but the bombing is carried out with unguided and imprecise “dumb” munitions, because the army doesn’t want to use expensive bombs on what one intelligence officer described as “garbage targets.”

“Israel has military lawyers; it has a military justice system; it has operating procedures and rules of engagement which are supposed to help it comply with international human rights,” Saul said. “But this [war] is [operating] far from basic humanitarian rules.”

Palestinians bid farewell to their relatives killed in Israeli airstrikes, at Al-Najjar Hospital in the city of Rafah, southern Gaza Strip, April 21, 2024. (Abed Rahim Khatib/Flash90)

The UN, human rights groups, and scores of governments have warned that Israel continuously breaches international human rights law as well as core provisions of the Geneva and Hague Conventions, to which it is a signatory. Each of these treaties prohibits the systematic and deliberate killing of civilians. But legal scholars say these high-tech systems have abetted a systemic disregard for international law over the last six and a half months of war, during which Israel has killed more than 34,000 Palestinians and wounded over 76,000, while as many as 11,000 more remain unaccounted for.

Turning Palestinians into numbers

That these machines are operated and exploited by actual people has severe implications for Israeli military officials. Lavender and Where’s Daddy? may be billed as AI-powered systems, but even Israeli military heads say they are not acting autonomously: a concerted chain of command dictates how these technologies are put into action. As Zach Campbell, a senior surveillance researcher at Human Rights Watch, told +972, “Yes, this technology is problematic, but it’s also about how these systems are being used. And those are human decisions.” 

Israeli government officials made their intentions clear after the horrific events of October 7. In the early days of the war, Israeli President Isaac Herzog proclaimed there were “no innocent civilians in Gaza,” and cabinet ministers declared that the war was the start of another “Nakba.” Other politicians called for the entire Strip to be “flattened.” Two-thousand-pound bombs blasted out entire neighborhoods, bulldozers leveled schools and hospitals, and whole swaths of the Strip were deemed “kill zones.” These commands mapped onto efforts, years in the making, to transform the Israeli army into what sociologist Yagil Levy recently called “a death-generating army.”

“The problem isn’t with the AI,” Brian Merchant, a tech reporter who investigates the unmitigated development of AI systems, echoed to +972. “The problem is what the AI lets militaries do. It provides a rationale to be more violent, to be more careless, to assert an agenda they already had or are looking for a pretext to justify.”

Mona Shtaya, a non-resident fellow at The Tahrir Institute for Middle East Policy, said this has long been the case when it comes to Israeli military strategy vis-à-vis Palestinians; Lavender is just the most recent in a long line of algorithmically powered weapons in Israel’s arsenal.

Members of the Palestinian Civil Defense search for dead bodies under the rubble after an Israeli airstrike, in the city of Rafah, southern Gaza Strip, April 21, 2024. (Abed Rahim Khatib/Flash90)

For example, predictive policing algorithms and facial recognition systems cull through troves of data pulled from numerous sources, including social media, cell phone data, and drone footage. Like Lavender, these systems use the data to assign Palestinians a security rating. That rating can then determine anything from who should be detained at a checkpoint in Hebron to who gets arrested outside Al-Aqsa Mosque or killed in a drone strike in Gaza.

“These systems turn Palestinians into numbers,” Shtaya told +972. “They allow the authorities to rate us, to dehumanize us, to not think about the fact that we are people, but to justify our death based on a statistic. It’s why we’ve seen violence increase since Israel started relying on these systems.”

In Shtaya’s view, AI-powered targeting systems are the natural outcome of Israel’s unrestrained investment in mass surveillance. “It’s the cycle of tech development in Palestine. Each system is more dangerous.” 

An algorithmic supply chain

The abuse of AI may be rooted in military policies, but it also implicates broad swaths of the civilian technology industry.

AI-powered targeting systems rely on troves of surveillance data extracted and analyzed by private start-ups, global technology conglomerates, and military technicians. Tech workers in Silicon Valley office complexes engineer the Google Photos databases Israeli troops use to detain civilians fleeing aerial bombardment. Content moderation algorithms determined by Meta’s corporate leadership in New York help predictive policing systems sort civilians according to their likelihood of joining militant groups. Security firms headquartered in Petah Tikva transfer the contents of mobile phones to military technicians building assassination lists.

Israel’s reliance on civilian technology products to carry out its lethal operations stands at odds with many of the policies and terms of use issued by the companies it collaborates with. Last month, the New York Times revealed that the Israeli army is using Google Photos to identify and sort civilians across the Gaza Strip. Cheyne Anderson, a Google software engineer and member of No Tech for Apartheid, a coalition of tech workers opposed to contracts with Israel’s military, told +972 that this is a serious misuse of Google’s technology.

Google and Amazon workers protest against their companies’ collaboration with the Israeli military at the annual Amazon Web Services summit in New York, July 26, 2023. (X/No Tech For Apartheid)

“These systems aren’t engineered for life-or-death use on Middle East battlefields; they are trained on family photographs,” Anderson explained. “Bringing something like this to a warzone … It goes directly against our privacy policies and our use policies.” Indeed, Google’s privacy policies state that users must offer “explicit consent to share any sensitive personal information” with third parties. And under its Dangerous and Illegal Activities protocols, Google warns that Google Photos cannot be used “to promote activities, goods, services, or information that cause serious and immediate harm to people.”

Despite obvious breaches of their established policies, Google and other tech conglomerates have not prevented the Israeli army from using their products in the current war on Gaza or throughout the decades of Israel’s military rule over the occupied Palestinian territories. Many of these private companies profit from the exchange, as Palestinian civilians, denied recourse to basic privacy protections, offer up an unlimited supply of data with which surveillance firms can refine their products. “These companies are part of a vast algorithmic supply chain central to warfare today,” Matt Mahmoudi, a researcher at Amnesty International, told +972. “Yet they’ve failed to speak up.”

As the list of Israeli abuses in Gaza grows, these companies may be legally implicated in Israel’s systemic violations of international law. “It’s a cautionary tale for any company,” Mahmoudi said. “Not only are they violating international human rights law, not only are they risking reputational damage, but they are risking being held guilty of aiding and abetting something that will surely be classified as a serious crime in due time.”

Charges of war crimes have not stopped Israeli military officials from promising that all the bloodshed will yield unprecedented advancements in AI-powered warfare. Speaking at Tel Aviv University’s Annual AI Day in February, Brigadier General Yael Grossman, commander of the Lotem unit, told a crowd of civilian and military technology industry leaders that the army is continuing to roll out cutting-edge systems. “Friction creates data,” she said. “It’s allowing us to grow much faster and be more scalable with the different solutions we provide to the battlefield.”

Such slogans have historically rallied Western governments and technology conglomerates around Israeli military prowess. But today, the tide may be turning. Western governments have begun to consider withholding arms sales, and workers at Google and other major technology conglomerates are revolting against their employers’ contracts with the Israeli military. Amid Israel’s disregard for international regulations, Shtaya said this sea change may be the only hope of reining in emerging weapons systems.

“What’s going on in Palestine is not limited to [the Israeli military],” Shtaya explained. “The abuse of these systems is a global issue.”