War by algorithm raises new moral dangers

One of our worst nightmares about artificial intelligence is that it will enable killer robots to stalk the battlefield, dispensing algorithmically determined death and destruction. But the real world is a lot messier than the comic books. As Israel’s bombardment of Gaza shows, we may be moving towards more invisible and insidious forms of automated decision-making in warfare.

A chilling report published last week by the Israeli online magazine +972 highlighted the heavy reliance of the Israel Defence Forces early in the war on an AI-enabled mass target-generation system known as Lavender, which flagged 37,000 Gazans as suspected Hamas militants. As a result, many were bombed in their homes, often killing their families too.

Disaffected Israeli intelligence sources, interviewed by +972, said the system had an estimated error rate of 10 per cent, wrongly identifying some targets for assassination. They also alleged that the IDF would permit a strike on a junior Hamas militant even at the risk of killing 15 to 20 civilians; that ratio rose to more than 100 civilians for a senior Hamas commander. Human oversight of the automated target-identification process was minimal, sometimes lasting no more than 20 seconds, they claimed. “In practice, the principle of proportionality did not exist,” one told the magazine.

The IDF has contested several aspects of this report. “Contrary to Hamas, the IDF is committed to international laws and acts accordingly,” the IDF said in a statement. The “system” was a database that human analysts used to verify identified targets. “For each target, IDF procedures require conducting an individual assessment of the anticipated military advantage and collateral damage expected.”

What is indisputable is the horrifying loss of civilian life in Gaza in the first weeks of the war following the murderous Hamas attack on Israel on October 7. According to Palestinian authorities, 14,800 people, including about 6,000 children and 4,000 women, were killed in Gaza before the temporary ceasefire of November 24, the period when Lavender was most heavily used.

Many aspects of the Israeli-Gazan tragedy are unique, born of the region’s tangled history, demography and geography. But Israel is also one of the world’s most technologically advanced nations and the way it wages war feeds the global debate about the military uses of AI. That debate pits so-called realists against moral absolutists.

Realists argue that AI is a dual-use technology that can be deployed in myriad ways for both good and bad. Few, for example, would contest its use in defensive weapons, such as Israel’s Iron Dome, which has intercepted multiple rocket attacks from Gaza. But the Lavender system appears to have contributed to a “dramatically excessive” rate of collateral damage that is morally indefensible, says Tom Simpson, a former Royal Marine who is now a philosophy professor at the University of Oxford.

Yet Simpson opposes an outright ban on lethal autonomous weapons systems (LAWS), as some campaigners are demanding, given the likelihood that less law-abiding hostile countries would still use them. Implementing any such ban would create “an appalling strategic vulnerability”. “If liberal democracies are worth defending they should have the most effective tools,” he tells me.

Robots, like soldiers, are often instructed to do the dull, dirty and dangerous work. “If we can spend treasure to save blood we should do it.”

But the moral absolutists draw a clear red line at outsourcing any life-and-death responsibilities to error-prone machines. The experience of the Lavender system shows the tendency to over-trust the computer. It also highlights the difficulty of keeping “humans in the loop” in the heat of war. 

Mary Ellen O’Connell, a law professor at the University of Notre Dame who supports a ban on LAWS, says that realists tend to favour the projection of power over the protection of the rule of law. But, she tells me, for western democracies effective power rests on the rule of law. “The US has only won one war since World War II and that was the liberation of Kuwait. That was the only war we have fought lawfully in full compliance with the United Nations charter,” she says.

That is also a realist argument that should resonate in Israel as much as in the US. Soldiers should believe they are fighting for a just cause in a moral way. Those who spoke to +972 magazine clearly question the way Israel has fought this war. Humanity must lead the technology, rather than the other way around.

john.thornhill@ft.com
