How AI-Based Military Intelligence Powered Israel's Attacks on Gaza

Saturday, January 4, 2025, 10:43 PM, from Slashdot
It's 'what some experts consider the most advanced military AI initiative ever to be deployed,' reports the Washington Post.

But the Israeli military's AI-powered intelligence practices are also 'under scrutiny. Genocide charges against Israel brought to The Hague by South Africa question whether crucial decisions about bombing targets in Gaza were made by software, an investigation that could hasten a global debate about the role of AI technology in warfare.'

After the brutal Oct. 7, 2023, attack by Hamas, the Israel Defense Forces deluged Gaza with bombs, drawing on a database painstakingly compiled through the years that detailed home addresses, tunnels and other infrastructure critical to the militant group. But then the target bank ran low. To maintain the war's breakneck pace, the IDF turned to an elaborate artificial intelligence tool called Habsora — or 'the Gospel' — which could quickly generate hundreds of additional targets.

The use of AI to rapidly refill IDF's target bank allowed the military to continue its campaign uninterrupted, according to two people familiar with the operation. It is an example of how the decade-long program to place advanced AI tools at the center of IDF's intelligence operations has contributed to the violence of Israel's 14-month war in Gaza...

People familiar with the IDF's practices, including soldiers who have served in the war, say Israel's military has significantly expanded the number of acceptable civilian casualties from historic norms. Some argue this shift is enabled by automation, which has made it easier to speedily generate large quantities of targets, including of low-level militants who participated in the Oct. 7 attacks.
In a statement to The Post, the IDF argued that 'If anything, these tools have minimized collateral damage and raised the accuracy of the human-led process.'

The IDF requires an officer to sign off on any recommendations from its 'big data processing' systems, according to an intelligence official who spoke on the condition of anonymity because Israel does not release division leaders' names. The Gospel and other AI tools do not make decisions autonomously, the person added... Recommendations that survive vetting by an intelligence analyst are placed in the target bank by a senior officer...

Another machine learning tool, called Lavender, uses a percentage score to predict how likely a Palestinian is to be a member of a militant group, allowing the IDF to quickly generate a large volume of potential human targets... The rule mandating two pieces of human-derived intelligence to validate a prediction from Lavender was dropped to one at the outset of the war, according to two people familiar with the efforts. In some cases in the Gaza division, soldiers who were poorly trained in using the technology attacked human targets without corroborating Lavender's predictions at all, the soldier said.
The article includes an ominous quote from Steven Feldstein, a senior fellow at the Carnegie Endowment who researches the use of AI in war. Feldstein acknowledges questions of accuracy, but also notes the accelerated speed of the systems and the ultimately higher death count. His conclusion?

'What's happening in Gaza is a forerunner of a broader shift in how war is being fought.'

Read more of this story at Slashdot.
https://tech.slashdot.org/story/25/01/04/2141224/how-ai-based-military-intelligence-powered-israels-...
