A few weeks ago, I looked at the IDF’s Decisive Victory concept of operations. I assessed then that a large part of it was the targeting process eating tactics.
This is just the targeting process eating tactics, and any fight is about a lot more than targeting. It rests on a number of assumptions that are unlikely to be true: 1) the targeting data from intelligence sources is accurate; 2) the targeting data from intelligence sources is complete; and 3) high-level commanders in the rear surrounded by banks of monitors are the best informed. Certainly if all three of these assumptions are true, then the rest really is just in the hands of commanders given the authority to strike targets. But when are those three things ever true?
Now we’ve seen the concept in action, at least in part, and there are even more problems than the assumptions above, all of which proved false.
Last week, +972 Magazine, an Israeli publication, published an in-depth examination of the IDF’s targeting campaign based on sources inside the IDF. It’s a very long read, but chock-full of details on how the fires component of the Decisive Victory concept works.
One former intelligence officer explained that the Habsora system enables the army to run a “mass assassination factory,” in which the “emphasis is on quantity and not on quality.” A human eye “will go over the targets before each attack, but it need not spend a lot of time on them.” Since Israel estimates that there are approximately 30,000 Hamas members in Gaza, and they are all marked for death, the number of potential targets is enormous.
In the American targeting system, two processes (among others) happen simultaneously. One is technical fire direction: which system can hit the target, with the right munition, at the right time. Technical fire direction is the science part. The other is tactical fire direction: should this target be hit at all? It’s the art: does it make sense to hit this target? Can it be hit without collateral damage? Should we expend this ammunition based on our stocks, or hold it for a more valuable target? Does this support the scheme of maneuver? It’s a mix of tactics and strategy, driven by the front-line commanders with the best feel for the fight. It was also the majority of my job for about a decade as an artillery officer.
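To make the distinction concrete, here is a minimal sketch of the two gates in code. Everything in it (the class fields, the threshold, the checks) is my own invented illustration, not real fire-control software; the point is only that the second function encodes judgment, not physics.

```python
from dataclasses import dataclass

@dataclass
class Target:
    range_km: float          # distance from the firing unit
    value: str               # "high" or "low" payoff
    collateral_risk: bool    # civilians or protected structures nearby?

@dataclass
class WeaponSystem:
    max_range_km: float
    rounds_on_hand: int

def technical_fire_direction(tgt: Target, wpn: WeaponSystem) -> bool:
    """The science: can this system physically service the target?"""
    return tgt.range_km <= wpn.max_range_km and wpn.rounds_on_hand > 0

def tactical_fire_direction(tgt: Target, wpn: WeaponSystem,
                            supports_maneuver: bool) -> bool:
    """The art: should this target be hit at all?"""
    if tgt.collateral_risk:
        return False              # can it be hit without collateral damage?
    if tgt.value == "low" and wpn.rounds_on_hand < 50:
        return False              # hold scarce rounds for a better target
    return supports_maneuver      # does it serve the scheme of maneuver?

# A fire mission should have to pass both gates.
tgt = Target(range_km=12.0, value="low", collateral_risk=False)
gun = WeaponSystem(max_range_km=18.0, rounds_on_hand=40)
print(technical_fire_direction(tgt, gun))       # True: feasible to shoot
print(tactical_fire_direction(tgt, gun, True))  # False: low value, low stocks
```

In these terms, what the article describes is a system that automates the first gate at scale and lets the second collapse into a cursory human glance.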
The article clearly describes a system that has abandoned tactical fire direction and does only technical fire direction. Worse, it’s not run by the supported maneuver commanders or by the fire support specialists in the firing units who have the right situational awareness. It’s run by intelligence officers far from the front line, just as I feared based on the Decisive Victory concept.
Even more concerning is that they’ve largely taken human judgment out of the process in favor of an automated system of target nomination.
According to the investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”
According to the sources, the increasing use of AI-based systems like Habsora allows the army to carry out strikes on residential homes where a single Hamas member lives on a massive scale, even those who are junior Hamas operatives. Yet testimonies of Palestinians in Gaza suggest that since October 7, the army has also attacked many private residences where there was no known or apparent member of Hamas or any other militant group residing. Such strikes, sources confirmed to +972 and Local Call, can knowingly kill entire families in the process.
The point of Clausewitz’s description of war as a political phenomenon is not just that each side pursues political objectives; it is that war is a social phenomenon. It is about human interaction. It’s not just about attrition, but about communication. The IDF has built a system that fails to interact with the human dimension of the conflict, reducing it to a mechanical process of servicing targets. This is what Clausewitz called “fire combat,” which he believed could never be decisive. This is attrition-style warfare at its worst: assuming that destroyed targets add up to tactical and strategic advantage in a linear process of pure math. But war is always nonlinear, always about more than just numbers.
It’s not just ineffective. It’s not just destroying international support for Israel (an effect operating on the moral level of war, in John Boyd’s description); it’s also exactly what Hamas wants.
However, the source continued, “it didn’t work. As someone who has followed Hamas, I heard firsthand how much they did not care about the civilians and the buildings that were taken down. Sometimes the army found something in a high-rise building that was related to Hamas, but it was also possible to hit that specific target with more accurate weaponry. The bottom line is that they knocked down a high-rise for the sake of knocking down a high-rise.”
This campaign is designed on the same logic as strategic bombing: hurt the population, and they will pressure their leaders to end the war. It never works. It especially won’t work against a Palestinian population, about half of whom are children, held hostage by Hamas: an organization that, again, wants this to happen in order to strengthen its own cause.
The other issue here is precision-guided munitions. Israel is using them to make sure it hits what it’s trying to hit, but not taking advantage of their potential to minimize counterproductive civilian casualties. Precision isn’t accuracy: a precise weapon guarantees only that you hit the point you aimed at, not that the point you aimed at was the right one.
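A toy numerical illustration of that last sentence (the numbers and names are invented): a weapon can put every round within a few meters of its aim point and still accomplish nothing good if the aim point itself, the target nomination, is wrong.

```python
import math
import random

random.seed(0)

aim_point = (100.0, 100.0)      # where the PGM was told to go
right_target = (160.0, 100.0)   # where it should have been told to go

# A precise weapon: impacts cluster tightly around the aim point.
impacts = [(aim_point[0] + random.gauss(0, 2.0),
            aim_point[1] + random.gauss(0, 2.0)) for _ in range(10)]

dispersion = max(math.dist(p, aim_point) for p in impacts)
aiming_error = math.dist(aim_point, right_target)

print(f"precision (worst miss from aim point): {dispersion:.1f} m")
print(f"accuracy (aim point vs. right target): {aiming_error:.1f} m")
```

No improvement in the first number fixes the second; fixing the second is the job of the targeting process, which is exactly the part that has been automated away.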
It does not bode well for the IDF that they have removed tactics and strategy from the fight and replaced them with a machine. It only bodes well for Hamas. Hopefully we can learn from this that removing the human factor from warfare only leads to inhumanity.
The Habsora system (the use of AI to generate targets) is fucking terrifying.