The defense industry loves a good ghost story about "surgical" strikes. They sell a narrative where high-altitude hardware can thread a needle from space, hitting a specific chair in a specific room while leaving the coffee on the table undisturbed. Then Minab happens. Then a classroom becomes a target. And the industry immediately retreats into the same tired script: it was a technical glitch, a "one-in-a-million" fluke, or a failure of human intelligence.
They are lying. Or worse, they actually believe their own marketing.
The tragedy in Minab isn't an outlier. It is the inevitable logical conclusion of our obsession with mathematical certainty in a chaotic physical world. We have reached a point of diminishing returns where adding more sensors and more processing power doesn't make a weapon safer—it just makes its failures more unpredictable.
The Fallacy of the Zero Error Margin
Every missile has a Circular Error Probable (CEP). This is the radius of a circle, centered on the aim point, within which 50% of the rounds are expected to land. When a salesperson pitches a "precise" system, they show you the 50%. They never talk about the other 50%.
In the world of munitions, precision is a statistical distribution, not a promise. When you hear that a missile has a "one-meter accuracy," that does not mean it will always hit within one meter. It means that under controlled test conditions, half of them did. The public interprets "precision" as a binary—it either hits or it misses. In reality, precision is a bell curve with very long, very ugly tails.
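To make those tails concrete, here is a minimal Monte Carlo sketch. It assumes the textbook model where impact error is a circular Gaussian centered on the aim point; the 1.1774 conversion factor and the printed percentages follow from that idealized model, and real-world tails are usually fatter, not thinner.

```python
import math
import random

# Sketch under the textbook assumption: circular Gaussian impact error.
CEP = 1.0                  # advertised "one-meter" CEP, in meters
SIGMA = CEP / 1.1774       # Gaussian scale that puts 50% of shots inside the CEP

def miss_distance() -> float:
    """Draw one radial miss distance (meters) from the circular Gaussian model."""
    dx = random.gauss(0.0, SIGMA)
    dy = random.gauss(0.0, SIGMA)
    return math.hypot(dx, dy)

shots = [miss_distance() for _ in range(1_000_000)]
for multiple in (1, 2, 3):
    frac = sum(r > multiple * CEP for r in shots) / len(shots)
    print(f"miss beyond {multiple} x CEP: {frac:.2%}")

# Typical output: ~50% beyond 1x CEP, ~6% beyond 2x CEP, ~0.2% beyond 3x CEP.
# Half the shots land outside the advertised radius by definition, and that is
# before jamming, bad coordinates, or hardware faults widen the tails further.
```

Run it a few times and the percentages barely move: the tail is not a fluke, it is a property of the distribution.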
In Minab, the world saw what happens when a strike lands on the tail of that curve.
The Software Overload Problem
Modern munitions are flying computers. They run millions of lines of code. They rely on GPS, inertial navigation systems (INS), and often semi-active laser or infrared seekers.
I have seen engineering teams spend three years trying to squash a bug that only appears when a specific humidity level interacts with a specific vibration frequency during the terminal phase of flight. You cannot test for every permutation of reality.
When you "over-engineer" for precision, you create a brittle system. A simple 1960s gravity bomb is predictable; it goes where gravity and momentum take it. A "smart" missile is a black box. If the onboard Kalman filter—the mathematical algorithm used to estimate the state of a moving system—receives one piece of corrupted data from a jammed GPS satellite, the "precise" missile might decide the classroom next door is actually the intended coordinates.
The more complex the system, the more ways it can fail. We are trading "dumb" misses for "smart" catastrophes.
Intelligence Is Not Data
The biggest lie in the defense space is that better data equals better outcomes.
The Minab incident is frequently blamed on "bad intel." This is a convenient scapegoat because it shifts the blame from the expensive hardware to the fallible humans on the ground. But the problem isn't the quality of the data; it’s the speed of the kill chain.
We have "compressed" the time between identifying a target and pulling the trigger. We call this efficiency. In reality, we’ve just removed the time required for skepticism. When a sensor identifies a "high-value target," the system is designed to facilitate the strike as fast as possible.
We have automated the process of being wrong.
- Data is a set of coordinates.
- Intelligence is knowing that those coordinates belong to a school during operating hours.
When we prioritize the "precision" of the kinetic strike over the "accuracy" of the context, we get Minab. Every single time.
The "Collateral Damage" Euphemism
Stop using the term "collateral damage." It’s a linguistic shield used to sanitize the fact that the math failed.
If a missile is marketed as being able to hit a specific window, then hitting the building next door isn't collateral damage—it is a total system failure. The industry wants to have it both ways: they want the prestige of "unmatched precision" when things go right, and the "unpredictability of war" excuse when things go wrong.
If a self-driving car kills a pedestrian, we don't call it "collateral transport." We call it a fatality and investigate the sensors. Why does the defense industry get a pass?
The Cost of the "Clean War" Narrative
The obsession with precision has created a dangerous political illusion: the "Clean War."
Because we believe our weapons are precise, leaders are more willing to use them. If you think you can take out a target with zero side effects, the barrier to entry for kinetic action drops to near zero.
This is the "Precision Paradox." The more precise we think our weapons are, the more frequently we use them, and therefore, the more "accidental" tragedies like Minab we guarantee. If we admitted that our weapons were blunt instruments, we would be much more hesitant to swing them in crowded areas.
The High Price of the Wrong Solution
We are spending billions to shrink the CEP from three meters to one meter.
Why?
Does a one-meter CEP actually change the strategic outcome of a conflict? Rarely. But it does provide a massive profit margin for contractors who can claim they’ve solved the "unsolvable" problem of accuracy.
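A hedged back-of-the-envelope model makes the point. The Rayleigh hit-probability formula is the standard CEP model; the 10-meter target radius and the 5% chance that the designated coordinates are simply wrong are illustrative assumptions, not data from any program.

```python
# Illustrative model only: how much does a tighter CEP matter when the
# aim point itself can be wrong?
def p_within(radius_m: float, cep_m: float) -> float:
    """Rayleigh-model probability of landing within radius_m of the aim point."""
    return 1.0 - 2.0 ** (-(radius_m / cep_m) ** 2)

TARGET_RADIUS = 10.0   # assumed building-sized target
P_BAD_COORDS = 0.05    # assumed 5% chance the designated coordinates are wrong

for cep in (3.0, 1.0):
    hit_intended = (1 - P_BAD_COORDS) * p_within(TARGET_RADIUS, cep)
    hit_wrong_building = P_BAD_COORDS * p_within(TARGET_RADIUS, cep)
    print(f"CEP {cep} m: intended target {hit_intended:.2%}, "
          f"wrong building, dead center {hit_wrong_building:.2%}")

# Both CEPs clear 94.9% against the intended target; the difference shows up in
# the fourth decimal place. The assumed 5% coordinate-error term dwarfs it, and
# a tighter weapon is exactly as precise when it is aimed at the wrong place.
```

Under these assumptions, the billions buy a rounding error.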
I have watched programs burn through nine-figure budgets trying to account for "windage" in the final three seconds of flight, while the actual problem was that the person designating the target was looking at the wrong map. We are solving the hard math problems because they are profitable, while ignoring the hard human problems because they are "messy."
Stop Trusting the Screen
If you are a decision-maker, a commander, or an analyst, you need to internalize one truth: The crosshairs on your screen are a suggestion, not a fact.
The "precision" you are being sold is a laboratory result. It does not account for electronic warfare, signal multipath, sensor degradation, or the simple fact that hardware breaks in the heat.
The Minab classroom wasn't a "mistake" by the missile. The missile did exactly what its corrupted logic told it to do. The mistake was believing that a piece of silicon and a thermal seeker could ever be "precise" enough to justify the risk of firing into a civilian center.
If you want to avoid another Minab, stop trying to make the missiles smarter. Start making the people who use them more cynical.
Dismantle the pedestal we've put "precision" on. It’s not a shield; it’s a blindfold.
Check the coordinates again. Then check who told you they were correct. Then assume the missile is going to miss by fifty meters and ask yourself if you’re still willing to pull the trigger.
If the answer is no, then the "precision" didn't matter in the first place.