Aside from the “war is the continuation of politics by other means” idea, the most famous concept from On War is friction. The basic idea is simple: things go wrong. In other words, never depend on everything going according to plan. You'd think people would know better, but...
Luttwak has a nice, everyday example in Strategy: The Logic of War and Peace. Suppose you and your neighbors decide on the spur of the moment to have a beach party. The beach is only a half hour's drive away. Factor in another half hour for packing the car with blankets, food, sunscreen, and other necessities, and you can all meet at the beach in an hour, right?
I think we all know how this story turns out. One family is low on gas, so they’re delayed fifteen minutes. Another family is new to the neighborhood, so when they take a wrong turn, it takes them 45 minutes to get back on course. The family responsible for bringing the cooler can’t find it in the garage, so they have to stop to buy a new one. The party really doesn’t start until an hour or two after the planned time, despite everyone’s best intentions.
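The beach party lends itself to a back-of-the-envelope calculation. If every preparatory step has to go right for the party to start on time, the odds of a punctual start shrink geometrically with the number of steps. Here is a minimal sketch in Python; the probabilities and step counts are invented purely for illustration:

```python
# Hypothetical friction model: three families, each with three independent
# preparatory steps (fuel, directions, packing). All numbers are invented.
p_on_time = 0.9   # assumed chance any single step goes as planned
steps = 3 * 3     # three families x three steps each

p_all_on_time = p_on_time ** steps
print(f"Chance the party starts on schedule: {p_all_on_time:.0%}")
```

Even with a 90% chance of success per step, the party starts on time only about four times in ten, which is roughly how the story plays out.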
The same principle applies in war, only more intensely, since there's an enemy working hand in hand with fate to undo your plans. In our beach party example, no one is actively trying to prevent the families from reaching the beach. No one has tampered with the street signs, let air out of their tires, or blocked the streets with debris. Our hypothetical beach party is like an engineering problem—just one that isn't very well project-managed. Throw in a surly neighbor doing everything he can to stop the party from happening, and the number of things that can go wrong skyrockets. (Maybe that's another good reason to be nice to your neighbors.)
In war, friction happens all the time. In 1862, someone in the Confederate army wrapped a copy of the marching orders around some cigars, then accidentally dropped them where a Union soldier later found them. A storm smashed the Mongol fleet that was poised for an invasion of the Japanese islands. There are plenty of such examples, at every level of strategy, of the role blind fate has played in the history of warfare. A responsible commander tries to plan for the unexpected. A clever commander actually figures out ways to prevent accidents, or at least minimize the damage. But not even the most brilliant commander can plan for every contingency.
Obviously, friction applies to any human enterprise. Speaking as a veteran of software releases, I only wish people in my industry sometimes had a better appreciation for this principle. Inevitably, release dates slip, but no one ever builds in some padding in the schedule the next time around. Quite the contrary: the trend has been to squeeze releases further together, giving friction an even bigger chance to wreak havoc with deadlines.
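To make the scheduling point concrete, here's a minimal sketch of padding a release plan. The task names and numbers are invented, and the 30% allowance is an arbitrary assumption, not a recommendation:

```python
# Hypothetical release schedule: naive sum of estimates vs. a friction buffer.
tasks = {"feature work": 20, "testing": 10, "release prep": 5}  # days, invented

naive = sum(tasks.values())   # what the optimistic plan promises
buffered = naive * 1.3        # assumed 30% allowance for friction

print(f"Naive plan: {naive} days; with padding: {buffered:.0f} days")
```

The buffer isn't a license to slack; it's an admission that low fuel tanks, wrong turns, and missing coolers have their counterparts in every release cycle.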
Friction can help make better sense of failures. The natural tendency when something goes wrong is to point fingers. Who's the idiot who made the mistake? Usually, there's no convenient idiot; systemic failure is often the culprit. During the Cold War, many people concerned about strategic surprise—particularly the risk of a surprise Soviet nuclear attack—focused on these kinds of systemic failures. The Pearl Harbor disaster didn't stem from a single person's catastrophic mistake. Many pebbles rolling down the hill contributed to the avalanche: it was clear that the Japanese were planning an attack, but where, when, and how were the questions. The commanders at Pearl Harbor, fearing sabotage by Japanese infiltrators, moved aircraft closer together to make them easier to guard—which also made them easier targets en masse for Japanese bombers. A Soviet agent in Japan passed along detailed intelligence about the planned attack, but it was a small strand amid a tangle of information that American authorities were still evaluating. For some sharp analyses of the systemic failures behind Pearl Harbor and other great catastrophes, I recommend Cohen and Gooch's Military Misfortunes and Allison's Essence of Decision.
Friction is almost axiomatic in Clausewitz, so it shouldn’t be too surprising that it has some useful corollaries:
- Keep your strategy as simple as possible. The more moving parts in the strategy, the greater the risk that the whole strategy comes to a halt when one of these cogs slips. The Japanese attack on Midway, for example, had too many task forces moving on a precise schedule to expect all of them to arrive on time.
- Never have a strategy predicated on a chain of ifs. This is a slightly different version of the previous corollary. Sometimes, strategic thinking depends too much on a particular chain of events occurring. Back in the 1980s, when Enver Hoxha, the communist leader of Albania, died, an overheated analysis in The Atlantic Monthly argued that this one small event (no offense intended to Albanians) might lead us to the brink of World War III. If the Soviets saw their prestige at stake; if another independent-minded Albanian took power; if the Soviets had to move their ground forces through Yugoslavia to enforce their will on Albania; if the Yugoslavians decided to halt the Soviet columns; if, if, if. Anyone who has ever drawn a decision tree knows the fallacy of this kind of thinking. Unfortunately, in policy circles, where people worry a great deal about being blamed for catastrophic failures, analysis can get a little unhinged. (Fortunately, the US government didn't see Hoxha's death quite the same way The Atlantic Monthly did.)
- Never have a plan that doesn't permit adjustments. You have to build some flexibility into the plan. When Lee invaded the North in 1863, he knew the Union army would eventually have to fight the Confederates. Lee wanted to make sure the battle came at the right time and place, but he couldn't tell where in advance. Instead, he knew he had to feel his way through the operational level of his strategy—which, of course, depended on good intelligence from his cavalry about the Union army's movements. When J.E.B. Stuart took the Confederate cavalry on another raid around the Union rear, he temporarily robbed Lee of that critical information.
- Have more resources than you think you need. Troops, fuel, ammunition—you'll never feel as though you have enough, so get your hands on more than you expect to use. If you plan too close to the margin of expected resources, you'll be stuck when it turns out you unexpectedly need more. The famous three-to-one rule (if you're attacking, always outnumber your enemy by at least three to one) is, to a large degree, inspired by hard lessons on this subject.
- Always have an exit strategy. Not only might you need to adjust a strategy, but you may have to throw it out altogether. The term exit strategy, of course, is an instant reminder of the Vietnam quagmire. Early in that war, the question usually asked was how to apply the strategy more vigorously, or against a different target (this bridge or that factory in North Vietnam?). The real mistake was not having an escape hatch in case strategies like Operation ROLLING THUNDER were based on false premises.
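The chain-of-ifs corollary lends itself to a little arithmetic: even when each link looks individually plausible, the joint probability of the whole chain collapses quickly, which is exactly what a decision tree makes visible. A sketch, with events loosely echoing the Albania scenario and probabilities invented purely for illustration:

```python
# Hypothetical chain of contingencies; every probability here is invented.
chain = {
    "Soviets see their prestige at stake":    0.5,
    "hardliner takes power in Albania":       0.5,
    "Soviets move forces through Yugoslavia": 0.3,
    "Yugoslavia tries to halt the columns":   0.4,
}

p_chain = 1.0
for event, p in chain.items():
    p_chain *= p   # every additional "if" multiplies the odds down

print(f"Probability the whole chain occurs: {p_chain:.1%}")
```

Four moderately likely ifs already leave the scenario at about a three percent chance, which is why strategies predicated on such chains tend to evaporate on contact with reality.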
One word of caution: you can easily take the fear of the unexpected too far. As I mentioned earlier, you can’t plan for every contingency. You can’t wait until every piece is in place. “The slows”—Lincoln’s derisive term for McClellan’s unwillingness to begin an attack until every soldier and supply wagon was in the proper place—can be as dangerous as recklessness. The ablest commanders are sometimes the people who make the best guesses at how to balance preparation and initiative.
The Iraq invasion, at the theater level of strategy, violated these rules altogether. Operationally, tactically, technically, the invasion was initially a great success, a quick occupation against far less resistance than many expected. (However, segments of the Iraqi army had already planned to melt away into the population to fight the Coalition during the occupation, not the invasion. Shock and awe, therefore, didn’t exactly work as advertised.)
Tragically, the theater-level strategy—invade Iraq, find and eliminate the weapons of mass destruction the Baathists were concealing, install new leaders to replace the “decapitated” Baath party, shake the hands of a grateful populace, and leave—didn't work. Its failure was no big surprise to anyone familiar with even the basics of friction. Too long a chain of ifs. No flexibility built into the occupation plan. No exit strategy. You can't blame some inevitable level of bad luck, missed opportunities, or unexpected turns of events. These were deliberate decisions, at the theater level, to ignore friction altogether.
Anyone reading this post already knows, in literally gory detail, what hasn’t gone according to plan in Iraq. I won’t repeat the details here.