Russia’s invasion of Ukraine and nuclear saber-rattling against the West have revived a debate about nuclear weapons. Last year, when a United Nations treaty to ban such weapons outright entered into force, none of the world’s nine nuclear-weapon states was among the 86 signatories. How can these states justify possessing weapons that put all of humanity at risk?
That is a pertinent question, but it must be considered alongside another one: If the United States were to sign the treaty and destroy its own arsenal, would it still be able to deter further Russian aggression in Europe? If the answer is no, one also must consider whether nuclear war is inevitable.
It’s not a new question. In 1960, the British scientist and novelist C.P. Snow concluded that nuclear war within a decade was “a mathematical certainty.” That may have been an exaggeration, but many believed Snow’s prediction would be justified if a war occurred within a century. In the 1980s, Nuclear Freeze campaigners like Helen Caldicott echoed Snow in warning that the buildup of nuclear weapons “will make nuclear war a mathematical certainty.”
Those advocating the abolition of nuclear weapons often note that if you flip a coin once, the chance of getting heads is 50%; but if you flip it ten times, the chance of getting heads at least once rises to 99.9%. By the same logic, a 1% chance of nuclear war in any 40-year period compounds to roughly 87% over 8,000 years. Sooner or later, the odds will turn against us. Even if we cut the risks by half every year, we can never get to zero.
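The abolitionists’ argument rests on treating each period as an independent trial, so the cumulative risk follows the standard formula 1 − (1 − p)ⁿ. A minimal sketch, using only illustrative figures from the paragraph above:

```python
def cumulative_risk(p_per_period: float, periods: int) -> float:
    """Chance of at least one occurrence over n independent periods."""
    return 1 - (1 - p_per_period) ** periods

# Ten fair coin flips: chance of at least one head.
print(round(cumulative_risk(0.5, 10), 4))    # 0.999
# A 1% risk per 40-year period, compounded over 8,000 years (200 periods).
print(round(cumulative_risk(0.01, 200), 3))  # 0.866
```

Note that the risk never reaches exactly 100%, but under the independence assumption it creeps toward certainty as the horizon lengthens, which is the advocates’ point.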
But the coin-flip metaphor is misleading where nuclear weapons are concerned, because it assumes independent probabilities, whereas human interactions are more like loaded dice. What happens on one flip can change the odds on the next flip. There was a lower probability of nuclear war in 1963, just after the Cuban Missile Crisis, precisely because there had been a higher probability in 1962. The simple form of the law of averages does not necessarily apply to complex human interactions. In principle, the right human choices can reduce probabilities.
The likelihood of nuclear war rests on both independent and interdependent probabilities. A purely accidental war might fit the model of the coin flip, but such wars are rare, and any accident might remain limited. Moreover, a limited accidental conflict could trigger corrective actions that reduce the probability of a larger war. And the longer the period, the greater the chance that things may have changed. In 8,000 years, humans may have much more pressing concerns than nuclear war.
We simply do not know what the interdependent probabilities are. But if we base our analysis on post-World War II history, we can assume that the annual probability is not in the higher range of the distribution.
During the Cuban Missile Crisis, US President John F. Kennedy reportedly estimated the probability of nuclear war to be between 33% and 50%. But this did not necessarily mean unlimited nuclear war. In interviews with participants in that episode on its 25th anniversary, we learned that, despite the massive superiority of the US nuclear arsenal, Kennedy was deterred by even the slightest prospect of nuclear war. And the outcome was hardly an unalloyed American victory; it involved a compromise that included the quiet removal of US missiles from Turkey.
Some people have used the mathematical-inevitability argument to push for unilateral nuclear disarmament. Inverting the Cold War slogan, future generations would be better off red than dead. But nuclear knowledge cannot be abolished, and coordinating abolition among nine or more ideologically diverse nuclear-weapon states would be extremely difficult, to say the least. Unreciprocated unilateral steps could embolden aggressors, increasing the odds of an unhappy endgame.
We have no idea what utility and risk acceptance will mean to distant future generations, or what people will value in 8,000 years. While our moral obligation to them compels us to treat survival very carefully, that task does not require the complete absence of risk. We owe future generations roughly equal access to important values, and that includes equal chances of survival. That is different from trying to aggregate the interests of centuries of unknown people into some unknowable sum in the present. Risk will always be an unavoidable component of human life.
Nuclear deterrence is based on a usability paradox. If the weapons are totally unusable, they do not deter. But if they are too usable, nuclear war with all its devastation might occur. Given the usability paradox and the interdependent probabilities related to human interactions, we cannot seek an absolute answer to what constitutes “just deterrence.” Nuclear deterrence is not all right or all wrong. Our acceptance of deterrence must be conditional.
The just war tradition that we have inherited over the centuries suggests three relevant conditions that must be met: a just and proportionate cause, limits on means, and prudent consideration of all consequences. I derive five nuclear maxims from these conditions. In terms of motives, we must understand that self-defense is a just but limited cause. As for means, we must never treat nuclear weapons as normal weapons, and we must minimize harm to innocent people. And regarding consequences, we should reduce the risks of nuclear war in the near term and try to reduce our reliance on nuclear weapons over time. A bomb in the basement involves some risk, but not as much risk as bombs on the front lines.
The war in Ukraine has reminded us that there is no way to avoid uncertainty and risk. The goal of reducing (not abolishing) the role of nuclear weapons over time remains as important as ever. Richard Garwin, the designer of the first hydrogen bomb, calculated that, “If the probability of nuclear war this year is 1%, and if each year we manage to reduce it to only 80% of what it was the previous year, then the cumulative probability of nuclear war for all time will be 5%.” We can live moral lives with that probability.
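Garwin’s figure can be checked directly. With a first-year risk of 1% and each subsequent year’s risk at 80% of the previous year’s, the all-time probability of at least one war is 1 − Π(1 − 0.01 · 0.8ᵏ), which converges to about 5%. A quick numerical check (the 500-year horizon is just a cutoff by which the remaining terms are negligible):

```python
def all_time_risk(p0: float = 0.01, decay: float = 0.8, years: int = 500) -> float:
    """Probability of at least one event when the annual risk starts
    at p0 and shrinks to `decay` times its value each year (Garwin's setup)."""
    survive = 1.0  # probability that no event has occurred yet
    for k in range(years):
        survive *= 1 - p0 * decay ** k
    return 1 - survive

print(round(all_time_risk(), 3))  # 0.049, i.e. roughly 5%
```

Unlike the independent coin flips, a steadily falling hazard rate caps the cumulative risk well below certainty, which is what makes risk reduction, rather than abolition alone, a coherent goal.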
Copyright: Project Syndicate, 2022.