Normalizing Nukes Was Our First Mistake
Re Iran, remember that strategic logic propels any country facing nuclear-armed rivals to consider a nuclear deterrent of its own. As Joe Strummer once said, "it's all of us or none of us, mate"
The New York Times last week offered a timely reminder of how and when America’s mainstream thinks about the consequences of nuclear warfare. It was referencing the film The Day After, “aired on ABC in 1983 and (which) was watched by more than 100 million people, about 67 percent of the American viewing public that evening. The film, shot in Lawrence, Kan., depicts the very real-feeling aftermath of a nuclear attack.” And it was timely, I suppose, because the tiresome topic of Iran and nuclear weapons is once again in the news.
Simply showing Americans in a calm and rational manner what their own towns and cities would have lived through in the event of an exchange of nuclear strikes with the Soviet Union had a profound effect on their thinking in the mid-1980s. The most notable epiphany prompted by that film was that of then-President Ronald Reagan: He had been a Cold War nuclear hawk whose rhetoric and policy moves ramped up the possibility of a nuclear confrontation with what he dubbed “The Evil Empire”; now, having seen what nuclear war would mean in the American heartland, he wrote that he was “depressed”. More significantly, he acknowledged that its apocalyptic vision had turned him – to the alarm of the Cold Warriors around him – into a nuclear dove who prioritized arms-reduction agreements with his Soviet counterpart Mikhail Gorbachev.
The Day After, of course, wasn’t the first film attempting to warn a NATO public about what they would live through (or probably not) in a nuclear war via a vivid dramatic enactment: The most memorable, for me, was Peter Watkins’ 1965 Oscar-winning docudrama “The War Game”, an even more harrowing portrayal of the effects of a limited Soviet nuclear strike on a mid-size British city. It had been made for the BBC, which then declined to air it – sounds familiar, eh? But it had extensive festival showings, winning critical acclaim, and became an indispensable organizing tool for the Campaign for Nuclear Disarmament, which built the film’s cult status. (It was only 20 years later that the film was finally shown to the British TV public.)
But Oscar-winning documentaries were hardly public fare in the mid 1960s, particularly those that challenged Cold War orthodoxy. That took “The Day After.”
Of course, the fact that it took a fictional representation of the effect of a nuclear attack on a Midwestern city to enlighten the U.S. public on the effects of detonating a nuclear weapon over a civilian population center is also an indictment of the devastatingly effective suppression of a public reckoning with the horror the United States had unleashed on Hiroshima and Nagasaki in August 1945.
If Americans had been forced to confront the reality of what their atomic weapons had done to those cities – killing more than 200,000 people, mostly civilians, and many, many thousands more over the years as a result of the sickness created by what even a number of top U.S. generals feared had been a war crime — they might not have needed a fictional tale of the nuking of Lawrence, Kan. to alert them to the horrors they should expect in a nuclear confrontation with an enemy capable of hitting back in kind.
Starting immediately after the bombing of Hiroshima and Nagasaki, the U.S. authorities had aggressively suppressed attempts to publicly air images of the horrific impact of America’s nuclear weapons, as historian Greg Mitchell has documented. And that effort to turn America’s eyes away from what its leaders had done in 1945 was still in force 50 years later.
As I’ve written previously on this platform,
I was gobsmacked in 1995, as a recent arrival in the United States, by the bipartisan congressional effort that blocked a planned Smithsonian Institution exhibit on the 50th anniversary, which was branded “anti-American propaganda” because it intended to include images of the damage wrought by the Hiroshima bomb. House Speaker Newt Gingrich, who led that effort, later called it “a fight, in effect, over the reassertion by most Americans that they are sick and tired of being told by some cultural elite that they ought to be ashamed of their country.”
Sound familiar? Of course it does. The evasion of any moral or legal reckoning with Hiroshima and Nagasaki is very much of a piece with more recent efforts to suppress a similar reckoning with slavery and American racism. But let’s be clear, the denial of U.S. war crimes abroad is very much a bipartisan business: The U.S. Senate unanimously passed a resolution condemning the Smithsonian plan, declaring that the bombing of Hiroshima had helped “bring the war to a merciful end”, thereby “saving the lives of Americans and Japanese”. (Needless to say, the U.S. Senate would not care to canvass Japanese public opinion on this claim.)…
Western discourse recuses itself from even discussing Western violence against civilian populations elsewhere by falling back on its moral certitudes: Eggs are broken in the course of making omelettes, after all, and the assumption of virtue in Western military interventions abroad puts it beyond reproach, reducing the deaths of the uncounted people of color — whether in Japan or Viet Nam, Iraq or Afghanistan — to “collateral damage” inflicted in pursuit of a higher purpose.
In my 30 years in the United States, I’ve never heard a serious discussion among the nation’s leaders about under what circumstances, and against which targets, the U.S. might again mount nuclear attacks. Until the U.S. reckons with the question of whether and how war crimes were perpetrated in Hiroshima and Nagasaki, we’re in a dangerous zone of denial. Truman’s fateful decision to unleash nuclear terror in 1945, and the lack of national introspection over that decision ever since, make it all too easy to imagine a repeat. One piece of hand luggage that never left President Obama’s side during his May 2016 visit to Hiroshima was the U.S. nuclear “football” that would allow him to unleash 22,000 Hiroshimas.
By normalizing nuclear weapons, the U.S. triggered a cascade of nuclearization that has steadily increased the danger of further Hiroshimas, and further Days After. The Soviets had by 1949 developed their own nuclear weapons to deter the U.S. from delivering a Hiroshima-type strike against them; the desire for strategic autonomy from the U.S. prompted the UK (1952) and France (1960) to follow suit, while the same desire by the Chinese Communist Party leadership to free itself from any last vestiges of Moscow’s tutelage produced a Chinese bomb in 1964.
There were five nuclear-armed states by the time the Nuclear Non-Proliferation Treaty went into force in 1970, establishing enduring mechanisms for monitoring civilian nuclear programs to ensure no weaponization could take place, on the stated premise that a freeze on further weapons proliferation would give the existing five the space to disarm, as required by the treaty. Yeah, exactly… Good luck with that.
Israel declined to sign the NPT; it is believed to have developed its own nuclear arsenal starting around 1967 – with help from France. And Israel helped its sibling regime, apartheid South Africa, develop its own nuclear weapons by the late 1970s.
China’s nuclear capability prompted its arch-rival India to demonstrate the same capacity in 1974, and it was no surprise when Pakistan later followed suit. Then came North Korea, which, like China before it, did not trust its regime-survival to erstwhile communist allies, developing its own nuclear trump card in the early 1990s.
Iraq developed a secret nuclear weapons program in the late 1980s, fearing it would be overrun by the massed infantry of Iran in the course of the brutal eight-year war Saddam Hussein had launched, though it never managed to assemble a weapon. And Iraq’s efforts prompted its arch-enemy, Iran, to begin researching nuclear weapons development.
The pursuit of nuclear weapons has cascaded through a lattice-work of regional strategic rivalries, precisely because they’re a short-cut equalizer of conventional military imbalances. That makes them the ultimate guarantor of a regime’s survival.
South Africa’s apartheid regime dismantled its nukes under IAEA supervision in 1994, but that was a result of it having agreed to cede power. And, of course, the nuclear deterrent didn’t prevent the Soviet Union from collapsing, but that collapse was internally driven, not externally imposed — and the nuclear deterrent was inherited by its Russian successor state. (Ukraine clearly has good reason to lament its decision to transfer the Soviet nukes that had been stationed on its soil to Russia in the mid ‘90s.)
So, why should it surprise anyone that Iran might consider the nuclear option? It has two nuclear-armed historic rivals on its borders (Russia and Pakistan, and also nuke-coveting Iraq until 2003) and is confronted by two belligerent nuclear powers with long-term ambitions to topple its regime (the United States and Israel). And as I noted a few years ago:
Iran’s nuclear activities fit the pattern of post-Hiroshima global statecraft: Nuclear weapons have never been an end in themselves; instead they provide the ultimate deterrent. US politicians from Trump to Hillary Clinton casually threaten to “obliterate” Iran, a nod to US nuclear capability. Iran knows that no power can seriously contemplate an existential attack on a regime capable of responding in kind.
The attraction of a nuclear deterrent for any regime with more powerful enemies is obvious. “The Iranians had good reason to acquire nuclear weapons long before the present crisis, and there is substantial evidence they were doing just that in the early 2000s,” realist US foreign policy scholar John Mearsheimer wrote recently in The New York Times. “The case for going nuclear is much more compelling today. After all, Iran now faces an existential threat from the United States, and a nuclear arsenal will go a long way toward eliminating it”…
And so Iran achieved a diplomatic innovation: It never actually began to build a nuclear weapon, but it demonstrated sufficient proof of its ability to do so that it was able to accrue many of the gains that other regimes had won only once they had built and tested atomic bombs. Iran’s capacity to produce bomb materiel compelled the key international powers to recognize a regime that many would have preferred to shun.
Iran didn’t start the nuclear game, and its conservative (as in status quo-survival) rationale for playing the game is unsurprising. But it’s an increasingly dangerous game, particularly because restraint requires adherence to a set of established, but mostly uncodified norms. And we’re living in an era when so many established geopolitical norms have been trashed by those who have the power to do so.
Far from imposing an escalatory ceiling on regional conflicts as they did at the height of the Cold War, recent events in India/Pakistan and Russia/Ukraine should be sounding alarms. The world has normalized nukes, and failed to resolve most of the geopolitical rivalry that risks their use.
Rather than pursue arms reduction and denuclearization, the U.S. is once again floating a one-time Reagan fantasy, redubbed a “Golden Dome”, that would prevent Lawrence, Kan. from ever living through the scenario depicted in The Day After. But a Golden Dome could be a high-tech Maginot Line (look it up!) in an era where the travel options for weapons of great destructive power are expanding as quickly as technology. As long as nuclear weapons exist, our world is in great, great danger — danger of exponentially accelerating the sorts of extinction events that we’ve invited with climate change. The War Game, and The Day After, may be more relevant than we’d like to admit.