Arguments

Should We Drop the War Metaphor?

Why we shouldn’t assume a war footing in addressing big societal problems, like COVID, for which we can’t define the terms of victory.

By Stuart Whatley and Nicholas Agar

Tagged: COVID-19, Language, Pandemic, War

Although he abhorred war, Ernest Hemingway accepted that “once you are forced into it, for whatever reason it may be, you have to win it.” Left unspecified is what winning a war actually entails. We know when World War II was won because the victors still commemorate the occasion with Victory in Europe Day (May 8) and Victory Over Japan Day (September 2), marking the respective surrenders of the Axis powers. But there has been no such closure in our metaphorical wars—on terror, drugs, poverty, cancer—and there is growing evidence to suggest that our struggle against COVID-19 will follow a similar pattern.

Short of a formal declaration of war on the coronavirus, White House chief medical adviser Anthony Fauci suggests that “You should think of this, in some respects, like a war. The enemy is the virus.” Viewing COVID-19 in these terms is certainly more responsible than dismissing it as a “little flu,” as Brazilian President Jair Bolsonaro has done. Yet the idea that we are at war with the virus raises difficult questions.

Owing to the emergence of new viral variants, the persistence of the virus in non-human hosts and reservoirs, and the time it will take to immunize the global population, many health experts have concluded that we will neither achieve global herd immunity nor eradicate the disease. Instead, it will eventually be managed much like the seasonal flu or the “common cold.” Though we have eradicated other viruses like smallpox through war-like mobilizations, we have not met with the same success against HIV/AIDS, influenza, or the coronaviruses, rhinoviruses, adenoviruses, and enteroviruses that cause colds.

If there will be no “Victory Over COVID Day,” where does that leave us? On the one hand, when a nation declares war, even a metaphorical one, its citizens are given to understand that they bear a commitment to a common cause, and that they may be called upon to make sacrifices on its behalf. In the face of a deadly contagious pathogen or catastrophic climate scenarios, this sense of solidarity and vigilance is urgently needed. On the other hand, our endless wars against terror, drugs, poverty, and cancer point to the pitfalls of assuming a war footing without first defining the terms of victory.

Consider the “War on Poverty,” which U.S. President Lyndon B. Johnson launched in 1964 with the acknowledgement that “It will not be a short or easy struggle, no single weapon or strategy will suffice.” Nonetheless, he promised that “we shall not rest until that war is won.” Since poverty still exists in America, the war clearly has not been won. But has it been lost?

Like most policy questions in America, the answer depends on which party you ask. Marking the fiftieth anniversary of Johnson’s declaration in 2014, those on the right described the war as a failure. After “U.S. taxpayers have spent over $22 trillion on anti-poverty programs,” noted Rachel Sheffield and Robert Rector of the conservative Heritage Foundation, “progress against poverty, as measured by the U.S. Census Bureau, has been minimal, and in terms of President Johnson’s main goal of reducing the ‘causes’ rather than the mere ‘consequences’ of poverty, the War on Poverty has failed completely.”

But as scholars from the left-leaning Center for American Progress pointed out, “While many of the programs that emerged from this national commitment are now taken for granted, the nation would be unrecognizable to most Americans if they had never been enacted.” Thus, to label the effort a failure “is to say that the creation of Medicare and Head Start, enactment of civil rights legislation, and investments in education that have enabled millions of students to go to college are a failure.”

Such differences of opinion are perhaps inevitable in a metaphorical war, especially when the very definition of victory is contested, and when even an established victory proves perishable. Saving one community—or even one generation—from poverty is not the same thing as ending poverty. Around the world, tens of millions of people were lifted out of poverty during the first two decades of the twenty-first century. But following the global recession brought on by the COVID-19 pandemic, the World Bank warned that many of these gains were thrown into reverse.

As Susan Sontag observes in Illness as Metaphor, “The military metaphor in medicine first came into wide use in the 1880s, with the identification of bacteria as agents of disease.” But particularly after the Allied victory in WWII—the West’s “finest hour”—war came to be the go-to mode of engagement for all big societal problems, just as the success of the original moonshot made it the go-to metaphor for ambitious technological projects.

The trouble with our metaphorical wars is that they seek definitive solutions to fluid problems. Such is the nature of what the sociologist Zygmunt Bauman called “liquid modernity.” The common denominator across modern life, he argued, is a deep-seated “fragility, temporariness, vulnerability and inclination to constant change.” Whereas those in the early twentieth century thought that “‘to be modern’ meant to chase ‘the final state of perfection,’ … now it means an infinity of improvement, with no ‘final state’ in sight and none desired.” In the wars on poverty, COVID-19, cancer, terror, drugs, and so forth, it is not just the enemy that is fluid and ever-changing. So, too, are we.

When confronting elusive, amorphous, or ever-evolving adversaries, our own attitudes are as important as the weapons or resources at our disposal. While new therapies and vaccines have an obvious and indispensable role to play in any return to normalcy after the COVID-19 pandemic, some of the heavy lifting will have to be done in our own heads. As typically happens when we place all of our hopes in a technological solution, our initial expectations are now being disappointed. The likely persistence of the virus, despite all of our cutting-edge interventions, means that we will need a new perspective to close the gap between what was hyped (“the cavalry is coming”) and what was actually delivered.

We have done this many times before. Through a process known as hedonic normalization, we naturally accommodate for the less-than-perfect aspects of our existence over time. Traffic jams and long commutes are features of modern life that we would eagerly wish away if we could. But they generally don’t stop us from going to work each morning. We tacitly accept these conditions in the same way that earlier generations accepted the scatological byproducts of horse-drawn transportation.

Hedonic normalization differs from the more familiar experience of hedonic adaptation. The latter describes how individuals respond to positive or negative events in their lives, such as lottery winnings or job losses. The idea is that eventually, one reverts to one’s personal hedonic norm or “set point.” Hedonic normalization, by contrast, occurs as entire populations adjust to less-than-optimal aspects of their shared existence. Reading the classicist Mary Beard, one learns how ancient Pompeians got used to the horrid smells of human-refuse-strewn streets. Similarly, we today are likely to move on as the coronavirus persists; while it generates problematic new variants, we will generate increasingly effective antiviral prophylactics alongside natural and vaccine-induced immunities.

When the earliest novel coronavirus infected human beings with what we’ll call “COVID-100,000 BCE,” we can assume that it caused a great deal of misery, owing to the lack of natural immunity. The populations that first experienced such a virus had neither molecular biologists nor statistically trained epidemiologists to leave written records of its effects. As The Economist notes, the “Russian flu” pandemic of the 1890s, which killed one million people and caused lasting nerve disorders, is now thought to have been a coronaviral precursor to some of today’s “colds.”

With enough time, COVID-19 will presumably follow a similar pattern. The late conservative radio host Rush Limbaugh issued a dangerous falsehood when he said in February 2020 that “the coronavirus is the common cold, folks.” Yet there will come a time when that statement can be uttered without controversy. And thanks to the vaccine rollout and the forthcoming arrival of second- and third-generation vaccines and prophylactics, that time will come sooner than it otherwise would have.

But, again, normalization means that the war will have to end without a V-Day. If we were to remain at war indefinitely, we would implicitly be maintaining a state of exception in which the coronavirus commandeers attention from even more pressing problems. What we collectively prioritize is limited not only by resources, but also by attention.

Discussing climate change, the psychologist Elke U. Weber speaks of a “finite pool of worry.” We have only so much time and mental capacity with which to focus on issues that might warrant our attention. An entire society’s pool of worry will be much larger than what we can imagine as individuals; but it is still finite. And “unlike money or other material resources,” Weber points out, attention can be neither saved nor borrowed.

Precisely when to declare an end to the war should be a matter for public debate. Most likely, the question will be decided in hindsight. As with the War on Cancer, which turns 50 this year with no end in sight, the War on COVID-19 will have been won when it is no longer appropriate for us to view the disease with behavior-altering dread.

One way to think about metaphorical wars is to view the struggle through the lens of tragedy: that is, the idea that another person’s misfortune bears directly on one’s own fate. For Nietzsche, the best classical Greek tragedies were those in which the distinction between individual and collective experience collapses under a universal sense of existential tension. While each viewer would arrive at the theater as a unique individual, all would then be joined in a single “higher community” when they experienced the same raw emotions at the same time for the same reason.

The same feeling justifies and motivates metaphorical wars, and its gradual dissipation can be taken to mean that the war has been won. Hence, the war on COVID will have ended when news of someone dying from the virus starts to sound like an improbable misfortune afflicting some distant, unlucky soul: when individual and collective tragic experiences no longer feel coterminous, as is already the case with annual deaths from seasonal influenza (650,000) and traffic accidents (1.35 million).

By the same token, the lack of a shared tragic consciousness can prevent concerted action against unresolved threats, as in the case of climate change, which for many people has been all too easy to ignore. Missing so far in that fight is the popular WWII interrogative: “Don’t you know there’s a war on?”

But with the increasing frequency and severity of extreme weather events, climate change will acquire a more widespread, tragic character, and calls for a wartime mobilization will gain traction. The Climate War is one that we have been forced into, and therefore have to win. But as with our other metaphorical wars, its terms of victory will be contested. Different countries will have vastly different experiences and means of adaptation, implying that the war will be much more harrowing for some than for others. Many countries have now committed to reducing their carbon-dioxide emissions to net zero by 2050. But no one yet knows precisely how this will be done.

That makes the Climate War a lot like the War on Cancer. Writing in the 1970s, Sontag noted that, in signing the National Cancer Act of 1971, U.S. President Richard Nixon wanted to match John F. Kennedy’s promise to put Americans on the moon with “the promise to ‘conquer’ cancer.” But the War on Cancer, she observed, was not geared toward controlling the “industrial economy that pollutes” our ecosystems and bodies. Its sole objective, rather, was to finance the discovery of “the cure,” a species-level outcome that has so far proven impossible even to define, let alone achieve.

Plenty of influential people would prefer to wage the Climate War mostly with unproven technologies, so as to preserve today’s consumption patterns and material standards of living. Some of the imagined techno-fix scenarios would offer a more decisive victory than others. Using geoengineering to deflect solar radiation would merely paint over the problem, since it would do nothing to reduce the amount of greenhouse gases in the atmosphere or the acidification of the oceans. By contrast, a major breakthrough in direct-air capture (DAC) technology would allow for carbon dioxide to be sucked out of the atmosphere, and thus for the burning of fossil fuels to continue unabated.

In this case, DAC would function much like COVID-19 vaccines did in countries that failed to stanch the spread of the virus. Had the virus been much deadlier, and had vaccines not arrived so quickly, the story would have been more tragic indeed. While technological breakthroughs absolutely will be necessary for mitigating and adapting to climate change, we should consider what our heavy reliance on these just-in-time solutions says about us as a “higher community.”

The most meaningful struggles, wrote Ursula Le Guin in her own critique of metaphorical wars, are the ones that must be pursued within ourselves. These require “not a war but a search and a discovery,” culminating not with “the end of a battle but the beginning of a life.” Victory is always welcome, but it does not always bring glory. How we confront climate change matters not just for our economic survival but also for our ethical development and moral legacy.


Stuart Whatley is Senior Editor at Project Syndicate. You can follow him on Twitter @StuartWhatley.


Nicholas Agar is Distinguished Visiting Professor at Carnegie Mellon (Australia). His most recent book, How to be Human in the Digital Economy (MIT Press, 2019), was translated into Italian as Non Essere Una Macchina (LUISS University Press, 2020).
