This article (it’s much too long to call a column) got its start as my notes for a November 20, 2001 presentation to the Centers for Disease Control and Prevention (CDC), United States Department of Health and Human Services, in Atlanta, GA. It includes more information than I had time for in the presentation, so I decided to revise it as an article and post it to my web site.
When I originally prepared for my presentation, we were in the middle of the 2001 anthrax crisis. Because my client was CDC, I focused on CDC examples, especially bad examples. Much has changed since then. The crisis has passed, at least for now. Also, I think CDC’s performance – which wasn’t bad to start with – has greatly improved. (I’d love to take credit for the improvements, but CDC was learning from experience and from other advisors as well.) I have updated some references, and I’ve cut most of the CDC-focused bad examples, especially where they are no longer descriptive of CDC’s approach. Finding the right verb tenses has been hard. I’ve used the past tense for references to the 2001 anthrax attack, and the present for more general discussions of how to handle risk communication about bioterrorism.
Obviously, the opinions expressed here are not necessarily those of CDC or the U.S. government. If you’re reading this in 2002 or thereafter, they may not even be my opinions any more.
My wife, Dr. Jody Lanard, has long been an important collaborator for me; we discuss nearly everything I write both before and after I write it. But issues of terrorism have propelled me into deeper psychological waters than risk communicators usually need to explore. Jody, a psychiatrist, has been my guide in these waters. Large parts of this essay simply couldn’t have been written without her. I am also grateful to the many people at CDC who gave me feedback on my presentation, my notes, and my advice; and to Regina Lundgren, who reviewed the notes at my request and provided valuable suggestions.
To make sense of my recommendations for risk communication about bioterrorism, I must first outline two distinctions fundamental to my approach: hazard versus outrage, and public relations versus stakeholder relations. Bioterrorism is atypical with respect to both of these distinctions, in ways that make risk communication about bioterrorism different from most risk communication. This introductory section will also explain a fundamental concept in most risk communication: the seesaw.
Hazard versus Outrage
Risk communication is at least two quite different tasks. When a risk is small in the experts’ judgment and people are excessively alarmed, the job of risk communication is to reassure them. When the experts consider the risk substantial and people are insufficiently concerned, the job is to warn them.
The assumption that underlies both tasks is that people are likely to get the risk wrong. If you rank a list of potentially risky situations twice – once by how serious the experts consider them, once by how much they concern the public – the correlation between the two rankings is a disappointingly low 0.2, accounting for a mere four percent of the variance. In other words, the risks that do the damage and the risks that arouse the concern are very nearly unrelated. Risk communication tries to improve this correlation.
It helps to understand why the correlation is so low. The factors that determine the public’s response to a risk, it turns out, are genuine characteristics of that risk. But they have very little to do with its associated mortality, morbidity, or ecosystem damage. These are what the experts respond to. The public responds far more to factors like the risk’s voluntariness, control, dread, familiarity, and memorability; and to the extent to which the institutions associated with that risk seem trustworthy and responsive. I usually call these “the outrage factors.” I distinguish two aspects of a risk: its hazard, which determines the experts’ response; and its outrage, which determines the public’s response. When outrage is high, people will be concerned, independent of hazard. And when outrage is low, they won’t, again independent of hazard.
It follows that when risk communicators are trying to warn people, one of the main things they do is try to increase the outrage. And when risk communicators are trying to reassure people, reducing the outrage is a key task.
Public Relations versus Stakeholder Relations
Public relations assumes an audience that is apathetic but credulous. Because the audience is apathetic, grabbing its attention isn’t easy; one of the key skills of PR practitioners is to concoct eight-second sound bites that the newscasts will carry and the viewers will watch. Because the audience is credulous, that eight-second sound bite is like a free kick in soccer; the audience will absorb what it watches without resistance.
Stakeholder relations makes the opposite assumptions. Getting stakeholders’ attention isn’t a problem. They come to a public meeting and stay till midnight. But while stakeholders are attentive, they are no longer credulous. Instead, they’re resistant — worried, skeptical, even hostile. There are no free kicks in stakeholder relations.
It follows that the two fields have evolved very different communication strategies. (There are PR people who are good at both, of course; I’m comparing two approaches, not necessarily two groups of professionals.) Public relations means deciding what your key message is, then boiling it down to something short and memorable. Everything aims at reaching people who don’t especially want to be reached, at deciding how to get the most out of your limited time. When critics raise problems, PR tries to deflect, to get back “on message” as fast as possible, to sell your strengths. Stakeholder relations, on the other hand, means addressing the resistance. Stakeholder relations people don’t need to be skilled at getting an audience’s attention; they need to be skilled at diagnosing and then assuaging its worry or skepticism or hostility. The key tools of stakeholder relations are things like acknowledging problems, apologizing for missteps, and sharing control. For stakeholder relations, figuring out what to say is a matter of deciding how to address people’s doubts. Ignoring or minimizing their doubts is disastrous.
Almost by definition, insufficiently concerned people are apathetic but credulous – so alerting people is usually a kind of public relations. Excessively concerned people are attentive but resistant – so reassuring people is a kind of stakeholder relations.
How Is Bioterrorism Different?
In the anthrax attacks of late 2001, these two distinctions (hazard versus outrage and PR versus stakeholder relations) played out in unusual ways – not unique, but unusual enough to require adjustment to the conventional strategies of risk communication. I suspect the same will be true of future bioterrorism risks, so the differences are worth exploring.
First of all, the distinction between public relations and stakeholder relations virtually disappeared. Usually large groups of people are apathetic, and public relations is the skill required to reach them. If you’re dealing through the media, you’re doing PR. Stakeholder relations is reserved for much smaller groups of involved people, the ones who come to meetings or write you letters. But once in a while something happens so engrossing that the mass public turns into a stakeholder public. It can be a fairly trivial event, such as President Clinton’s adventures with Monica Lewinsky. Or it can be a huge event, like September 11.
Any use of biological agents as weapons of mass destruction – in fact, anything even close to that – will instantly turn the public into stakeholders. Even comparatively unconcerned people will be concerned enough to pay attention, watching for signals that they ought to hike their level of concern. Most people will be vigilant. Many will be hypervigilant, spending their days and even their nights with CNN. And the people who aren’t paying attention may not be apathetic at all, but something much more complicated: too terrified to watch, or shocked into denial.
As an information source at such a moment, you will find yourself talking to millions of people via the media – but they will be reacting like the nearest neighbors of a Superfund site upset about what looks like a cancer cluster. You’re on television, not at a meeting; but the audience is nonetheless attentive and skeptical, not apathetic and credulous.
This fact has an important corollary. Unless they adjust for the unusual situation, PR people may give inapplicable advice during a bioterrorist attack. And politicians may be misled by their own finely honed PR skills.
What happens to the hazard-versus-outrage distinction during a bioterrorist attack? Usually, the range of possible reactions to a risk runs from apathy at one extreme to outrage at the other extreme. “Outrage” includes fear, anger, concern, etc. These emotions are usually experienced without distortion; outraged people know they’re outraged, and know what they’re outraged about. But sometimes – and a bioterrorist attack is surely one of those times – the level of concern moves beyond normal outrage. One possible version of this “extreme beyond the extreme” is panic. A more common version (because panic is relatively rare) is denial. Risk communicators don’t usually need to think too much about how to address panic and denial; apathy and outrage are our daily adversaries. But panic and denial – especially denial – are relevant here.
Complicating things further is the phenomenon of projection. When outrage about Issue X gets projected onto Issue Y, you can’t address Y effectively without finding a way to get X into the room. In late 2001, the handful of anthrax-laden letters were Y; the potential for a much bigger anthrax attack, or a smallpox attack, or a terrorist attack of a nonbiological nature, was X.
In the case of bioterrorism, outrage is also complicated by other emotions that can masquerade as outrage. These other emotions fuel the outrage, and unless they are addressed, the outrage can’t be addressed. Injured self-esteem is one such emotion – for example, guilt that I can’t adequately protect my children; or hurt that, as an American, I am so hated by others they want to kill me. These ego injuries are psychologically intolerable, and so are projected; in late 2001 they looked like excessive concern about anthrax. Another reaction that complicates and fuels the outrage at bioterrorism is the opposite of ego: empathy. Much of what looked like fear of terrorism in the wake of September 11 was actually much closer to shared misery. It’s not that we expected to get killed by terrorists; we expected others to get killed by terrorists, and we thought we would have to watch it on CNN with our families. This empathic identification explains why many were not comforted by the statistics about how many pieces of mail aren’t contaminated with anthrax. United we stand; united we identify. My mail carrier, myself.
Because bioterrorism is not a typical risk communication issue, I am less confident than usual about my recommendations. Even at best, risk communication is more art than science, though it is more science than it was twenty years ago. I have been struggling since September 11 to adjust familiar paradigms to new situations. I’m not sure I’ve got it right yet.
The Seesaw of Risk Communication
Whenever people are ambivalent about something, they resolve the ambivalence by favoring whichever side is neglected elsewhere in the communication environment. I call this “the seesaw of risk communication.” If you focus on A, we focus on B; if you focus on B, we focus on A. The problem is that communicators may not realize they’re playing seesaw; they may suppose the game is follow-the-leader. So they emphasize the half of the ambivalence they hope the audience will emphasize – and end up preempting the seat on the seesaw they should have left empty.
There are several seesaws relevant to anthrax and bioterrorism. The most fundamental one is the reassurance seesaw. If you tell us not to worry, and seem not to be worried yourself, that makes us worry more; if you’re worried, we can relax. Blame is another relevant seesaw; institutions that blame themselves are less likely to get blamed by others. Some other bioterrorism seesaws will come up when I get to my recommendations.
Short-term use of the seesaw means riding the side you don’t want your audience to ride. If you want to calm our fears, for example, you should express those fears for us, so that we will feel less compelled to insist on them. But the long-term strategy is more subtle. Long-term, you move to the fulcrum of the seesaw, thereby moving your audience to the fulcrum as well — so that, with your help, we are better able to tolerate the ambivalence. The risk communication seesaw is counterintuitive, but whenever people are ambivalent, the game is seesaw. Reassuring us makes us feel shunted aside; you’re busy and calm, and we’re worried and useless. It leaves us alone with our fears. Instead, you need to share our fears and help us manage our fears – not urge us to abandon them.
Based on these principles, I have a list of 26 recommendations for risk communication about a bioterrorist attack. I will oscillate between well-established findings and more speculative extrapolations to what’s unusual about bioterrorism. I’ll start with some findings I’m relatively confident about, and I’ll warn you when I move into uncertain territory.
1. Don’t over-reassure.
This first recommendation derives directly from the seesaw. People are ambivalent about bioterrorism risks. Bioterrorism is high-outrage: It is (among other outrage characteristics) catastrophic, unknowable, dreaded, in someone else’s control, morally relevant, and memorable. Yet people recognize that their personal risk, statistically, is quite low so far. Hence the ambivalence. So reassuring them – riding the confident seat on the seesaw – backfires; it forces the public onto the worried seat.
The paradoxical intervention is the one that works. Tell people how scary the situation is, even though the actual numbers are small. And watch them get calmer.
A stunning example of this principle at work has been the Risk Management Program of the U.S. Environmental Protection Agency. Under the RMP regulation, chemical plants and similar manufacturing facilities were required (in essence) to figure out the worst possible accident they could have – and then tell their neighbors about it. A worst-case scenario is by definition high-magnitude; almost by definition it is also low-probability. These two truths became the two seats on the RMP seesaw. Companies that insisted the risk was low-probability, “so unlikely it’s not worth worrying about,” found their neighbors insisting worriedly on its high magnitude. Often they ended up in contentious negotiations over what prevention and preparedness steps they were willing to take. Companies that understood the seesaw, by contrast, kept their focus on the risk’s high magnitude: “If this happens and this happens and this happens, all on a day when the wind isn’t blowing and the fire department’s on strike, just look how many people we could kill!” After a stunned half-minute staring at the plume map, someone would raise his or her hand and ask, “But isn’t this really unlikely?” “Well, yes,” the smart company replied. “But just look at how many people we could kill!” In well under an hour, the typical community audience would pile onto the calm seat of the seesaw, uniting behind the principle that the company should grow up and stop wasting everybody’s time with these vanishingly unlikely scenarios.
Even if over-reassurance worked (which it doesn’t), an over-reassured public isn’t your goal. You want people to be concerned, vigilant – even hypervigilant, at least at first. You want accurate, calm concern. Over-reassurance is not the way to get that.
2. Put the “good news” in subordinate clauses.
The first recommendation doesn’t mean that you shouldn’t give people reassuring information. Of course you should! But don’t emphasize it. Especially don’t emphasize that it is reassuring – or you will trigger the other side of your audience’s ambivalence.
One very good approach is to put the good news in subordinate clauses, with the more alarmist side of the ambivalence in the main clause. “Even though we haven’t seen a new anthrax case in X days, it’s too soon to say we’re out of the woods yet.” The main clause says how seriously you are taking the situation, how aggressively you are responding to every false alarm. The subordinate clause says why this series of attacks is probably all but over; it gives people the information they need to tell you that things are getting better, even that you’re overreacting.
This isn’t a grammar lesson. What I’m calling the subordinate clause can be a separate sentence or even a separate paragraph. The point is to make sure the public has the data it needs to put the risk into a reassuring context, to decide for itself that the worst is over or the odds are favorable or whatever – but you provide the reassuring data without mounting the reassuring seat on the seesaw, without minimizing the risk or urging your audience not to worry.
3. Acknowledge uncertainty.
There’s another seesaw at work in any bioterrorism crisis: Do our government leaders (CDC, FBI, FEMA, etc.) know what they are doing? When the situation is chaotic and uncertainty is high, someone is going to point out that the experts and authorities are feeling their way. Ideally, that someone should be the experts and authorities themselves.
As a consultant, I have long noticed that clients trust me more when I say I’m not sure. (Reread the last paragraph of the “How Is Bioterrorism Different?” section.) The public will trust you more when you do the same.
Of course sometimes there is no uncertainty to acknowledge. When you are certain, say so. When you’re almost certain, say that. And when you’re feeling your way – which is inevitably going to be often – say that. (Sounding less confident than you really are is just as bad an idea as sounding overconfident. But experts don’t usually make that mistake in a crisis.) Professionals are taught to sound confident even when they are not. We imagine that this inspires trust. And when people are feeling extremely dependent (patients in a health crisis; the whole society in a public health crisis), it does. But when something goes wrong – and things always go wrong – the overconfidence backfires badly. On an individual level, doctors who share their uncertainty, who make the patient into a collaborator, who work against inflated expectations, are less likely to be on the receiving end of malpractice suits. On the societal level, leaders who acknowledge uncertainty, who come across as more humble than arrogant, are less likely to be accused of errors they didn’t make, and more likely to be forgiven for the errors they made.
Bottom line: Reserve the word “confident” for things you could bet your mortgage on – things that won’t turn out more slippery than you thought. Nine times in ten, changing “confident” to “hopeful” will improve your risk communication, help insulate you from attack, and (the paradox of the seesaw) inspire confidence in the rest of us.
Michael Osterholm does some of the best risk communication I have seen from medical sources. Here he is talking about anthrax in the October 31, 2001 New York Times: “This is a classic who, what, when, where, and why…. We are going to have to start getting used to this uncertainty in the short term, because it is going to take a while for these cases to be fully investigated.” I love that “we” – it puts us all in the same boat. Osterholm isn’t just acknowledging uncertainty. He’s facing it with us, and thus he’s helping us bear it.
You get only part credit for acknowledging uncertainty in the abstract (“we are on a fast learning curve”) – especially if it’s in the past tense (“we had to learn a lot over the preceding weeks”). For full credit, in fact, the acknowledgment needs to show your distress and acknowledge your audience’s distress: “How I wish I could give you a definite answer on that….” “It must be awful for people to hear how tentative and qualified we have to be, because there is still so much we don’t know….”
Even if unacknowledged, uncertainty provokes very little criticism of people in authority during a crisis; we are too dependent on your protection at that point to dare to question your competence. Even the media don’t want to question the leadership’s competence during a crisis. Journalists will often ask whether they are being told the truth, rarely whether you know the truth. But when the crisis is over, the bill comes due: inevitable stories about what you mishandled or nearly mishandled. (The most hostile media criticism of CDC’s handling of the 2001 anthrax attacks was probably the coverage of its uncertainty about what exposed-but-not-sick patients should do when their sixty-day course of antibiotics was over. Significantly, the story emerged weeks after the last anthrax case was found.) Track any risk crisis, from Three Mile Island in 1979 to anthrax in 2001, and you’ll find the critical assessments of crisis management begin a few days after the crisis itself abates, and build from there. Just how critical these assessments are depends only partly on performance. It depends at least as much on attitude. Your performance looks a lot better if you have been acknowledging uncertainty, apologetically and self-critically, from the outset.
But acknowledging uncertainty does not mean giving the impression of being out of your depth, indecisive, or terminally self-deprecating. We want our leaders to be confident but not overconfident in the face of uncertainty. We want them to tell us the unpleasant truth about the situation’s uncertainties and the need to act without all the facts – but we want them to bear this truth and help us bear it too. What we most value in our leaders, in fact, is the ability to function in ambiguous realities without collapsing into timidity and without arrogantly wishing away the ambiguities. If you had to choose between timidity and arrogance, arrogance is probably the lesser sin for a leader in mid-crisis. But it is also the more common sin. And you don’t have to choose. Temper the impression of overconfidence by acknowledging uncertainty. If you are typical of my clients, you are likely to sound excessively humble to yourself while you’re still sounding overconfident to the rest of us. So don’t worry too much about going too far in the direction of humility. The likelier risk is that you won’t go far enough.
Copyright © 2001 by Peter M. Sandman