Posted: December 29, 2001
Article Summary: Accustomed to naturally occurring diseases, the U.S. Centers for Disease Control and Prevention (CDC) had a difficult time coping with the anthrax bioattacks of late 2001. Since risk communication was one of its core problems, it asked me to come to Atlanta and help. This four-part “column” grew out of my Atlanta notes. If the CDC was adjusting to bioterrorism, so was I. I put aside my usual outrage management recommendations and developed 26 recommendations specifically on the anthrax crisis. These became the basis for my (sadly) expanding work in crisis communication, and for the crisis communication CD-ROM and DVD Jody Lanard and I ultimately put out in 2004. This column was my first extended discussion of most of these crisis communication recommendations, and it is my only published assessment of the CDC’s anthrax communication efforts.

Anthrax, Bioterrorism, and Risk Communication: Guidelines for Action

(Page 3 of 4)

10. Surface the underlying fear of future bioterrorism.

Virtually every bioterrorism expert I have listened to eventually gets around to saying that the anthrax attacks of late 2001 were a pilot project – that they have learned a lot, that they realize the bad guys have also learned a lot, that they expect another and much tougher bioterrorism challenge sooner or later. Virtually every nonexpert I have listened to says the same thing.

And yet reassurances about the 2001 attacks very seldom included this perspective. And so they sounded hollow. (Ironically, the growing belief that the source of the anthrax was domestic and individual made it harder for people to come to terms with the organized international bioterrorism they fear. It drove the suppressed fear deeper.) Even the word “outbreak” is part of the problem, as is any word that tries to keep the threat of a large-scale, organized bioterrorist attack out of the room. I would use the word “bioterrorism” as often as possible.

Understand that one of the main reasons many people overreacted to the 2001 attacks was that overreacting was a way of binding anxiety about much more serious potential attacks in the future. (People overreacted, but notice: They didn’t panic.) There were three possible responses to the attacks: (a) Pretending that they were no big deal – a common reaction, symptomatic of denial. (b) Recognizing them as a precursor of a terrifying possible future – that, I think, is realism. (c) Treating them as super-serious in their own right – the middle course, a compromise between realism and denial.

A small bioterrorism crisis is thus a harbinger of a big bioterrorism crisis. People see it that way. They are right to see it that way. But they shy away from their own clear vision, retreating either into denial or into a halfway position that overreacts to the small crisis and ignores the implicit threat of a bigger one. Responding agencies like CDC should do what they can to bring the underlying concern to the surface. For example, when reporters ask whether you have evidence of a widening outbreak in other cities, that is a chance to say, “No, but we keep looking. It is everyone’s nightmare that what we have seen so far could be the pilot project, the tip of the iceberg. If there is worse to come, we need to catch it fast. Here’s what we’re doing to make sure we do catch it fast….”

More generally, it is good advice to try to be sensitive to the underlying concern behind any question. Don’t just answer the question; address the concern. The concern that CDC is overtaxed may hide behind a question about how many people the agency has assessing samples. The concern about a possible cover-up may hide behind a question about why a result hasn’t been released yet by local authorities. Of course, it isn’t always easy to figure out what the underlying concern is. And it isn’t always useful to ask; journalists generally won’t tell you, and people in denial can’t tell you. The main thing is to ask yourself. Especially when questions keep coming up that strike you as unimportant or off-the-wall, look for an unspoken common denominator that would give them meaning. Then try answering the unasked question. If you guessed right, tensions should ease; maybe the underlying issue can be addressed directly now, or maybe it has been addressed enough and the questioning will move on. And if you guessed wrong, the questioner will simply shrug off your irrelevant comment.

But in the midst of an objectively small bioterrorism crisis, the key underlying issue – the elephant in the room – is bound to be the possibility of a big one. As the experts plan for a future where bioterrorism risks are ever-present, they need a public that is planning for the same future. They need to say that that is, indeed, the future they are planning for.

11. Be gentle about that awful future.

I have to qualify the previous recommendation. It is true that we need to get the underlying fear of future bioterrorism to the surface. This is one of the “other shoes dropping” that people are waiting for; when the experts say so, people will breathe a sigh of almost-relief, like having a bad diagnosis confirmed.

But just as when you’re confirming a bad diagnosis, you need to give people this bad news gently – clearly and crisply, unmistakably, but still gently … gently enough that you won’t drive people further into denial. Those who are ready to hear it, already thinking it at or near the surface, will hear it; those who can’t, who daren’t, won’t. Don’t force them to.

Elvin Semrad, a famous psychotherapist of the mid-20th century, defined therapy as a three-step process. First you help the patient recognize the source of the pain; then you help the patient bear it; then you help the patient put it into perspective. There’s actually a prior first step that Semrad didn’t usually bother to talk about: Before a therapist helps the patient recognize the source of the pain, the therapist diagnoses it – and doesn’t blurt it out, but rather strategizes how to guide the patient to it. The order of these steps is crucial. It is cruel to force the patient to recognize awful things if the patient cannot yet bear them, or if you are not ready to help the patient bear them. And it is fruitless to try to put them into perspective before the patient has recognized them and learned to bear them.

I realize that bioterrorism experts are not therapists; I’m not a therapist either. But communicating to people about bioterrorism requires a therapeutic perspective. Mourn with us that we now live in a world where bioterrorism is real, that the 2001 anthrax attacks were genuinely terrible despite the small number of cases. Help us recognize that the 2001 attacks may be a precursor to much worse attacks. Help us bear that awful reality. And then give us some perspective – what you can do to prevent such future attacks; what you can do (and what we can do) to prepare for them and cope with them.

One of many strengths of President Bush’s Atlanta speech was the frequent reminders that we can bear it: “…endured the shock … endured the sadness … faced the unprecedented bioterrorist attack….”

12. Surface and legitimate the misery.

One of the principal reactions to September 11 was a sense of shared misery. As I write more than three months later, this continues to be true. I think some people who say they are frightened are actually more miserable than frightened; so are some people who say they are relatively unaffected. Americans are spending more time with their families; eating more comfort foods; experiencing an approach/avoidance relationship with the news (afraid to turn it on, afraid not to). Survey research commissioned by CDC shows that most people think they’re relatively safe from bioterrorism, at least for now, even though they do expect future attacks. In other words, people expect to survive a series of bioterrorist attacks. They expect to have to watch them on CNN, and they want to stay near their loved ones so they can help each other get through those attacks.

Misery is survivable; we’re surviving it. But I think it is going undiagnosed and untreated. Telling a miserable person to calm down misses the point; we’re calm already. Telling a miserable person that the odds of survival are good even in a worst-case scenario also misses the point; we don’t expect to die – we expect to have to live through the deaths of others. Telling a miserable person to get on with his or her life is similarly off-base; we are getting on with our lives, but we’re carrying a dead weight of misery. We’re not fine.

The first step in addressing and ameliorating the misery, I think, is recognizing, acknowledging, and legitimating it – that is, sharing it. But sharing it doesn’t mean wallowing in it, or falling apart because of it. Share the misery calmly, and model that it can be borne. When you share our misery, you earn the right to help shape it … which you cannot do unless you visibly do share it.

The best example here is New York Mayor Rudy Giuliani, asked about the number of casualties just hours after the World Trade Center attacks. “More than we can bear,” he said – but he was bearing it. Giuliani’s impact in the days that followed resulted not just from his calm and his competence and his compassion, but from the fact that these traits were accompanied by his readily detectable pain, his misery.

13. Express wishes.

“I wish we knew more.” “I wish our answers were more definitive.” “I wish we could tell people one way or another that so many anthrax spores is or isn’t a potential risk.” “I wish we had made bioterrorism preparedness a higher priority before September 11.” “I wish we had thought about whether anthrax spores could escape through envelopes.” “I wish we were more skillful at coordinating with law enforcement agencies.” Above all: “I wish we were back in the pre-9/11 world.”

Just as good as “I wish….”: “If only….”

There are several benefits to expressing wishes. First, it humanizes you – you have wishes! Technical people may confuse sounding professional with sounding unconcerned and inhuman; at the extremes, an expert can sound enthusiastic and upbeat about what for everyone else is tragic – but the more common error is to sound matter-of-fact. Trust and learning are higher when you reveal your humanity.

Second, expressing wishes is a way to acknowledge your failings gently but still self-critically. When you say you wish you had done X, you aren’t quite defining your failure to do X as culpable, requiring contrition. But you are regretting the failure.

Third and most important, you can express your audience’s wishes. If your audience is aware of the wish – the wish that your answers were more definitive, for example — this shows that you share it, that you understand the downside of the news you are imparting. (Don’t say it unless it’s so. But don’t neglect to say it if it is so, just because it sounds too human.) And if your audience is in some kind of denial about the wish — the wish to be back in a country that felt itself immune to terrorism, for example — then you can help reduce the denial by voicing the wish yourself. Whenever denial is an issue, the “if only….” or “I wish….” formulation enables you to put the audience’s underlying wishes on the table. It makes them psychologically accessible without accusing the audience of them. “You wish….” is too direct, too likely to be denied and thus to deepen the denial.

I’m going back to some well-established recommendations now.

14. Stop trying to allay panic.

The desire to prevent or allay panic has justified most of the errors that have been made in communicating about bioterrorism – the over-reassurance; the reluctance to share dilemmas and to acknowledge uncertainty and prior error; the failure to address the long-term fears and current misery.

Panic is much less common than we imagine. The literature on disaster communication is replete with unfulfilled expectations of panicking publics. It turns out that people nearly always behave extremely well in crisis. Recall how people behaved in lower Manhattan the morning of September 11! In the face of awful events, people become simultaneously resourceful and responsive. If told what to do by those in authority, we tend to do it; if no one is in authority, we figure it out for ourselves. When the crisis is over, we may feel anxiety, fear, even delayed “panic attacks” now that the need to stay calm has passed. During the crisis, people may well take unpanicky self-protective actions that strike them as rational and appropriate, even if the authorities recommend otherwise; securing a personal stockpile of antibiotics, for example, is hardly a sign of panic – it’s a sign of hedging. Panic, in short, is rare.

The condition most conducive to panic, moreover, isn’t grim news. People are likeliest to panic (though still not all that likely) when a dire outcome seems highly probable but not absolutely certain, and they cannot tell what to do to optimize their chances of survival. When we feel the authorities are telling us the truth and it is clear what we should do, panic is unlikely … even if the truth is very bad and the optimal action isn’t very likely to work. But when we feel the authorities are giving us double-messages – we sense the risk is dire but the experts say it’s not; we can think of protective actions but the experts say not to do them – then panic becomes more likely. When authorities start hiding bad news in order to prevent panic, they are likely to exacerbate the risk of panic in the process.

The media compound the mistake. Reporters wait until after a crisis to tell people how close they came to disaster, and whose fault it was. In mid-crisis the media are surprisingly (even dangerously) committed to the same goal and the same misunderstanding as the authorities: They try to prevent panic by suppressing bad news.

One interesting though comparatively trivial example from the 2001 anthrax story: Virtually no mainstream media used the readily available photos of cutaneous anthrax. (I did see one TV clip with a closeup of a skin lesion.) Arguably, it would have been useful for the public to know what cutaneous anthrax looks like. But desperate as they were for good art to accompany the anthrax story, editors nonetheless decided that the photos were too gross, too likely to frighten readers and viewers.

15. Protect your credibility – and reduce the chances of panic – with candor.

A lot of what I have been saying adds up to a strategy of what might be called “radical candor.” This is not just a matter of not lying. It means telling the whole truth – or as close to the whole truth as security considerations permit.

It also means avoiding statements that are technically accurate but intentionally misleading. Consider this example from the Three Mile Island nuclear accident. In the midst of the crisis, when any number of things were going wrong, the utility put out a news release claiming that the plant was “cooling according to design.” Months later I asked the PR director how he could justify such a statement. He explained that nuclear power plants are designed to cool even when serious mistakes have been made. Despite his company’s mistakes, therefore, the plant was indeed cooling according to design. Needless to say, his argument that he hadn’t actually lied did not keep his misleading statement from irreparably damaging the company’s credibility. The more recent memory of “what the meaning of ‘is’ is” should help drive this lesson home.

Let me give an example from the 2001 anthrax attacks. (Although the principle is well-established that only radical candor protects credibility in a crisis, the example I am about to give, personal stockpiling of antibiotics, is medically very controversial. I can’t resist making my case here, but even if you reject the example, please consider the principle.) Many at CDC and elsewhere devoted a lot of energy to telling people that there was no reason to pursue a personal stockpile of antibiotics, indeed that stockpiling antibiotics was medically contraindicated. Now, I do understand why taking antibiotics before they are needed is medically contraindicated (in terms of both side-effects and antibiotic resistance). But I know of only four reasons for not wanting people to stockpile, none of which is that stockpiling is foolish or unnecessary.

  • Concern about a possible shortage. This was a sound argument for a while with respect to Cipro – but not, I think, the other antibiotics involved. There was always plenty of doxycycline.
  • Fear of frightening people by acknowledging that there are scenarios where it would be necessary to put millions of Americans on antibiotics. I’ve said enough already about why that’s not a solid argument.
  • The problem that any particular attack might involve an agent resistant to the particular antibiotic someone happened to have stockpiled. This is true – but the fact that some preparations might not work is hardly a reason to recommend not preparing. People who stockpiled the “wrong” antibiotic could always be told they need to line up for another.
  • Worry that people might not have the will power to refrain from taking an antibiotic once they had it in hand.

The fourth reason, I think, is the only one that holds water. If this is why the experts opposed consumer stockpiling, they should have said so: They didn’t trust us to have the medication around without using it when we shouldn’t. Fair enough. Maybe the risk of people taking these drugs unnecessarily was greater than the benefit of people having them available if needed. But what about those of us who felt capable of filling the prescription without popping the pills? Claiming that it made no sense for us to prepare ourselves for a potential urgent need itself made no sense. Our doctors have told us to carry various just-in-case medications when we travel overseas; our children’s pediatricians have ordered us to keep some ipecac on hand….

The experts also needed to be more honest about the public’s reasons for wanting to stockpile: We don’t trust the government to handle smoothly an unprecedented distribution of emergency medication under crisis conditions. And of course the question of whether the government would be able to cope smoothly with an urgent need to distribute millions of doses — perhaps in the middle of a blizzard, perhaps in the middle of a riot – is not a medical question. My neighbors’ opinions are as valid as those of CDC’s bioterrorism specialists. In short, people who wanted the medicine on hand so they wouldn’t have to wait for it or fight for it or line up for it if the worst happened weren’t being foolish. Telling them they were rang profoundly untrue, and cost significant credibility.

In assessing this complex issue, it isn’t relevant that most people obeyed the experts’ advice not to build their own stockpiles. Some obeyed because they weren’t especially worried; some because they had been terrified into denial; some because in crisis conditions people tend to do what they’re told. Nor is it relevant that mass administration of antibiotics has so far proved unnecessary. What matters most, I think, is that millions of Americans listened as their government told them that they should not take a precaution that seemed, to many, self-evidently sensible. This sounded a lot like that hollow refrain, “everything is under control.” It exacerbated people’s concern and diminished their trust in the authorities.

From a risk communication viewpoint, antibiotic stockpiling looks like a reasonable precaution. Taking the precaution helps manage the concern. I could envision a CDC program that encouraged people to have appropriate emergency antibiotics on hand. To drive home the importance of not taking them unnecessarily, I could imagine offering a partial refund for the return or exchange of unused antibiotics when they pass their expiration date.

A postscript to the stockpiling issue: More than twenty years ago, during and after the Three Mile Island crisis, I watched a debate over whether people living near nuclear power plants should have potassium iodide on hand in case of an accident. (Taken promptly, potassium iodide can flood the thyroid with stable iodine, preventing the uptake of radioactive I-131.) The opponents won, voicing pretty much the same concerns as the ones that prevailed during the anthrax attacks. Only in the wake of September 11 did the government announce that it was reconsidering, and might distribute potassium iodide to neighbors of nuclear facilities. The low credibility of the nuclear establishment in recent decades certainly can’t be laid at the door of the potassium iodide mini-issue – but the issue was symptomatic of an absence of candor that seriously undermined the nuclear industry.

A related issue, smallpox vaccination, poses similar questions of candor. The issue hasn’t come to a head yet because the vaccine isn’t available in sufficient quantity. But the government plans to buy enough doses for everyone in the country, then keep them available for use in a smallpox attack. Individuals who wish to be immunized in advance will be told they may not, because the known side-effects of the vaccine constitute too great a risk unless there is an actual attack. The wisdom of mass smallpox vaccination versus waiting for an attack is debatable, as is the question of whether people should be entitled to make their own risk judgment. Where candor is most at stake is in how the side-effects hazard is described. The vaccine is most hazardous to the young, the old, and the immunocompromised. The risk to healthy adults is much lower. Governmental opponents of mass vaccination have tended to cite the data for the population at large. This isn’t a lie, but it does obscure the much lower risk to a healthy individual who wants an inoculation. It also ignores the fact that vaccinating only the healthy (with relatively few side-effects problems) would give considerable protection to the unvaccinated as well; it would be hard to start an epidemic if 90% of the population had been vaccinated, and terrorists would therefore be unlikely to try.

I don’t want to leave the impression that I think CDC and other government agencies managing the 2001 anthrax attacks were especially dishonest in what they said to the public. To the best of my knowledge, they were comparatively candid. I am critical of the way antibiotic stockpiling was handled – but it’s hardly a smoking gun. And what public opinion data I have seen suggest that CDC in particular remains a highly trusted source of information. The agency may need all the credibility it can muster in the months and years ahead. It is important not to spend it unnecessarily, to marshal it – stockpile it, if you will – for when it may be needed. Toward this end, “radical candor” is a tough but useful standard.

16. Err on the alarming side.

If you always knew all there was to know, then being candid and avoiding over-reassurance would add up to telling the whole truth. But often you don’t know the whole truth. In particular, you are asked to judge the seriousness of risks whose seriousness is not yet established. You’re going to be wrong sometimes.

You have three options: (1) Err on the reassuring side, on the grounds that you shouldn’t scare people when you’re not sure. (2) Make your best guess, as likely to be too alarming as too reassuring. (3) Err on the alarming side.

I’ve said enough already about why the first choice is a mistake. But I haven’t yet explained why the third is better than the second. (Risk managers and medical practitioners already know it’s better public policy to be conservative; they may not know it’s also better risk communication to sound conservative.) The reason is very straightforward. In a high-outrage situation, having overestimated the seriousness of a risk is a fairly minor problem; it isn’t cost-free, but its cost is low. Having underestimated the seriousness of the risk is devastating. The first time a source has to announce that “it’s worse than we thought,” much credibility is lost; the second time, it’s all lost. Having to say it’s not as bad as you thought is much more survivable.

Even I wouldn’t go so far as to argue that you should proclaim your worst-case scenario as if it were your likeliest scenario. But you definitely should present it, and be clear that you haven’t yet ruled it out. (I gave an example of this earlier, when I discussed EPA’s Risk Management Program.) And you should make sure that less extreme but still worse-than-expected outcomes are presented as likely enough that you don’t look like a liar if they materialize. I understand that this can raise serious political and even medical problems. But the only way not to be caught (in hindsight) over-reassuring the public is to be willing to be seen (in hindsight) as having been excessively cautious.

For this point I can use Three Mile Island and Iodine-131 as a good example instead of a bad example. During the crisis, the Pennsylvania Department of Health was worried that radioactive I-131 would escape from the nuclear plant, be deposited on the grass, get eaten by dairy cattle, and end up in local milk. Over a two-week period health officials issued several warnings urging people not to drink the milk. Meanwhile, they kept doing assays of the milk without finding any I-131. Their announcements moved slowly from “there will probably be I-131 in the milk” to “there may be I-131 in the milk” to “there doesn’t seem to be I-131 in the milk, but let us do one more round of testing just to be sure.” By the time the Health Department declared the milk safe to drink, virtually everyone believed it.

During the anthrax crisis, a Department of Health and Human Services spokesman expressed precisely the opposite view in an October 28 New York Times interview. Defending DHHS statements that turned out to be over-optimistic, he said: “Something that’s factual at this moment proves not to be factual in retrospect. That doesn’t mean it wasn’t factual at the time.” (In other words, when you represent your early guesses as facts, time will sometimes prove you wrong.) He went on: “You certainly don’t want to err on the side of alarm. You tend to be conservative, on the side of reassuring, and I think that’s defensible.” Leave aside his misunderstanding of what “conservative” means. His view, widely shared among first-time risk communicators, is that the proper message is somewhere between your best guess and your most optimistic guess. Experienced risk communicators recommend somewhere between your best guess and your most pessimistic guess instead.

17. Be willing to answer what-if questions.

Why not just decline to comment until you really know the answer? Because you can’t. Like the decision-makers, the public demands information in a crisis, even if the information is uncertain. If we don’t get it from you, we will get it from someone else – from some anonymous pseudo-expert on the Internet, perhaps.

Risk communication has gone through three distinct stages on the tough issue of how to handle what-if questions, and other questions that don’t yet have answers: To speculate or not to speculate. Early on, the custom was to give as optimistic an answer as the known facts could justify. Unless you were confident the news was bad, you felt free to imply it would probably be good. This destroyed many an organization’s credibility, and so risk communicators started advising their clients not to speculate at all. No speculation is an improvement over optimistic speculation. It is probably the right response if better answers will be available soon. But in the midst of a crisis you can’t just tell people you’ll have nothing to say until the i’s are dotted and the t’s are crossed and you’re ready to issue your technical report sometime next summer.

Learn to say something like this: “We think it’s either X or Y or Z. We’re not sure; there are other possibilities being considered, and it may turn out to be something we haven’t considered at all. But the smart money is betting on X or Y or Z. We hope it’s X – X is fairly easy to control and tends not to have many long-lasting effects. We’re praying it’s not Y – if it is, we’re in for a long, hard battle. And Z is somewhere in the middle. Here’s what we’re doing to learn more, and when we expect to have some better answers.…”

Copyright © 2001 by Peter M. Sandman
