This is the sixteenth in a series of risk communication columns I have been asked to write for The Synergist, the journal of the American Industrial Hygiene Association. The columns appear both in the journal and on this web site. This one was substantially abridged for publication in the April 2008 issue of The Synergist, pp. 35–37. The longer original version is here.
While this column was awaiting publication in The Synergist, I received a Guestbook question on a relevant issue I had neglected to cover. See “Responding to damaging rumors when the information is confidential.”
In colonial times newspaper “correspondents” were often just acquaintances of the publisher writing home from their travels. Unable to confirm or disconfirm their reports, cautious publishers printed them under the headline: “Important If True.” That’s how I like to think of rumors.
In your life experience, how often have rumors turned out to have at least a germ of truth? One-fifth of the time? One-third the time? More than that? Odds are you have learned neither to trust the rumor mill nor to ignore it. If the content of a particular rumor matters to you, if it’s “important if true,” you try to check it out. And if you can neither confirm nor disconfirm it, you may well take some preliminary action based on the possibility that the rumor could turn out accurate.
And yet the risk communication literature on rumors is almost exclusively about the importance of “controlling,” “correcting,” or “squelching” them. I want to talk about that too. But I want to start with the importance of investigating them.
When bird flu broke out in Vietnam in January 2004, the World Health Organization (WHO) issued an international public health alert. One result of the alert was a rash of rumors about animal and human outbreaks around the world. A March 2005 journal article entitled “Rumor Surveillance and Avian Influenza H5N1” describes how WHO’s Western Pacific Regional Office (WPRO) responded.
WPRO designated a “rumor surveillance officer” to seek out and track down bird flu rumors. Over a five-week period, 40 different rumors of bird flu outbreaks in WPRO countries were identified. Nine of them – 22.5% – turned out to be true.
As a result of its rumor surveillance efforts, WPRO was able to squelch false rumors, reducing their potential economic and psychological impact. Even more important, it was able to verify accurate rumors and mobilize international efforts to cope with outbreaks that the affected national governments weren’t yet aware of or weren’t yet willing to acknowledge. As the article’s abstract puts it, WPRO’s rumor surveillance program both “informed immediate public health action and prevented unnecessary and costly responses.”
Rumor surveillance isn’t new. The WPRO article mentions previous studies of how rumors were used as a tool of public health during the Chernobyl nuclear accident in 1986 and an outbreak of Ebola in Uganda in 2000. Others at WHO have told me that rumors played a key role during the SARS outbreaks of 2002–2003. There, too, confirming accurate rumors turned out to be even more valuable than disconfirming inaccurate ones.
Of course rumors often turn out somewhere in the middle: Many of the details are wrong, but something is happening that calls for a response. Even a wholly mistaken rumor has a kind of “truth”: It reveals important insights about how people are thinking and feeling, and what sorts of allegations they find credible.
Despite all this, far too many rumors are ignored. In the aftermath of a serious accident, it is commonplace for investigators to learn that there were prior rumors of possible problems that officials shrugged off. Of course this is a biased sample; nobody investigates the cases where rumors are shrugged off and then nothing happens. I’m not claiming that rumors are always true, or even that WPRO’s rumor accuracy rate of 22.5% is universal. Maybe rumors of health, safety, and environmental problems where you work are right only five or ten percent of the time. I’ll bet they’re right, or partly right, often enough to be worth taking seriously.
Good risk managers use rumors as a valuable early warning system. Like WPRO, they maintain rumor surveillance programs to identify rumors quickly, and then they investigate.
Rumors on the Web
Rumor investigation is hard work – but rumor surveillance has never been easier. The most localized rumors still spread via word-of-mouth; to detect them you need your ear to the ground. Everything else is on the Web.
Before consulting with a new client, I typically spend an hour or two in cyberspace looking for dirt. Obviously I can’t trust what I find; I have to remember that most of it probably isn’t true. But it’s far easier to find than it used to be. Which means, of course, that it’s also far easier to spread than it used to be.
The Web has changed at least two things about rumors. First, fringe sources can now reach everybody worldwide just like establishment sources – and can even become establishment sources (witness the rise of Matt Drudge). Access has been democratized. And second, fringe sources now look a lot like establishment sources. It used to be easy to distinguish the stuff that came from reputable publishers from the stuff that was mimeographed in somebody’s garage. Not any more. Professionalism of appearance has been democratized.
When I taught college in the 1970s and 1980s, my students’ main research problem (and mine) was getting information. Now the main problem is vetting information.
Think again about WPRO’s bird flu rumor surveillance. If bird flu ever mutates so it can spread easily from human to human and cause a pandemic, I absolutely guarantee that the first people to tell you the pandemic has begun will not work for the World Health Organization or the U.S. Centers for Disease Control and Prevention. The WHO and the CDC will quickly pick up the pandemic rumors, both from the Web and from their other rumor surveillance sources. Then they’ll spend days, weeks, or months investigating the rumors and debating their significance. Meanwhile, everyone else who’s seriously interested in pandemic preparedness will be hearing the same rumors, mostly on the Web. Medical journalists, local government officials, corporate risk managers, industrial hygienists, and ordinary citizens will all have to decide which action steps they choose to take before the WHO and the CDC certify the pandemic, and which steps they choose to take only after official validation of the pandemic rumors.
The previous paragraph is written in the future tense. But there have been plenty of pandemic rumors already. A number of websites have announced the start of the next influenza pandemic several times and counting. So far they’ve been wrong. But eventually they will be right. They won’t be believed for a while, both because of their lack of official standing and because of their habit of crying wolf. But they will have been first.
Similarly, the information that dimethylmeatloaf is a human carcinogen will be an unconfirmed rumor on the Internet long before the International Agency for Research on Cancer reclassifies it. Odds are there’s already an unconfirmed rumor on the Internet about the carcinogenicity of dimethylmeatloaf.
It’s a fine art to distinguish a rumor that is almost certainly false and probably not worth investigating from a rumor that has the ring of truth and is certainly worth investigating (and maybe worth taking precautionary action before you know if it’s true or not). Vetting information – distinguishing among rumors – is fast becoming a core skill for industrial hygienists and other risk managers.
In sum, more and more professionally important risk information is available to you first in the form of rumors on the Internet. Rumor surveillance is thus part of your job. And once a risk rumor has come to your attention (whether you heard it on the Internet or on the shop floor), you face decisions about whether to dismiss it or take it seriously, whether to investigate it or wait for others to investigate it, and whether to take any action in the meantime.
Responding to rumors
And of course you face another decision about every risk rumor you hear: whether to talk about it.
Everyone agrees in principle that accurate rumors should be confirmed. Of course each day brings new examples of organizations that have chosen to stonewall instead, either by ignoring a rumor they know to be true or by actually denying it (a more proactive sort of dishonesty). But nobody argues that that’s the right thing to do.
But suppose the rumor is false. Not just mistaken about some of the details, but down-to-the-bottom false. Not just unconfirmed and you hope it’ll stay that way, but demonstrably flat-out false.
Here the conventional advice has improved. Not too long ago, communication professionals often counseled their clients to ignore false rumors. “Don’t dignify them with a response.” This profoundly unwise recommendation has slowly given way to the far sounder principle that false rumors should be acknowledged and rebutted. The old advice still surfaces from time to time. The Agency for Toxic Substances and Disease Registry, for example, has a “Primer on Health Risk Communication” that unwisely suggests: “Don’t respond to rumor.” By contrast, the “Public Health Communications” section of the U.S. Pandemic Influenza Plan tells officials to “monitor news media reports and public inquiries to identify emerging issues, rumors, and misperceptions and respond accordingly.” (I’m assuming “respond accordingly” is meant to mean address the rumors you find, not ignore them.)
A good response to a false rumor has six components:
- Repeat the rumor you’re rebutting. If people are hearing X from their friends and coworkers, you can’t just say Y instead, as if you hadn’t even heard the rumor. You need to start by saying, “Yes, I heard X too.”
- Be empathic toward those who believe the rumor. You don’t want to validate that X is true, since your point is that it’s false – but it helps to validate that people aren’t stupid to think X might be true.
- Demonstrate that you have taken the rumor seriously. Many people expect official sources to deny damaging rumors whether they’re true or not. So if you want your denials to be credible, show that they’re not knee-jerk. “Here’s what I did to look into the rumor…. And here’s what I learned….”
- Give evidence that the rumor is false. Your evidence may be quantitative data. It may be quotations from credible third parties. It may be anecdotal. Ideally, it will be all of the above. Don’t expect people to take your word for it.
- Discuss all evidence that the rumor is true. Assume people have heard or will hear the other side’s most persuasive arguments. They will loom all the larger if you haven’t mentioned them. If the evidence is 95% on your side, don’t claim it’s 100% on your side. Talk about the discrepant 5%.
- Promise to stay alert. Good science is always tentative, and so is good risk communication. “Even though there are no signs of X so far, I am keeping an open mind. If the situation changes, here’s how I’ll know…. And if that happens, I will announce it immediately.”
While the experts agree now that a widespread false rumor should be rebutted, not ignored, it’s a tougher call if the “rumor” isn’t widespread. A 2002 U.S. government manual on “Communicating in a Crisis,” for example, advises: “If a damaging rumor is confined to a small audience, correct it within that group, [but] don’t create a major public event.” I agree that if a false rumor isn’t spreading, the harm done by circulating it to a wider audience may outweigh the benefit of rebutting it. But rumors tend to spread; in this Internet age, they spread all the more quickly. And wishful thinking makes us likelier to think a rumor is confined when it’s actually spreading than vice-versa. Unless you’re pretty sure a false rumor isn’t going anywhere, it’s safer to address it than to assume it’ll stay confined.
When a rumor has elements of truth and elements of falsehood, you need to make the distinction. Be sure to pay at least as much attention to the truthful elements as to the false ones. It’s often tempting to rebut the details the rumor got wrong, while neglecting to acknowledge the pieces that were right. You’ll get further if you start with what’s true. And if most of it’s true, your corrections of the inaccurate bits should feel like footnotes, not rebuttals.
Suppose you don’t know whether a rumor is true or false. If you’ll know in a few minutes or even a few hours, it may make sense to keep mum while you investigate. But if it’s going to take days or weeks or months to get to the bottom of the story, don’t wait. Talk now. “I’ve heard that rumor too, and I’m trying to track it down. I don’t have a lot to report yet. Here’s what I’ve learned so far.... Here’s what I’m doing to learn more.... In the meantime, here’s what I think we should do....”
Of course your inability to confirm or disconfirm the rumor will make people uncomfortable, just as it makes you uncomfortable. By commenting on a rumor you can’t confirm or disconfirm, you are in effect spreading the rumor. But the alternatives are worse. Denying a rumor that may turn out accurate in the end is disastrous. So is confirming a rumor that may turn out mistaken. And refusing to comment leaves it to others to interpret not just the rumor but also the meaning of your silence. Does it mean you haven’t even heard the rumor? Does it mean you know the rumor is true but don’t dare admit it?
Most important, refusing to talk about an unconfirmed rumor leaves you out of the discussion of how to respond. Rumors about risk typically require such a discussion. Are there precautions we should be taking now, even though we’re not sure the risk is real, or should we wait till we have better data? If a rumor is circulating that there’s a fire in the building, the right response is probably to evacuate the building and call the fire department. If a rumor is circulating that a bunch of people in the building have all been diagnosed with the same rare cancer, the right response is probably to look into the rumor before taking any precautionary action. In both cases you want to be part of the discussion about how to respond. To join the discussion, you will first need to acknowledge the rumor.
If you haven’t a clue whether a rumor is true or false, that’s about all you can say. But if you believe it’s probably one or the other, you can say that too. And I think you should – being careful to make sure everyone hears that you’re not sure yet, that you’re weighing uncertain evidence, that you may turn out wrong in the end. This is controversial advice. Many communication professionals think it’s unwise to speculate. I think it’s impossible to talk coherently about risk without speculating – that is, without expressing an opinion about the probability of bad things happening. When you’re talking about a risk rumor, there’s a huge difference between “this hasn’t been proved conclusively yet, but the available evidence says it’s probably true” and “this hasn’t been disproved conclusively yet, but the available evidence says it’s probably false.” The appropriate level of anxiety is different, and so is the appropriate precautionary response. A good risk communicator, in my judgment, helps people appreciate this distinction.
The classic research on rumors was a 1947 book by Gordon W. Allport and Leo Postman, entitled The Psychology of Rumor. Allport and Postman called their most far-reaching assertion “the basic law of rumor.” It declared that the extent of rumor-spreading is the product of two factors: the ambiguity of the evidence times the importance of the subject to the individuals concerned. We circulate rumors about feared consequences (as opposed to wished-for consequences) when we’re uncertain because the situation is ambiguous, and anxious because it’s important.
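Allport and Postman’s “basic law” is usually summarized as a simple multiplicative relationship (the symbols below follow common renderings of their formulation; the ≈ sign is a reminder that this is a qualitative law, not a precise equation):

```latex
R \approx i \times a
```

where R is the extent of rumor circulation, i is the importance of the subject to the individuals concerned, and a is the ambiguity of the evidence. The multiplicative form is the point: if either factor is near zero – the subject doesn’t matter to people, or the facts are unambiguous – the rumor barely spreads at all.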
It follows that the best way to prevent people from spreading rumors is to spread information instead. Information is the antidote to rumor.
What if you don’t have any information? You always have at least two pieces of information to give people about a rumor: the information that there is a rumor circulating that says X, and the information that the best available evidence so far about X says ... whatever it says. And there’s a third piece of information that people already have, grounded in the history of your management of prior rumors: the information that when you know a rumor is true you say so, when you know it’s false you say so, when you’re not sure you say so, and when you haven’t a clue you say so. Those three pieces of information can go a long way toward reducing uncertainty and anxiety – and thus toward preventing rumors.
This is the crucial paradox of rumor management. Regardless of whether a rumor is true, false, a mix of the two or impossible to judge, when you talk about it honestly and transparently the rest of us feel less compulsion to talk about it irresponsibly.
Copyright © 2008 by Peter M. Sandman