Posted: November 11, 2004
Article Summary: This column is in two parts. Part One lists some basic tips for overcoming the universal temptation to sound overconfident; it’s a primer on how to sound uncertain instead. Part Two goes into detail on the toughest part of acknowledging uncertainty: deciding just how uncertain you ought to sound, and then coming up with words (or numbers) that capture the right level of uncertainty. It assesses five biases that tend to distort our judgments about how uncertain to sound, even after we have accepted the principle that we should acknowledge our uncertainty. Compare “I can’t guarantee that it’s safe” with “I don’t know if it’s safe.” Both acknowledge uncertainty – but very different levels of uncertainty. Which of the two is likelier to get said when the other would have been closer to the truth?

Acknowledging Uncertainty

This is the sixth in a series of risk communication columns I have been asked to write for The Synergist, the journal of the American Industrial Hygiene Association. The columns appear both in the journal and on this web site. Only the first third or so of this column appears (more or less identical except for copyediting details) in the November 2004 issue of The Synergist, pp. 21–22, 41. The section on “Being Precise about Uncertainty” wasn't published in The Synergist.

Most people hate uncertainty. They’d much rather you told them confidently and firmly that A is dangerous or B is safe, C is going to happen or D isn’t, E is a wise precaution or F is a foolish one.

Of course anything you tell them confidently and firmly that turns out false generates outrage (fear, anger, or both). With some justice, people will feel you misled them. The result will certainly damage your credibility, and thus your ability to manage this situation and future situations. Perhaps worse, people’s mistrust of you will rub off onto their attitude toward the risk itself.

But that doesn’t keep them from pressuring you beforehand to sound more certain than you are. In fact, the pressure comes from all sides: affected industries, politicians, regulators, your employer, your peers, your audience, and yourself. Everyone wishes you knew more, and everyone (consciously or unconsciously) pushes you to pretend you do. It isn’t easy to hold onto your uncertainty, to insist you’re not sure.

To some extent, people’s response to uncertainty depends on their attitude toward the hazard, and on yours. Outraged people get more outraged when the experts’ reassurances come with uncertainty attached. “How dare you make me an unwitting subject in your risky experiment!” Apathetic people, on the other hand, get more apathetic when warnings are admittedly uncertain. “If even the experts aren’t sure it’s dangerous, why should I wear that uncomfortable respirator?”

Similarly, the pressure is usually strongest to make your reassurances more confident, to say “everything will be fine” when you really ought to be saying “we hope we get through this okay.” If your message is a warning, tentativeness is more tolerable. But not always. Consider the pressure on the intelligence community to make its WMD warnings sound more certain, to justify going to war. Or consider the pressure on industrial hygienists to make safety warnings sound more certain, so employees won’t have an excuse not to take the recommended precautions.

Finally, the cost of confident warnings that turn out false is lower than the cost of confident reassurances that turn out false. Not that it’s good to be seen as The Boy Who Cried Wolf, whose prophecies of doom are no longer credible. But it’s better than being seen as the management toady who hid serious risks and let people get hurt.

These asymmetries aside, the bottom line is the same for warnings and reassurances, for apathetic audiences and outraged ones. You will be tempted to suppress your uncertainty and sound confident. And things will turn out better if you resist the temptation. Overconfidence rings false, undermining everyone else’s confidence even if you turn out right. It provokes acrimony, making those who disagree much more contentious than they would be if you sounded less cocksure. And it devastates your credibility and your ability to lead if you turn out wrong.

Tips on Sounding Uncertain

1.  Ride the risk communication seesaw. Someone is going to point out the uncertainties. Ideally that someone should be you, leaving us free to occupy the more confident seat on the seesaw. Acknowledge uncertainty up-front before you are confronted with it.

2.  Try to replicate in your audience your own level of uncertainty. Tell people what you know for sure, what you think is almost but not quite certain, what you think is probable, what you think is a toss-up, what you think is possible but unlikely, and what you think is almost inconceivable. Put bounds on the uncertainty: What range of possibilities is credible? Clarify that you are more certain about some things than others.

3.  Avoid explicit claims of confidence. Reserve the word “confident” for things you’d bet your mortgage on. “Hopeful” is a better word for desirable outcomes that are likely but not certain. And don’t imagine that hedge words let you off the hook. “At this point in time we are 100% sure the emissions are safe to breathe” is still unacceptably overconfident, despite the introductory phrase. Where appropriate, point out explicitly that you are not confident. Even better, predict that some of what you now think is true will probably turn out wrong. This is the best way to keep changing circumstances from being seen as earlier mistakes – or, worse yet, earlier lies.

4.  Convert expert disagreement into garden-variety uncertainty. When experts disagree about a risk, the rest of us get very nervous. But faking an expert consensus that isn’t there is sure to backfire. Your best bet is to report everybody’s risk estimates, even those of your critics: “The company says X; the regulators say Y; the activists say Z.” By framing the risk in terms of the range of expert opinions, you avoid the risk communication worst case scenario: equally confident experts with antithetical judgments going head-to-head.

5.  Make your content more tentative than your tone. Confidently telling us you could well be wrong inspires trust even as it alerts us to the genuine uncertainties of the situation. The reverse combination, claiming to be sure in a tone that sounds very unsure, is disastrous.

6.  Show your distress at having to be tentative – and acknowledge ours. You wish you could be sure but you know you can’t; despite the uncertainties, you are able to make necessary decisions and recommendations. This models the reaction you want us to have. Show you are aware that the uncertainty distresses us as well as you and show you think we can bear it too.

7.  Explain what you have done or are doing to reduce the uncertainty. Don’t perpetuate uncertainty; if there are ways to answer the question that you should be pursuing, say so, and pursue them. But if the remaining uncertainty is very small or very difficult to reduce further, say that. Don’t over-promise.

8.  Don’t equate uncertainty with safety – or with danger. Never say “there is no evidence of X” when you haven’t done the study that tests the possibility; never imply that the absence of definitive proof that something is dangerous proves it’s safe; never imagine that risks are innocent until proven guilty. On the other hand, think hard before you buy into the widespread assumption that most industrial products and processes are hazardous whether the experts know it or not.

9.  Explain how uncertainty affects precaution-taking. The greater the uncertainty, the more justified the precautions – not because you’re sure the risk is serious, but because you’re not sure it isn’t. How far to take this precautionary response is always debatable. Sometimes there is no reasonable choice other than bearing an uncertain-but-probably-small risk in preference to one that may be better understood but is probably more serious. Other times there are options that are both safer and less uncertain. Sometimes it is easier to reduce an uncertain risk than to measure it more certainly.

10.  Don’t hide behind uncertainty. If the risk is probably significant, despite lingering quality control problems, say so. If it’s probably trivial, despite some unanswered questions and weird exceptions, say that. Most of the time, uncertainty should be something you discuss reluctantly, acknowledging that it weakens your case. If you find yourself asserting that uncertainty strengthens your case, pause and reconsider your case.

11.  Expect some criticism for your lack of confidence. The only alternative is criticism for overconfidence, often from the same critics. That’s worse.

12.  Don’t go too far. You could come across as bumbling, timid, indecisive, or terminally self-deprecating. But this problem is rare. The more common problem is coming across as arrogant and overconfident. Your goal is the middle, but you can safely aim for the opposite extreme. You won’t overshoot.

Take heart. The research on acknowledging uncertainty shows that it does diminish people’s judgment of your competence (though not as much as sounding certain and turning out wrong). But it actually increases people’s judgment of your trustworthiness! Since sources of risk information typically score higher on competence than on trustworthiness, this isn’t such a bad trade-off.

Being Precise about Uncertainty: How Uncertain Do You Want to Sound?

Absolutely certain statements about risk are almost by definition mistaken. Like any scientific statement, a risk statement must always be qualified in principle by the possibility of new data. The full range of science is from almost impossible to almost a sure thing. When the topic is a controversial risk, of course, most sound statements are closer to the middle. Sources who don’t want to say anything until they’re certain will seldom have anything to say. (See “It Is Never Too Soon to Speculate.”)

Saying you’re uncertain, in other words, is a good start, but it isn’t enough. Your goal should be to communicate as precisely as you can what you think so far and how uncertain you are.

Of course the best way to do that is with numbers. If you have a statistical estimate of the probability of some outcome, give it. Even without defensible statistics, numbers can help clarify your hunch – though of course it’s important to stress that a hunch doesn’t magically turn into valid quantitative data just because you choose to express it in numbers. (In other words, you need to be clear about how certain or uncertain your estimate of the uncertainty is!) Still, we live with numerical hunches all the time. Everyone knows guesstimated odds of 1-in-a-million mean it’s not going to happen; 1-in-a-hundred means it’s a real long shot but it could happen; 1-in-10 is still unlikely but nobody will be all that surprised if it happens; 9-in-10 and 99-in-100 and 999,999-in-a-million are similar seat-of-the-pants estimates at the other end of the probability distribution.

We all know what 50-50 means too – but in this case it’s helpful to clarify whether you’re saying that the evidence is pretty evenly divided, or that there isn’t any relevant evidence on which to base a judgment.

Absent numbers, you can do a lot with words. “We can’t prove X conclusively” sounds like you’re fairly sure X is true. “We can’t rule it out conclusively” sounds like you’re fairly sure it isn’t true. Both acknowledge uncertainty while sending clear signals which side you think the smart money is on.

“We have no evidence of X” also sounds like you’re fairly sure X isn’t true. This is a favorite locution of corporate spokespeople, and it is often used to mislead. If there have been no studies looking for a link between dimethylmeatloaf and pancreatic cancer, the claim that there is no evidence of such a link is technically accurate but intentionally misleading. If there have been lots of relevant studies and none of them found a link, you’d naturally want to back up your claim by saying so.

There are phrases to match any level of uncertainty. They may be wordy, but they communicate clearly enough. “The weight of the evidence suggests that X is likelier than not, but there is still plenty of room for doubt.” “We’re almost sure that X is not happening, and we are proceeding on that assumption, but we’re continuing to monitor the situation so we’ll be able to change course if it turns out we’re wrong.” “We think it’s probably X or Y; Z is less likely but still a contender; we’d be shocked if it’s anything else.”

I’d love to see more research on people’s actual responses to these and similar phrases. No doubt some phrases are genuinely ambiguous, provoking too wide a range of responses to be useful. Other phrases communicate better. But even without this research, being precise about uncertainty is easy enough to do. If you set out to clarify in the mind of the audience exactly what’s in your own mind, you can usually come up with language that does the job.

The problem is that we often don’t try. At least four biases keep us from trying.

The bias against sounding ignorant.

In some ways the most important uncertainty claims are the ones that provide no guidance whatever – except the crucial information that there is no sound guidance available. “I have no idea whether the side effects of the smallpox vaccine are safer or more dangerous than the risks of going unvaccinated. It all depends on the probability of a smallpox attack – and nobody has any data to estimate that.” “I have seen studies that suggest these industrial chemicals might cause birth defects in mammals, and other studies that show no such effect, even at high doses. The evidence is mixed and very confusing.” Statements like these are far rarer than they should be. Experts don’t like to sound ignorant, so they don’t often say, flat-out, that they don’t know something. They focus instead on what they do know, or think they know.

They are joined in this bias by journalists, who also prefer to report what is known rather than what isn’t. S. Holly Stocking of Indiana University has done a decade’s worth of research on “ignorance claims.” Some of it is online (do a Google search for “Stocking ignorance”) and it’s well worth reading. Reporters like to feel and sound in the know; they avoid reporting that they couldn’t find something out, that their sources didn’t seem to know the answer. (This is part of my rationale for being willing to speculate; if you decline to say anything because you’re not sure, reporters will find some less scrupulous source instead.)

The bias against being precise and assertive about uncertainty.

When risk experts and risk managers do acknowledge their uncertainty, they tend to do so vaguely. It’s almost as if the acknowledgment were somehow shameful, as if they felt they ought to be more certain. Maybe I should change the title of this column from “Acknowledging Uncertainty” to “Proclaiming Uncertainty.” That’s what’s called for, really: a paradoxically confident assertion of your absence of confidence. Part of the job is to be more precise about your uncertainty – what’s pretty sure but not absolutely sure, what’s a hunch but you wouldn’t bet the farm on it, what’s a wide-open question, what’s a long shot. And part of the job is to be precise with flair – to offer as ringing a declaration of your uncertainties as of your certainties. Maybe then journalists would publish the uncertainties too.

The bias toward sounding more certain than you are.

This is the core of the problem. Even when they do try to be precise about their level of uncertainty, risk experts and risk managers tend to “round off” in the direction of greater certainty. If they’re sixty percent sure, they end up sounding ninety percent sure. Or they simply don’t talk about the things they’re only sixty percent sure of. At its worst, the result is a dangerous dichotomy – if it’s too uncertain to say with near-certainty, you don’t say it at all; if it’s too important to leave out, you say it with more certainty than it deserves.

Scientists rightly complain that reporters tend to make them sound more certain than they are (by downplaying caveats, for example). Scientists are less likely to notice that they also tend to make themselves sound more certain than they are, especially when talking to nonscientists. Many experts who are a model of tentativeness when they’re addressing their colleagues become markedly more dogmatic when talking to the rest of us. As Stocking points out, when most scientists sound overconfident, science itself can end up sounding contradictory, arbitrary, and self-serving. Two researchers looking at different aspects of the same problem report results that trend in opposite directions. They may know and respect each other’s work and share a sense of the complexity and uncertainty of the larger issue. But when they talk to the media they focus on their own findings, and what little they say about the uncertainties is less than quotable. So the news coverage may report each as if the other didn’t exist (two equal and opposite overconfident stories), or it may report both together as if they demonstrated a fundamental disagreement (converting scientific uncertainty into dueling Ph.D.s).

The bias toward over-reassurance.

Optimistic bias is a fundamental human characteristic. Research by Neil Weinstein of Rutgers University has documented that most people think they are less likely than others, and less likely than they actually are, to get cancer, get mugged, get fired, etc. There is a contrary tendency, dubbed “defensive pessimism” by Julie Norem of Wellesley College, but optimistic bias is the biggie. So risk experts and risk managers, like most of us, see the world of risk through rose-colored glasses. Likely bad outcomes tend to be seen as not as bad as they really are; unlikely bad outcomes tend to be seen as even more unlikely than they really are. Uncertainty, in other words, isn’t symmetrical. We don’t just come across as more certain than the situation justifies; we come across as more optimistic as well. This state of affairs lasts right up until the bad outcome is too obviously imminent to ignore. That’s when the general public switches to the other side and “over-reacts,” rehearsing for the crisis to come by imagining it is here already. And that’s when risk experts and risk managers experience the strongest temptation to keep on over-reassuring – no longer because they’re feeling optimistic, but because they’re under pressure.

The pressure to over-reassure comes from all sides – from management, from peers, from politicians, from affected industries, from the public itself, from one’s own inner fears. When we’re facing serious risks, they over-reassure us because they fear we might panic. When we’re facing tiny risks, they over-reassure us because they fear we’ll seize on any admission whatever to keep on worrying about nothing. I’ve written enough about over-reassurance elsewhere, so I won’t belabor the point here. (See for example “Fear of Fear” and “Worst Case Scenarios.”)

There is a fifth bias – on the public’s part, not the source’s – that is critical in deciding how to cope with the other four. When people belatedly find out that a company or government agency has been overconfidently over-reassuring, they become mistrustful and alarmed. In fact, even before they find out, people often react to overconfident over-reassurance in accordance with the risk communication seesaw; they smell a rat and they become mistrustful and alarmed. Overconfident over-reassurance routinely backfires.

Here's how it plays out.

Let’s say you’re charged with managing avian influenza in your corner of the world. One of the things you’re monitoring is human-to-human transmission (known in the field as h2h). There still aren’t any absolutely certain cases of bird flu h2h transmission. There are cases where someone with no known contact with birds came down with bird flu after nursing a relative who was sick with bird flu – but it’s always possible they encountered the feces of a passing bird. So you think h2h is probable but not certain. But you also think h2h is less dangerous than it sounds; a human flu pandemic would require not just h2h but efficient h2h, and so far there’s no evidence at all of that. You’re worried that acknowledging that h2h has probably happened already might unduly frighten people. So you accurately but misleadingly assert that there is no conclusive evidence of h2h. Some people are falsely reassured. Some sense that you’re trying to con them, and get more frightened instead.

Then (and this hasn’t happened yet, but odds are it will) more and clearer cases of h2h start cropping up. The people who were falsely reassured feel betrayed and get more frightened; the people who didn’t buy your reassurances in the first place feel vindicated and also get more frightened. Now you say what you should have said in the first place – that you’re not surprised that h2h can happen, it was probable all along, but it’s not too alarming to you as long as the cases are isolated and the transmission isn’t efficient. Because it contradicts your prior overconfident over-reassurance, this fails to convince. Mistrust, fear, and anger keep escalating. This begins to frighten and anger you too. Your poultry industry is suffering; you’re afraid people may be on the brink of panic; they’re ignoring what you consider the crucial information that so far the h2h is inefficient and therefore incapable of launching a serious human outbreak (let alone a pandemic).

Instead of blaming yourself for having said misleading things, you blame the media for sensationalizing and the public for over-reacting. This predictably increases the public’s alienation, mistrust, anger, and fear. That increases your disdainful conviction that people invariably over-react, which justifies your decision to be overconfident and over-reassuring, continuing the vicious cycle. Your ability to provide accurate reassurance that there is no crisis so far has been undercut by your own overconfident over-reassurances. And so has your ability to lead people through the crisis that may yet emerge.

So here’s a procedure for being precise about uncertainty:
Step 1 – Decide what you think and how uncertain you actually are.
Step 2 – Review the five biases and decide again.
Step 3 – Choose language that captures what you’ve decided.

Copyright © 2004 by Peter M. Sandman
