Posted: January 15, 2008
Article Summary

My clients endlessly claim not just that the risk of X is tiny, but that anybody who thinks otherwise is “irrational.” This short column takes the irrationality claim seriously, and examines some alternative hypotheses. Even assuming your worried stakeholder is wrong about X, he or she may not be irrational – but rather mistrustful, postmodernist, cautious, uninformed, misinformed, intuitive, emotionally upset, motivated by personal or social values, or pursuing a different agenda. When we ignore these possibilities and assume our risk-averse stakeholders are irrational, the column suggests, we raise questions about our own rationality.

Who's Irrational?
When People “Ignore” Risk Data

This is the fifteenth in a series of risk communication columns I have been asked to write for The Synergist, the journal of the American Industrial Hygiene Association. The columns appear both in the journal and on this web site. This one can be found (more or less identical except for copyediting details) in the January 2008 issue of The Synergist, pp. 31–32, 34–35.

An employee named Sandy is worried that dimethylmeatloaf, a chemical Sandy uses on the job, might cause cancer. You review the data, and satisfy yourself that Sandy’s worries are groundless. But when you explain the data, Sandy is not reassured.

You’re tempted to decide that Sandy is irrational. The science says dimethylmeatloaf doesn’t cause cancer. If Sandy continues to worry that maybe it does, ignoring the data, then Sandy is irrational, right?

As a student of risk communication, you already know that calling Sandy irrational isn’t going to help. People take offense at being called irrational, and that usually makes them more upset, not less. So it’s irrational to call people irrational to their face. You get that, and you’re struggling to respond with empathy rather than hostility to Sandy’s irrationality. But Sandy is irrational, right?

Maybe. Maybe not. Nearly all my clients see at least some of their stakeholders as irrational. Inevitably, this creeps into their communications. Even if you manage not to say it out loud, doubting the rationality of the people you’re talking to affects how you talk to them.

So let’s look at some of the other possibilities.

Sandy may be mistrustful.

Sandy may not be ready to trust evidence that comes from you. That doesn’t have to be personal. It’s pretty rational to worry that the XYZ Corp. industrial hygienist might be under pressure to hew to the XYZ Corp. line about health and safety.

Or Sandy may be mistrustful of the risk assessment field itself. Many activists advance a coherent, thought-through, thoroughly rational viewpoint that the whole intellectual structure of risk assessment has developed in a way that favors industry’s interests (polluters’ interests) over the interests of workers (victims). The opposing viewpoint is also coherent, thought-through, and rational: that risk assessment is excessively conservative, bending over backwards to find risk even where none is apparent. Surely it’s not irrational for Sandy to lean toward the former hypothesis.

Sandy may be postmodernist.

Postmodernism is mistrust converted into a philosophical system. Postmodernist thought holds that there is no such thing as truth, only opinion and ammunition. Where you stand depends on where you sit. Competing claims are all fundamentally subjective (including this one). Believing anything else is the ultimate irrationality.

Postmodernism dominates universities, especially their liberal arts departments. Undergraduates quickly learn that everything is a matter of opinion. Employees who have adopted this philosophical stance are trendy, not irrational.

Sandy may be cautious.

Even mainstream (non-postmodernist) philosophers of science agree that science hasn’t actually proved that dimethylmeatloaf doesn’t cause cancer – for three reasons.

First of all, science never proves anything for sure. As the data get better and better, scientists get more and more confident. But all science is tentative. As soon as you’re literally certain, you’re doing faith-based science.

Second, science is unable to prove a negative – not even tentatively. You can’t find evidence that dimethylmeatloaf doesn’t cause cancer. All you can do is try to find evidence that it does … and fail. The more often you try and fail – with different samples, different methodologies, different sorts of cancer, etc. – the more you tend to suspect that dimethylmeatloaf is probably okay. But negative evidence is never as convincing as positive evidence. (That’s partly why studies that didn’t find anything don’t usually find a publisher either.)

Third, toxicology and epidemiology are infant sciences with fundamental methodological drawbacks. Tox studies are rarely done on humans, so you’re stuck hoping that rats and earthworms respond to dimethylmeatloaf the same way people do. And epi studies usually have too small a sample, too low an exposure, too many confounding variables, and too short a timeframe to measure effects that are uncommon or delayed. So what we usually know is something like this: Even a lot of dimethylmeatloaf doesn’t seem to cause cancer in lab animals; a much smaller amount of dimethylmeatloaf doesn’t seem to cause cancer quickly in large numbers of humans.

I haven’t got space here to delve into the so-called Precautionary Principle. It’s certainly possible to carry “better safe than sorry” too far, and end up forgoing demonstrated benefits out of an excessive fear of just-barely-conceivable risks. But it’s also possible to carry faith in tentative negative findings too far, and end up doing irreparable damage just because the available evidence suggested it would probably work out okay. If Sandy’s on the precautionary side of this debate, that’s not irrational.

Sandy may be uninformed or misinformed.

I don’t want to dwell on this one, but it needs to be on the list. Sandy may be unaware of data you consider significant. Or Sandy may be relying on data you consider erroneous. Drawing the wrong conclusions from the right data is irrational (if that’s all that’s going on). Drawing the wrong conclusions from the wrong data is rational error.

Sandy may be intuitive.

Intuitive people aren’t “ignoring the data.” They’re paying attention to data their opponents think they should ignore – data that may be anecdotal, for example, or even unconscious. Most of us have learned to trust our intuition in some areas. We let ourselves decide to like or dislike other people, for example, without necessarily having good evidence to back up our judgment or even knowing what cues led us to make the judgment in the first place.

Many scientists trust their intuition about scientific questions as well. You can’t spend much time with scientific experts without hearing about hunches that go way beyond the available evidence. An expert’s intuitive hunch about the carcinogenicity of dimethylmeatloaf is less reliable than a data-based conclusion, but it’s more reliable than a coin flip. Is it irrational for scientists to take their intuitions seriously? Is it irrational for Sandy to do the same?

Sandy may be emotionally upset.

“Ah hah!” you say. “Surely Sandman has to admit that emotion is the opposite of rationality.” Not so.

Arousing emotion is fundamental to alerting people to risks and persuading them to take useful precautions. Addressing emotion is fundamental to reassuring people about risks and deterring them from taking excessive precautions. Whole people, healthy people, respond emotionally to risk. Rationally appropriate and rationally inappropriate risk responses are both propelled (or at least influenced) by emotion. If Sandy is upset about a risk that isn’t actually very dangerous, that’s a problem. But the problem isn’t that Sandy is upset. The problem is that Sandy chose the wrong risk to be upset about.

And are you sure that it’s always irrational to get upset about a small risk? What I call “outrage” determines whether a risk is likely to upset people or not. Risks that are coerced, unfair, and morally offensive – to choose just three of many outrage factors – arouse more fear and anger than risks that are voluntary, fair, and morally acceptable. Do you really think that’s irrational? Do you want to live in a world where coercion, fairness, and morality are ignored? Or is it better to live in a world where such factors are allowed to influence which risks we tolerate and which we rise up against?

Imagine that Sandy were suddenly incapable of responding emotionally to risky situations, and incapable of caring whether a particular risk were voluntary or coerced, fair or unfair, moral or immoral. Is this new Sandy a safer employee? A better or more rational person? I don’t think so.

Sandy may be motivated by personal or social values.

Like emotions, values influence how we respond to risks. Consider genetically modified foods, for example. Some people oppose GM foods because they believe there may be health or environmental risks that outweigh the potential benefits. Other people oppose GM foods because they believe it’s wrong to play God by creating creatures God didn’t create. Is the second group irrational? Suppose members of the second group are influenced by their religious values to interpret the evidence on health and environmental risks very cautiously. Does that make them irrational?

If you’re tempted to answer “yes” to these two questions, you probably don’t have religious objections to genetic modification. Then consider some of your own deeply held values instead. Ask yourself two questions. First, do your values influence how you respond to data? (Assuming you disapprove of child abuse, for example, how would your values affect your reading of a study that seemed to show most abused children recover?) And second, do you still think letting your values influence how you respond to data is irrational?

Sandy may have a different agenda.

What looks like an irrational response to X is sometimes a rational response to Y. Maybe dimethylmeatloaf smells bad; Sandy knows that won’t cut any ice with management, and decides to gin up a cancer concern instead. Or maybe Sandy hopes to use the dimethylmeatloaf controversy to organize a union. “Hidden” agendas may be devious (though we all have them), but they’re not irrational.

Who’s irrational?

Once in a while you may run into people who focus exclusively on the scientific data and nonetheless draw conclusions that the data don’t support. You explain why their reasoning is mistaken. They understand what you’re saying but stick to their prior conclusions anyway. Okay, maybe they’re irrational.

That’s different from encountering people who mistrust the data; or people who interpret the data cautiously; or people who have the wrong data; or people who supplement the data with intuitions, emotions, values, or other agendas. These people are not irrational.

They may be wrong about the data. Or they may be right about the data (and you may be wrong). Or the data may be mixed, ambiguous, and hard to interpret, leaving it an open question who’s wrong and who’s right. More importantly, they may be wrong or right about the overall situation; or the broader questions, too, may be open to debate.

And the data are unlikely to offer a definitive answer to the broader questions. Scientific evidence is usually relevant to important questions, but seldom determinative. That’s why it matters how much we trust the data; how cautiously we choose to interpret the data; and what role we assign to intuition, emotion, values, and other agendas.

In the face of all this, what should we call someone who insists that everybody should ignore questions of trust and caution, intuition and emotion, values and other agendas … someone who ignores the data showing that people’s decisions are almost always based on more than just data … someone who labels as “irrational” anybody who sees the data differently and anybody who looks beyond the data? It’s tempting to end this column by saying such a person is irrational.

But the truth is even more paradoxical than that. Why are we all so often tempted to think that people we disagree with are irrational? Usually it’s because of our own intuitions, emotions, values, and other agendas. As a rule, disputes over data are about a lot more than the data – on both sides.

Sandy’s mistrust may make you angry. Sandy’s worries about dimethylmeatloaf may get in the way of company productivity goals. Sandy’s allegiance to the Precautionary Principle may be antithetical to your values about risk-benefit analysis. In short, the reasons why Sandy may continue to see dimethylmeatloaf as dangerous (despite the data) are pretty much the same as the reasons why Sandy’s industrial hygienist may continue to see Sandy as irrational (despite this column).

Sandy may be wrong about dimethylmeatloaf. You are very likely wrong about Sandy. Not irrational. Just wrong.

Copyright © 2008 by Peter M. Sandman
