Risk Communication
Notes from a class taught
by Dr. Peter M. Sandman

These are the notes I took at a seminar at the Hanford Nuclear Reservation. Posting this report (for my bosses who sent me) about Peter’s seminar on my personal website led, directly if slowly, to my position as Peter’s webmaster.
To read how our paths intersected, read his essay:
Muddling My Way into Risk Communication – and Beyond

To get in touch with Peter M. Sandman, email: peter@psandman.com

For Peter’s columns, articles, and handouts; his book and videos;
and more information, please go to his risk communication website.

Introduction

Risk communication comprises two facets: “scaring people” and “calming people down,” that is, alerting people and reassuring them. There are moderate hazards that people are apathetic about and minor hazards that people are outraged about. Risk communication tries to create a level of “outrage” appropriate to the level of hazard.

Risk communication differs from risk assessment in that risk assessment deals with the physics and chemistry and probability of something happening. Risk assessment defines risk as magnitude (how bad the problem could be) times probability (how likely it is to happen). Experts tend to focus on this definition (let’s call it hazard), and so underestimate actual risk, because they ignore outrage. The public tends to focus instead on outrage and pays less attention to the hazard.

Dr. Sandman’s contention is that Risk = Hazard + Outrage, and he says: “If people are outraged because they do not understand the hazard, educate them about the hazard. If they are outraged and DO understand the hazard, you must address the outrage. ‘Educating the public’ is not sufficient to deal with public outrage.”
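In symbols (my own rendering of the two definitions above, not Sandman’s notation):

    % The risk-assessment (expert) definition, i.e., what Sandman calls "hazard":
    \[ \text{Hazard} = \text{Magnitude} \times \text{Probability} \]
    % Sandman's risk-communication formula:
    \[ \text{Risk} = \text{Hazard} + \text{Outrage} \]

When the hazard is low but the outrage is high, the second formula still yields a high perceived risk, which is why “educating the public” about the hazard alone does not lower it.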

The goal of most outrage is to shut the proposal down (don’t build the incinerator, don’t site the chemical factory, etc.). The outrage related to clean-up is different: the goal is not to shut it down but to get a different, cheaper, or better clean-up. Outrage builds the clean-up budget, but it reduces flexibility.

Risk communication should not be in the Communications Department, says Sandman. Because 98% of communications work is directed toward an audience that is inattentive but convincible, the work of a Communications Department is mostly getting the audience’s attention and marketing to them. The 2% of communication that is risk communication has an audience that is very attentive but hostile. The strategies of traditional communications groups backfire in the risk communication arena.

Twelve Components of Outrage

There are 12 components of outrage that Sandman suggests need to be dealt with. These are the perceptions that should be addressed to lower community outrage.

1.  Voluntary vs. coerced
2.  Natural vs. industrial
3.  Familiar vs. not familiar
4.  Not memorable vs. memorable
5.  Not dreaded vs. dreaded
6.  Chronic vs. catastrophic
7.  Knowable vs. unknowable
8.  Individually controlled vs. controlled by others
9.  Fair vs. unfair
10.  Morally irrelevant vs. morally relevant
11.  Trustworthy sources vs. untrustworthy sources
12.  Responsive process vs. unresponsive process

Discussion of the 12 outrage components

Voluntary vs. coerced

Voluntary risk is as much as three orders of magnitude more acceptable. The “right to say no” makes the risk seem smaller. At Hanford, most people are here voluntarily, so the outrage is lower. In Spokane, the association with nuclear risk is not voluntary, so the risk is perceived as less acceptable.

Natural vs. industrial

Natural is perceived as less risky (even when it isn’t). Natural and industrial risks are measured on different scales. There’s a quite high radon risk in northern NJ from the breakdown of uranium in the granite substrate. People there are generally apathetic: it’s “natural” radon. There’s a nearby township where the houses are built on an old landfill that contains (thorium) tailings from a clock-painting factory. The same levels of radon are found in their basements, but people are upset about it: “it’s not ‘natural’ radon.” (There’s also a component of “you can’t sue God.”)

Hanford runs past industrial all the way to anti-natural. The stuff we have here doesn’t ever exist in nature. It’s a problem for our (various) publics when we treat it as if it’s a natural risk. And comparing Hanford radiation to background radiation won’t suffice: people DON’T see it as comparable, because background radiation is natural and Hanford’s radiation is not. Comparing a low-hazard, high-outrage risk (the Hanford Site) against some low- or medium-hazard, low-outrage risk backfires.

Familiar vs. not familiar

Familiar is less risky: employees get so familiar with risk that their outrage goes down (and sometimes so does their safety). If you can get the public familiar with the hazard, their outrage may go down.

Not memorable vs. memorable

Not memorable is better. How easy is it to visualize something going wrong? The media creates memories. There are signals of risk: odors, guards, alarms, plumes, warning signs. Signals increase memorability. So do symbols: the 55-gallon drum is a symbol of the chemical industry; the cooling tower symbolizes the nuclear industry. The outrage is independent of the hazard (cooling towers aren’t dangerous), but the symbols create an emotional response (outrage) in the viewer.

Wherever possible, avoid memorability. And it’s VITAL to acknowledge the memorability that already exists. It doesn’t matter whether it’s relevant to the hazard. Three Mile Island (TMI) is memorable. Nothing you do will decrease its memorability.

Not dreaded vs. dreaded

Not dreaded lessens outrage. HIV and cancer are the most dreaded diseases in the U.S.

Elenor’s comment: High blood pressure is a MUCH more common risk, yet try to arouse people’s interest in addressing that disease. The much less common HIV and cancer present a much lower risk, but they are attended by tremendous “outrage” and so get the funding, the activists, and the interest and concern of “regular” people.

The vector of the perceived risk is important: contaminated water is perceived as worse than contaminated air; contaminated air is worse than touch; touch is worse than food. Waste is perceived as more dangerous than raw material. If there are drums marked hazardous waste, they are perceived as more dangerous than drums (with the same hazard level, the same risk) labeled hazardous raw material.

Elenor’s comment: It doesn’t MATTER that it’s not logical – it’s real!

Acknowledge the dread and legitimize it. If there’s an oil spill, don’t keep talking about how low the toxic effect will be while people are saying “it’s disgusting.” You must confirm their perception: “yes, it’s disgusting, but the hazard is low.” If you’re busy disagreeing with their perception, they will mistrust whatever you tell them about the hazard. By agreeing that it’s disgusting, you change the focus from the disgust to the hazard.

Chronic vs. catastrophic

Chronic is perceived as less dangerous. This is why car-related deaths are considered less newsworthy than airplane-related deaths: the car deaths are spread out over time and location. Companies too often focus on making the probability of occurrence lower, and not on reducing the hazard. A low probability times a high hazard will still equal high outrage. You must focus on reducing the severity to lower the perceived risk. People want the probability lowered, but especially they want the severity (magnitude) of the hazard reduced.

Companies must talk honestly about the worst-case scenario. This goes against all company desires. They want to downplay the worst case because it’s (one hopes) least likely. To gain the trust of the public you must address the worst case. A chemical company fought against meeting with the public and talking about the true worst case. Against their desires, they did it: they told the public what the worst, worst, worst case could be, how many thousands of people would be killed and injured, and all the likely damage. The public and the activists agreed it was the worst case, but since it was so very unlikely, they wanted to talk about the more likely, less severe cases. Instead of mistrust and accusations of covering up the worst possibility, the company and the public were able to work together on the more likely cases. (If the company cannot be trusted to come up with an ACTUAL worst-case scenario, how can the public trust the company to protect against it?)

One BIG problem that’s coming up now, and is nearly always ignored in risk assessments, is internal risk. As companies work to reduce risk by increasing the training and engineering to prevent accidents, this increased knowledge also allows for better sabotage. If you teach your employees never to close valves a, b, and c because doing so will cause an explosion, you’ve just taught everyone how to create an explosion. (Outraged employees are becoming a major hazard, and most risk analyses ignore that risk.) (Notice that the Hanford Site Safety Analysis Reports and assessments also ignore the possibility, and in a time of lowered morale and increased stress, that’s very bad!)

Knowable vs. unknowable

Knowable is better. The public is much less tolerant of uncertainty than are engineers or scientists. The public prefers a lower “highest possible damage” with a higher likelihood of occurring to a higher “highest possible damage” with a lower likelihood of occurring.

Elenor’s comment: This may not be logical, but it’s SO! You must recognize that rationality doesn’t matter; how the public perceives the risk is what you must address (outrage, not hazard).

Part of the “knowable” component is detectability. It would be so much easier if radioactivity were purple. TMI was the first time anyone had seen reporters hurrying a press briefer: “hurry up and get us further away from here.” When asked why they were in such a rush to get away from TMI (after all, these reporters had covered wars and earthquakes and riots), a reporter said, “At least in a war, you know you haven’t been hit yet.”

To address this outrage component, make the risk more detectable (if you dare). An example is the siting of a chemical incinerator. In working with the public to get them to accept the incinerator, the plant was asked to put a 7-ft neon sign on the roof, attached to the thermostat in the stack. As long as the temperature stayed above a certain level, the toxins would be burned and not released; the public was worried that the plant would let the temperature drop too low. This detectability reduces outrage, and therefore reduces opposition.

Elenor’s comment: In the seminar, immediately after Dr. Sandman said this, classmates who were engineers at Hanford objected: “what if you had an electrical problem and the sign went out?” Sandman laughed and said that always, uniformly, the scientists or engineers in his seminars come up with lots of little surmountable technical problems that would prevent the solution from addressing outrage. They could have a second sign as a back-up, or a way to signal the difference between a thermostat/temperature failure and an electrical failure.

There was a refinery in NJ that the local public was incensed over. The plant, working with the community, decided that if any non-normal condition occurred (a flare in the burn-off stack, an alarm, a truck arriving, any unusual thing that might attract attention), the plant would call the local police desk. The public could call and find out what had “gone wrong.” Over several months, the plant called the police station 19 times to explain something that had occurred. One day there was a terrible, acrid odor in town, and the people called the police. The desk officer said, “no, it couldn’t be the plant, because they hadn’t called.” So, 19 phone calls had created a level of trust that had not previously existed between the people and the plant. (Now, this is really important: if the smell HAD been connected to the plant and they hadn’t called with the explanation, they’d have blown it big-time. But the smell was not related to the plant, and the people were comfortable that if it had been, the plant would have admitted it.)

Individually controlled vs. controlled by others

Individual control makes people feel safer, even if they’re not. (Consider: who implements the voluntary choice?) Chauncey Starr says: “imagine you’re carving a rib roast, but you’ve got no fork. You put your hand on top of the roast and start carving. How close to the knife do you put your hand? Now, imagine it’s a two-person job. How close do you put your hand to the knife someone else is wielding?”

The public is holding the meat, the company the knife. It’s really hard to both disempower people and reassure them at the same time. The solution? Share the knife. Share control with the public, through advisory committees, public representation on the board, and so on. Companies HATE this, but they will do it for profit: they need to make a profit, and so they need to mollify the public. Government agencies don’t/can’t make a profit; they HAVE only control, so getting them to share the knife is extremely hard!

To address this: if you have control, share it. If you don’t have control, SAY so! A utility was being picketed to route their new high-power lines away from a housing area. Sandman asked them: COULD they route the lines around the houses? The answer was no, for several reasons. They were not legally allowed to, because they had to follow state law, which does not (yet) address the possibility of high-power lines being a health hazard. Also, their shareholders would not allow them to spend the extra million dollars it would cost. In fact, they could ONLY route the lines where they were planned (and where the public was opposed). It was very hard for the utility to admit in public that they HAD NO CHOICE in siting the lines, but admitting it reduced the public outrage.

Fair vs. unfair

The distribution of risk and benefits must be fair (or mitigated). Unfair risk is big risk: That is, if the people who are in jeopardy from the hazard are NOT receiving a benefit from being near the hazard, then the outrage will be very high (and high outrage equals high risk).

Hanfordites are getting benefits from being near the Site (jobs, booming town, good $), so for them, the outrage is low. Local farmers, who don’t see as direct a benefit, have higher outrage. You must find ways to make the benefit proportional to the hazard. If you cannot make the benefit proportional, go for mitigation. Don’t just build a park for the people affected by the hazard – ask them what they want: you’ll end up building the same park, but the people will see it as mitigation (and as company responsiveness), and not random (and unconnected) largesse. This allows the company to connect fairness to individual control: if the company feels a little blackmailed into doing something for mitigation, then they are probably sharing control, and thus creating fairness.

Morally irrelevant vs. morally relevant

Pollution used to be unimportant. Now, pollution is morally wrong, and polluters are reprehensible. Once the public decides something is “wrong” (morally), the language of trade-offs becomes insufficient. It is no longer acceptable to pollute only a little; you must be TRYING (not necessarily succeeding) to avoid polluting at all. You may not reach zero, but your goal must be zero. You can be right on the data (low hazard), but the moral value prohibits acceptance.

Trustworthy sources vs. untrustworthy sources

Trustworthy sources are necessary. You need to build trust. Anytime you acknowledge a problem, you build trust. If you deny problems, you destroy trust. While working to build trust, you must NOT ask for it. If you ask for trust, you create mistrust (who always says “trust me”?)

The replacement for trust is accountability. But to whom?

Regulators (except no one trusts them!). It’s important, however, that if your regulators catch you out on something, if you get disciplined by regulators, you MUST publicize it. And YOU must publicize it! Let the public KNOW that your regulators are on the job, and that will reduce outrage by increasing the public’s trust of the regulators. Sure it’s embarrassing, but do you want to reduce mistrust/outrage or not?

The general public (advisory panels, etc.). Your public may not trust them; they may think the panel members are well meaning but likely to be hoodwinked by the company, or that they could be co-opted by the company.

Activists. This is the best group! (No, really!) The company says to its public, “you don’t have to trust us, you can trust our enemy.” Make the activists the cops who keep you honest. Yes, you must let the activists “win” against you; you must give up the moral high road and let the public think you’re only doing the right thing because you’re being forced to, but they won’t believe you’d do it unless forced to ANYWAY. By losing face, you gain support. You need your public to believe you’re doing the right thing. Do you really need them to believe you’re doing it because you choose to, rather than because you’re forced to? Sure it feels better to be trusted, but YOU’RE NOT TRUSTED!

Responsive process vs. unresponsive process

Responsive process has four facets:

1.  Secrecy vs. openness

Almost all the risk communication crises Dr. Sandman deals with are based in secrecy.

2.  Acknowledging vs. stonewalling on wrongdoing
  • A two-yr-old spills juice and says “it was an accident!”
  • An adult spills juice and says “oh dear, I’m so sorry!” and the HOST says “it was an accident.”
  • An oil company spills “juice” and says “it was an accident!”

What do you think the public says?

About 6 months after the Exxon Valdez (Alaska) spill, a ship carrying oil on contract for BP ran aground and spilled at Huntington Beach, California (a rich enclave). The BP CEO flew to the spill, and had obviously planned his risk communication carefully. When he was asked, “whose fault was this spill?” you could see he wanted to say, “look, it was a contract ship, with a contract crew. They spilled our oil!” But instead he said: “My lawyers say this was not our fault, but I feel as if it were our fault, and we will deal with it as if it were our fault.” Six months after the spill, they polled the residents, and BP had a HIGHER approval rating than before the spill.

Compare: How do people STILL feel about Exxon, ten years after?

3.  Courtesy vs. discourtesy.

Even though your public may be angry and impolite, you must never return discourtesy or you will create more outrage.

4.  Compassion vs. dispassion.

When dealing with a situation, there comes a point where you must stop dealing with the hazard and work on the outrage. There’s a common misperception that engineers and scientists and technologists can’t do it; they retreat further into the tech specs rather than deal with the emotionalism. But if an engineer’s 18-yr-old daughter comes home from college in tears because she broke up with her boyfriend, the engineer doesn’t say, “now, honey, you must realize that the median teenager has an average of 3.7 breakups over the 4 years of college attendance.”

Teach your engineers when (and HOW) to address outrage, not just by throwing more technobabble at the outrage.

The solution to hazard is mitigating the hazard.
The solution to outrage is NOT ignoring it and NOT mitigating the hazard.
You must mitigate the outrage!

Reducing outrage

There are five things you must do to reduce outrage in the event of a problem:

(Exxon is going to do some expensive penance courtesy of the courts, but that still will NOT restore them to good graces. It’s about outrage, not the technical facts. In 90% of court cases, if you ask the plaintiff right before they go into court, they say “the b*stard never even apologized!” Lawsuits are almost always about outrage. Only lawyers think they’re mostly about greed.)

Four levels / four rules for public involvement

Four levels:

There are four levels of public interest (on any particular topic):

Level 1   Fanatics to whom it’s one of their top 2–3 issues and who learn all they can.

Level 2   Attentives to whom it’s one of their top 10 or so issues, and who will read an entire newspaper article on the subject.

Level 3   Browsers to whom it’s one of their top 100–200 issues and who will read the first couple of paragraphs of a newspaper story on the issue.

Level 4   Inattentives who are not interested and are willing to rely on the fanatics and attentives to call their attention to it if warranted.

Four rules:

Rule 1   Forget the inattentives. You cannot reach them.

Rule 2   To reach browsers, and to some extent the attentives, use the media. The media are a one-way communication. You must keep repeating the basics over and over, because the browsers only catch bits at a time. Even when you think EVERYONE must know the basics by now, keep repeating them. Browsers are the principal target of the media.

Rule 3   Your most important public are the fanatics. You cannot make them browsers. You cannot turn their interest. You must deal with them as they are.

Rule 4   The key to public involvement is the permeability of the boundaries between the four types. A good public involvement program helps people cross the boundaries. If you can mollify the fanatics, by involving them as watchdogs and respectfully and carefully addressing their outrage, you may drop their level of mistrust to that of an attentive. However, by mistreating the attentives, you may drive them over into your loyal group of fanatics (this is NOT to be desired!).

And, finally:

If you ask people, “what would you do about this hazard if it were really important to you?” and they respond with things like “oh, you can’t change that, you can’t reach the company, they never listen,” then your outreach program is not working. If they say, “well, I’d call the company hotline and they’d give me information and put me on the advisory committee if I asked,” good work! That makes the boundaries permeable: fanatics, once their outrage is handled, can slide back to being attentives; attentives, if something gets their attention, can become fanatics, approach the company, and get their outrage resolved.