Posted: January 2, 2010
Article Summary: The National Public Health Information Coalition is an organization of federal, state, and local health department communicators. NPHIC asked me to give its 2009 “Berreth Lecture” at its annual conference in Miami Beach – and specified that the presentation should be about myself and my career, not the substance of risk communication. But as I walked the group through my 40 years in risk communication, a substantive theme emerged: that public health communicators are at least as untrustworthy as corporate communicators, that nobody has the courage to trust the public with those parts of the truth that conflict with the message, and that public health agencies need to learn how to cope better with mistrust and outrage. I illustrated my thesis with a lot of flu and other infectious disease examples.

Trust the Public with More of the Truth:
What I Learned in 40 Years
in Risk Communication

The 2009 Berreth Lecture, presented to
the National Public Health Information Coalition,
Miami Beach FL, October 20, 2009


In Polish: Zaufać społeczeństwu, mówiąc mu prawdę: czego nauczyłem się w ciągu 40 lat pracy w sektorze informowania o ryzyku (PDF)
(published in Bezpieczeństwo i Technika Pożarnicza (Safety & Fire Technique), December 2010)

I wrote this speech out in advance, something I almost never do. But I departed from my text more than a little, so I have posted the audio (an MP3 file) as well. A video of my speech is also available off site (on Vimeo).

It’s an honor to be asked to give the Berreth Lecture – and just to be asked to speak to NPHIC….

They asked me to make this personal, not substantive – no handouts, no overhead projector (no PowerPoint slides), no need to take notes. I’ll do some of that tomorrow in the all-day seminar on radiological risk communication; that will be basically an overview of three paradigms of risk communication using radiation examples.

I want to speak on two themes today. The first is how I have muddled my way into (and through) risk communication. The second – because it’s hard for me to imagine people being much interested in the first – is something I learned along the way that I think is relevant to the work we all do … and perhaps especially relevant to the work we are doing right now with regard to this H1N1 pandemic – the first influenza pandemic in my professional life, and I assume in most of yours. (I was in junior high in ’57 and grad school in ’68; I don’t remember either pandemic.)

What I learned, in a nutshell: Nobody’s trustworthy. We should work harder to be trustworthy, and we should stop expecting (or seeking) to be trusted. We should all aim for accountability instead. That turns out to be a lesson “good guys” (like public health professionals) have a tougher time learning than “bad guys.”

When I work with corporate executives, they may think they’re honorable and trustworthy (even when they’re shading the data in the direction of their corporate goals) but they certainly don’t expect to be trusted. Public health officials, by contrast, do expect to be trusted (even while they’re shading the data in the direction of their public health goals) … and that keeps getting them into trouble.

So that’s where we’re going. But let me start with some of where I’ve been.

1. In 1970, I was just starting on my doctoral dissertation on why book publishers publish so many books they know in advance will lose money. (Does that sound as dull to you as it now does to me?) Ronald Reagan was governor of California, Richard Nixon was President. Reagan invited Nixon’s Secretary of Health, Education, and Welfare, Robert Finch, to give a speech on the Berkeley campus, which was very much a center of student unrest. (I was down the road at comparatively placid Stanford.) When student protesters interrupted his speech, Finch complained, “You’re students! Why don’t you do research instead? Why aren’t you sending research proposals to HEW on these issues you care so much about?” An aide apparently whispered into Finch’s ear that HEW didn’t fund student proposals. “Send your proposals directly to my attention,” he added.

So three other grad students and I ginned up a proposal to study various aspects of environmental communication as our dissertations. Mine was on what I called “ecopornography” – how dishonest environmental appeals in product advertising affected people’s environmental values. It was funded, and I started becoming an environmental communication expert.

2. I got my Ph.D. and my first full-time teaching job, in the Journalism School at Ohio State University. And, as newly minted Ph.D.s do, I started publishing stuff based on my dissertation, and doing research extending my dissertation. Pretty soon I had a name as a big fish in a tiny pond called environmental communication. In 1976, I got a call from Bill Stapp at the University of Michigan School of Natural Resources. They were starting a graduate program in environmental communication, Bill told me, to meet the needs of all those Earth Day alums looking for ways to save the environment. Would I be interested in starting and running the program? I went to Ann Arbor to interview.

There were two search committees, one faculty and one student. During my interview with the student search committee, Mike Schectman asked me to describe my commitment to the environment. I said I didn’t have any. I figured my students would supply the commitment; I would supply the communication expertise. But I’d probably get committed, I added, since that’s how cognitive dissonance works. They’d come to understand that when they started learning how to build other people’s environmental commitment. I learned much later that there was a big debate after I left over whether it was okay to hire a prof who wasn’t a true believer.

But I was hired. And I became an environmentalist, sure enough. In the next five years I helped train a generation of environmental activist propagandists on how to arouse environmental concern. Thirty years later some of them are running major environmental groups. What I was doing in those days is what I now call precaution advocacy: high-hazard, low-outrage risk communication.

3. Most of my work was with environmental activist groups, but I worked with other good guys as well. I got involved with the American College of Physicians, for example, and did several papers for their conferences and articles for the Annals of Internal Medicine on how persuasion theory can help public health. One 1977 paper at the American College of Physicians annual meeting in Dallas was entitled: “The Swine Flu Fiasco: What We Did Wrong.” It’s on my C.V. I only wish I could find my notes (doubtless on a yellow legal pad in those pre-computer days).

4. I figured I’d spend my life helping good guys arouse concern in apathetic publics. Then came March 1979, and the Three Mile Island nuclear power accident. Since I had published extensively on environmental communication/journalism/PR/etc., the Columbia Journalism Review called and asked me to go to the site, follow the reporters around, and cover the coverage. Shortly after my Columbia Journalism Review cover article was published, I was asked to work for the Kemeny Commission, appointed by President Carter to investigate the Three Mile Island accident. My piece of the puzzle was to help figure out who knew what when: how the story was flacked by all sides, from antinuclear activists to Metropolitan Edison to the Nuclear Regulatory Commission. (It’s the only time in my life I ever had subpoena power.)

My work for the Kemeny Commission contributed to recommendations and then regulatory requirements about how utilities needed to be prepared to communicate during an accident. So I started getting requests from nuclear utilities to come consult on how to obey the new regulations. I didn’t want to lose my virginity by working for the bad guys. I said so. They said I helped write the legislation; shouldn’t I want them to know how to obey it? I didn’t have an answer. So I did some of my first corporate consulting.

5. As a result, a local activist from the Three Mile Island area had me blackballed from the list of academics who help antinuclear and environmental groups. I came away thinking working for industry was dangerous to my activist career – but also thinking industry cares more about what you do, know, and think; activists, by contrast, mostly want to know whose side you’re on.

This has happened a lot in my career, and never in the other direction. Industry likes my activist credentials. Activists hate my industry work. I am reminded of the forms I have to sign every time I work for the World Health Organization, swearing that I haven’t worked for the tobacco industry. I doubt the tobacco industry makes people swear they haven’t worked for WHO. The bad guys will work with anyone who can help them. The good guys are likelier to demand loyalty oaths.

6. I also came away from Three Mile Island with a new intellectual frame. Until then, I had done virtually all my work on issues where people were insufficiently concerned about serious risks. Now I had a sense that sometimes people are excessively concerned about small risks. And sometimes people are appropriately concerned about serious risks. (Three Mile Island turned out to be a small risk, but it sure looked like a serious risk at the time … even to the Nuclear Regulatory Commission, which worried that a hydrogen bubble in the containment vessel might explode.)

I understood early that the problem wasn’t panic; there was very little panic at TMI, though officials and journalists kept imagining that the public was panicking. I came to understand that that happens constantly in crisis situations: People are anxious for good reason and officials think they’re panicking.

As a result of Three Mile Island I began to be interested in a more symmetrical frame for my risk communication work: not just how do you alert apathetic people, but also how do you reassure over-alarmed people and how do you guide appropriately alarmed people.

7. Still, I went back to working with the good guys. I had plenty of new examples of corporate dishonesty from my Three Mile Island work to reinforce my sense that I wanted to stay on the other side. My favorite example: Metropolitan Edison’s PR guy, Blaine Fabian, had announced on Day One of the TMI accident that “The plant is cooling according to design.” I asked him how he could say that, when he knew a valve was stuck open, a hydrogen bubble had developed in the containment structure that was threatening to explode, all sorts of things had gone wrong. “You know,” he told me, “that’s what’s so wonderful about nuclear power plants. They’re designed to cool in emergency conditions, even when all sorts of things are going wrong. So, yes, we had a lot of malfunctions. Nonetheless, the plant was cooling according to design.” He downplayed the malfunctions without actually lying.

8. But I kept noticing that the good guys weren’t exactly honest either. In 1981 I started doing communication work with the American Cancer Society. One of the big ACS activities, then as now, was corporate smoking cessation programs. In order to help sell these programs to companies, we commissioned an economist to do a study of the economic impact of employee smoking on companies. We expected to show a big cost due to medical expenses. Instead, the study came out showing that employees who smoked actually saved their company money (pension money and healthcare money) by dying more rapidly after retirement. It simply wasn’t in a company’s economic interests to support smoking cessation.

What do you think we did with the study results? We suppressed them, and continued to tell companies they would benefit economically from sponsoring ACS smoking cessation clinics for their employees. I argued for candor, or at least for dropping the false argument. I lost. I was told pretty explicitly that public health was a higher value than transparency. (Metropolitan Edison had needed to find a way to mislead without actually lying. The American Cancer Society was okay with just lying.)

9. Similarly, from 1982 to 1984 I did communication work for the nuclear weapons freeze movement. (The nuclear weapons freeze was my biggest issue in the mid-eighties, the way the flu pandemic is now; I took a sabbatical to work on the freeze full-time.)

The Reagan administration was working to deploy cruise missiles in Europe. Because they’re small, cruise missiles are hard to detect, and since detectability is essential to a viable freeze, freeze activists saw cruise missiles as a major threat to our movement. So the national freeze leadership developed messaging that said cruise missile deployment would make a nuclear weapons freeze impossible. It was already a near-certainty that cruise missiles were going to be deployed. I asked the leadership if we were planning to quit after that happened. “Of course not,” they said. “Then what do we mean when we say cruise missile deployment would make a freeze impossible?” “That’s just messaging. We don’t have to mean it.” I argued that the message wasn’t just dishonest, it would also undermine the credibility of the movement. I lost.

Nobody quite said so, but the sense I had was that the freeze leadership saw the battle as basically about what side you’re on, Reagan’s side or the freeze’s side. We thought we were trying to win over less lefty people to the cause of a nuclear weapons freeze – but all too often what we were really doing was standing tall for our values. (When we did win over conservatives, we often treated them badly. They weren’t really part of our crowd. It was almost embarrassing to have them on our side.)

10. The three years from 1985 to 1987 were pivotal for me in a lot of ways. For one thing, 1985 was the year I met Jody Lanard, who later became my wife and partner. (She’s here today.) Jody was a psychiatrist, trained in family systems theory. Pretty soon she was giving me comments on all my work – and my work got a lot more shrinky as a result. Some of what are now my signature concepts, like the risk communication seesaw, originated with Jody. Much later, during the SARS outbreak, Jody started doing risk communication work on her own, especially on infectious disease outbreaks and other emergency situations in developing countries.

11. Also in 1985, a brand new radiological risk surfaced (literally): radon. A decay product of uranium in the soil, radon rises through the rock and soil until it surfaces – and if it happens to surface under your well-insulated house, it can accumulate to dangerous levels. Radon is thought to be the second leading U.S. cause of lung cancer after smoking. It’s especially big in Pennsylvania and New Jersey, both on the Reading Prong of uranium-bearing rock. (Interestingly, scientists at the Pennsylvania Bureau of Radiation Protection had been planning to investigate the risk of radon in 1979, but were distracted for half a decade by Three Mile Island.)

When radon first surfaced, New Jersey and Pennsylvania environmental health officials were convinced the problem was going to be panic – excessive alarm. I thought apathy was going to be a more common problem, and Neil Weinstein and I did some quick research to confirm that. (I was reminded of radon this past year, working with WHO, the CDC, and others on the H1N1 flu pandemic. Once again officials expected panic – so much so that many countries lobbied WHO not to call it a pandemic at all. Complacency has turned out to be a much bigger problem.)

Neil and I spent the last half of the 1980s studying how to persuade people to test their homes for radon and remediate if they found much. I’ll talk some more about that work tomorrow. It gave rise to the Precaution Adoption Process Model – mostly Neil’s model – which makes some very sophisticated predictions about what it takes to get people to take precautions against risks that aren’t emotionally upsetting.

12. In keeping with my leitmotif that good guys are no more honest than bad guys, though in the service of a better cause (and with less self-awareness about their dishonesty), let me tell you a radon story. The lion’s share of radon risk is to smokers, and people who live with smokers. This is because radon daughters (decay products of radon) attach themselves to smoke particles and ride them deep into people’s lungs. In less smoky environments radon is a lot less deadly.

EPA knew this from the outset. But it went out of its way to underplay the relationship between radon risk and smoking. It didn’t want to give nonsmokers a sense that radon wasn’t really very serious for them; and it didn’t want to give smokers a sense that if they coped with their radon problem it was okay to keep smoking. So all its materials for the general public averaged smoker and nonsmoker risk, thereby understating the risk to smokers by a little and overstating the risk to nonsmokers by a lot.

Does this remind you of anything with regard to flu? How about our endless mantra that flu kills 36,000 Americans a year? As most of you know, 90% of those 36,000 deaths are 65-and-over. Only about 3600 people under 65 die of flu in the average year. Seasonal flu (unlike pandemic flu) is mostly an inconvenience to the young, and a serious threat to the elderly. Seniors are 12.5% of the U.S. population and 90% of the U.S. seasonal flu deaths.
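
A quick back-of-the-envelope calculation, using only the proportions just cited (seniors are 12.5% of the population but 90% of the deaths), shows how lopsided the per-capita risk is – treat it as a rough ratio, not an official estimate:

$$\frac{0.90 / 0.125}{0.10 / 0.875} \;=\; \frac{7.2}{0.114} \;\approx\; 63$$

In other words, on those numbers a senior’s chance of dying of seasonal flu in a given year is very roughly sixty times that of someone under 65.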

We don’t exactly stress that in our flu vaccination promotional campaigns. We want younger people to get vaccinated – partly to protect their grandparents, partly to build the market for vaccines, partly to build the habit of getting vaccinated, partly because a week of feeling rotten is no picnic. So without exactly lying, we try to persuade young people that flu is deadlier to them than it actually is.

(By the way, that has made it much harder to explain how the H1N1 pandemic differs from the seasonal flu. The pandemic so far is much less deadly than the seasonal flu to people over 65, and a little more deadly than the seasonal flu to people under 65. So far, the first effect is swamping the second. If it keeps going the way it’s going now, the pandemic will kill a lot fewer than 36,000 Americans this year … but more than 3,600 young Americans. It’s almost impossible to explain how much worse the pandemic is than the seasonal flu for young people, without first coming out of the closet about how rarely the seasonal flu kills young people.)

13. I don’t want to imply that the good guys never put telling the truth ahead of our programmatic goals. It happens. 1985 was my first year under contract to the Environmental Defense Fund – an arrangement that continued for more than 20 years. (Fred Krupp, EDF’s executive director, was a student of mine at Michigan in the 1970s; I helped EDF hire him.)

Early in my work with EDF, the organization was launching its signature campaign against dioxin. A factoid of enormous interest to the campaign communication people (including me) was that human breast milk contained detectable and increasing amounts of dioxin. The campaigners wanted to feature this fact in a lot of our messaging: “Because of dioxin pollution, it’s not safe to nurse your baby!”

EDF scientist Ellen Silbergeld said no way. The evidence that breastfeeding is preferable to bottle-feeding, she said, was a lot stronger than the evidence that trace amounts of dioxin in breast milk might be harmful to infants. Persuading mothers to bottle-feed their children out of a fear of dioxin would thus be doing them damage … and there was no way she would let us say that.

This is a standard of integrity that environmentalists and public health advocates haven’t always lived up to (nor has EDF itself) – think for example of all the messaging about mercury and other contaminants in fish, without thinking through whether contaminated fish might still be a healthier choice than red meat.

14. Also in the mid-to-late 1980s, risk communication was invented. The first national conference with the phrase “risk communication” in its title took place in January 1986 in Washington DC. It was called, simply, “National Conference on Risk Communication,” and was sponsored jointly by the Environmental Protection Agency, the National Science Foundation, and the Conservation Foundation. I gave two presentations.

There had long been related fields, of course: health communication, safety communication, and environmental communication all existed already. Two things distinguished the new field of risk communication.

First of all, unlike the other three, it didn’t start with the assumption that the risk is dangerous and the problem is apathy; rather, that first risk communication conference focused on figuring out what to do when people are unduly upset about small hazards. (EPA, for example, was very preoccupied with trying to figure out how to persuade people that most factories and hazardous waste sites weren’t all that dangerous to live near.) There were long panels with titles like “building trust and credibility” and “nuclear power and nuclear phobia.”

The second difference: From the outset, risk communication was conceptualized as a dialogue, not a monologue. Alerting apathetic people IS mostly a monologue. But in order to reassure people who are excessively alarmed, you need to listen to their concerns, take them seriously, even do something about them.

“Risk communication” has since expanded to include health, safety, and environmental communication – to include alerting the apathetic as well as reassuring the upset. But it started with the latter … and I got in on the ground floor.

15. That was also when I came up with the formula that’s on my business card and probably should be on my gravestone (and that put my kids through college): “Risk = Hazard + Outrage.” The radon work with Neil Weinstein and that first national risk communication conference helped me realize that I had been working on three different problems without distinguishing them properly.

Sometimes people are apathetic about a serious risk and need to be alerted; sometimes people are upset about a small risk and need to be reassured; sometimes people are upset about a serious risk and need to be guided. (The fourth possibility: Sometimes people are apathetic about a small risk – and need nothing at all!) Ever since Three Mile Island in 1979 I had been struggling toward an understanding that these three risk communication paradigms had very little in common. For a while I had it framed as “apathy” versus “hysteria” – but then I came up with a better frame: hazard versus outrage.

In 1987 I published my last article with the word “hysteria” in the title: “Communicating Radon Risk: Alerting the Apathetic and Reassuring the Hysterical” – and my first article with the word “outrage” in the title: “Facing Public Outrage.” Published in EPA Journal, “Facing Public Outrage” included this paragraph:

To the experts, risk means expected annual mortality. But to the public (and even the experts when they go home at night), risk means much more than that. Let’s redefine terms. Call the death rate (what the experts mean by risk) “hazard.” Call all the other factors, collectively, “outrage.” Risk, then, is the sum of hazard and outrage. The public pays too little attention to hazard; the experts pay absolutely no attention to outrage. Not surprisingly, they rank risks differently.

“Risk = Hazard + Outrage” became my trademark (though I never trademarked it).

16. Finally, also in the mid-to-late 1980s, I gave corporate consulting another try. Just as activists and public health agencies were the obvious clients for high-hazard, low-outrage risk communication, companies were the obvious clients for low-hazard, high-outrage risk communication. Manufacturers were endlessly enraging and terrifying their neighbors, even when they weren’t actually killing them. As with Three Mile Island, my corporate consulting emerged from my research.

My Rutgers University colleague Michael Greenberg, a geographer-turned-cancer epidemiologist, asked me to join him in some research on how the media cover environmental risk and what can be done to encourage better coverage. We ended up getting a series of grants from something called the Hazardous Substance Research Center, an industry-funded consortium. Once again fearing for my purity, I made a big point of insisting that I would not tolerate any interference from our corporate funders. In particular, I insisted, I would never let them edit out conclusions in our reports that weren’t to their liking. Rather to my disappointment, they never tried.

What they did do is invite me to come to talk to their companies about the implications of our findings for how companies should communicate about corporate pollution. My main messages were things like acknowledge your prior misbehavior, acknowledge your current problems, admit your emissions aren’t zero-risk, give credit to your critics for making you improve, etc. I dubbed this set of approaches “outrage management” – the kind of risk communication you should do when a risk is very upsetting but not necessarily very dangerous. By the start of the 1990s, I had become an increasingly successful corporate consultant … busy enough that I gave up my Rutgers professorship and went into full-time consulting.

17. By the late 1980s – twenty years ago – the main outlines of my work were established. I try to divide my time as evenly as I can among the three risk communication paradigms:

  • precaution advocacy – alerting people who are insufficiently concerned about serious hazards (high hazard, low outrage);
  • outrage management – reassuring people who are excessively concerned about small hazards (low hazard, high outrage); and
  • crisis communication – guiding people who are appropriately concerned about serious hazards (high hazard, high outrage).

As you might expect, outrage management pays the best, and precaution advocacy pays the worst. So I get to tell myself that my corporate consulting subsidizes my work with activist groups and public health agencies.

18. But I shouldn’t leave you with the impression that the outrage management is always for corporations, while the precaution advocacy is always for activists and public health agencies. There are all sorts of unexpected combinations:

  • Corporations, for example, often hire me to figure out how to persuade employees to take safety rules more seriously. That’s precaution advocacy.
  • A few years ago, the Environmental Defense Fund successfully pressured EPA to force the chemical industry to do an enormous amount of toxicity testing on high-volume chemicals that had previously been grandfathered because they were commonly used long before environmental regulations existed. As you might expect, most toxicity testing of chemicals is done on animals – and People for the Ethical Treatment of Animals launched a massive campaign against EDF for inspiring the murder of untold millions of rodents. So one of the nation’s leading environmental groups was in the traditional corporate role: under attack by activists. All the habits of hyperbole that had worked pretty well for EDF when it was playing offense got in the way when it needed to play defense. EDF communicators had to stretch new muscles. They had to do outrage management.
  • Obviously, the same thing happens when public health agencies are under attack by anti-vaccination activists. And they typically handle it badly too. The hard outrage management lessons the chemical, oil, waste management, and energy industries have had to learn over the last 30 years – whether they learned those lessons from me or from someone else – have yet to be learned by most public health agencies. (I’ll talk about some of those lessons tomorrow.)
  • Organizations that are good at precaution advocacy are usually rotten at outrage management, because the strategies are so diametrically opposed. Outrage management, paradoxically, requires more integrity. You can exaggerate some when you’re trying to get apathetic people more concerned about a health problem – which is what health department communication often tries to do. But when people aren’t apathetic, when they’re skeptical, suspicious, worried about vaccine safety and government interference in their lives, when you’re trying to reassure them rather than alert them, then even small exaggerations (pretending to be 100% right when you’re only 98% right) can do you in.

19. I started trying to retire in 2001. Then 9/11 happened, and it seemed like the wrong time for a risk communication expert to retire. Then anthrax happened. Then SARS happened. Then bird flu happened. Then swine flu happened. After this pandemic is over I think maybe I’ll try to retire again – but nobody believes me anymore when I say it. In the meantime, the balance has shifted a bit for me – shifted back closer to what it was at the start of my career: less corporate work, more public interest work; less outrage management, more precaution advocacy and crisis communication.

But there’s more outrage management mixed in with the precaution advocacy and crisis communication than there used to be. Forty years ago when I was helping environmental groups persuade people to recycle their bottles, cans, and newspapers, lots of people didn’t want to bother – but only really kooky people worried that recycling might actually be harmful, that it might be part of a lefty plot to take over the world. Today when you try to persuade people to get their flu shots – this year, both of their flu shots – you face not just apathy about influenza but also outrage about vaccination, government intrusiveness, and government untrustworthiness.

20. If there’s an overarching concept that has dominated the 40 years since I switched my dissertation topic from book publishing economics to environmental communication, I think it’s the problem of trust. I don’t mean the problem that the public doesn’t trust my clients. It’s true that the public doesn’t trust my clients, but that’s not a problem – that’s an achievement. It would be a mistake to trust my clients … any of them. And it would be a pretty temporary mistake. Institutions that are trusted tend to abuse the trust; and then the public finds out; and then they’re not trusted anymore.

The problem isn’t that the public doesn’t trust my clients. The problem is that my clients expect the public to trust them. They keep asking to be trusted, instead of working to be accountable so they don’t need to be trusted.

And the problem is that my clients don’t trust the public.

That’s true whether the client is a large multinational corporation or an environmental activist group or a public health agency.

My typical corporate client is doing much less environmental harm than its critics believe – though more environmental harm than it claims. The company is arrogant, unresponsive, and less than candid. Its stakeholders figure that out (or sense it intuitively), and mistrust the company. They become outraged. Their justified mistrust and outrage lead them to reach inaccurate conclusions about substantive issues, the hazard. (The logic here: If you’re arrogant, unresponsive, and less than candid, you’re probably doing something deadly as well.) The company rightly notices that the public is over-estimating the hazard. Instead of attributing that to the company’s own outrage-arousing behavior, the company decides the public is stupid, all too easily manipulated by activist lies. Obviously, the company figures, the stupid public can’t be expected to keep complicated, nuanced truths in context; the stupid public is bound to overreact to any factoids that seem to suggest what they’re doing is dangerous. So the company decides to keep such factoids under wraps insofar as possible. And of course the company is itself outraged at the public’s unfair accusations. All of this makes the company behave in ways that are even more arrogant, unresponsive, and less-than-candid. That of course exacerbates the public’s mistrust and outrage, and the cycle continues.

In a nutshell, then, the public gets the outrage right – people are right that the company can’t be trusted. And the company gets the hazard right – the company is right that what it’s doing is much more benign than the public imagines. Obviously, this isn’t always true. Sometimes companies are trustworthy, and sometimes their emissions are horribly dangerous. But at least in the developed world, companies can’t get away with the sorts of environmental sins they routinely committed in past decades. They don’t dare try. As a rule, companies today are far more dishonest than dangerous.

I think much the same thing is true of public health agencies. Agencies still face a lot of apathy – I don’t want to imply that that traditional precaution advocacy problem is gone. But public health agencies increasingly face a lot of public outrage and mistrust as well. The outrage and mistrust result from a long history of “shaping” the truth – not quite lying, but working hard to emphasize the parts of the truth that will get people to do what the agency wants them to do (for their own good), and leaving out the parts of the truth that might mislead them into making unwise decisions. People sense that that’s what the agency is doing, and it makes them mistrustful. The agency’s reaction to the public’s mistrust is to feel vindicated in its partial dishonesty – and it becomes even less willing to acknowledge the pieces of the truth that it fears will lead people to make unwise choices. The agency’s critics, of course, trumpet and exaggerate exactly those pieces – which loom ever-larger because the agency hasn’t been candid about them.

21. I could give endless examples, but I want to end with just one – one that I don’t think anyone in this room is guilty of. It concerns vaccination – not flu vaccination or MMR vaccination – but polio vaccination. As some of you know, there are two polio vaccines; one’s a shot and the other is delivered orally. The oral vaccine is a live, weakened vaccine. It is significantly less safe than the injected (dead) vaccine, and it’s now illegal in most developed countries. But it’s a lot cheaper, and has other advantages that make it the vaccine of choice for developing countries, including the fact that the vaccine virus spreads to other non-vaccinated children – without their knowledge or consent – giving them protection against polio too. But about one vaccinee in a million gets polio from the vaccine itself, and from time to time there’s an outbreak of vaccine-derived polio cases in children who were not directly vaccinated. (Vaccine virus sheds in the stools of vaccinated children, and very occasionally reverts to a more virulent form that starts circulating in the community. Unvaccinated people are then at risk of catching polio from the mutated polio vaccine virus in their environment.)

A few years ago, Nigeria experienced the largest such outbreak on record. It started in 2005 and was reported to the World Health Organization and the U.S. Centers for Disease Control and Prevention in September 2006. But health authorities were reluctant to acknowledge it, especially in Nigeria. Why were the authorities reluctant to say anything? Religious leaders in parts of some Muslim countries, including Nigeria, have opposed the polio eradication program. They have claimed, among other things, that the program is a western genocidal plot. The polio risk from the oral vaccine is the germ of truth in that false belief, although of course the vaccine prevents orders of magnitude more polio than it causes.

Authorities feared that acknowledging the Nigerian vaccine-derived polio virus outbreak would give credence to the claims of anti-vaccine imams. Instead, of course, suppressing news of the outbreak has given credence to those claims.

I should add that the polio vaccination program in Nigeria hasn’t just withheld information about the vaccine-derived polio outbreak. It has routinely lied – flat-out lied – about whether the polio vaccine can give a vaccinee polio. The agencies managing the polio eradication effort track these cases; they write about them in MMWR and other professional publications. But when a Nigerian parent, journalist, or imam points out that a recently vaccinated child has come down with polio, the agencies (even as they test to see if this is a case of vaccine-acquired polio) routinely assert that the polio vaccine is absolutely safe, that the child must have been infected with the polio virus before getting vaccinated. They think this dishonesty is saving lives by protecting the credibility of the vaccination campaign. In the short term, they may be right. In the long term, it is costing lives by undermining the credibility of public health itself.

22. If you can’t think of any comparable examples from your own agency’s communications about public health controversies, catch me during lunch and I’ll see if I can’t prime the pump for you.

23. Am I claiming that public health agencies are just like multinational corporations? No. Corporations pay their people better. And public health agency people are more committed to the public interest than most corporate employees. The shared characteristic I am emphasizing is this: You both often face publics that don’t entirely trust you. You are mistrusted because you don’t always tell the whole truth. And you’re afraid to tell the whole truth because you’re mistrusted … and because you don’t trust the public.

24. Breaking that vicious cycle is a core job for the public health information professionals in NPHIC. It’s two jobs, really.

  • Sometimes – more often than not, I believe – your technical sources inside your agency give you an unbalanced story to start with. They tell you to tell people that hand-washing is the best way to prevent flu except vaccination, without telling you that flu is transmitted mostly by droplets and there is very little evidence that hand-washing helps much. They tell you to tell people that it’s important to get the seasonal flu vaccine early this year, without telling you that pandemics often supplant seasonal flu strains and there may be no seasonal flu this year at all. They tell you part of the truth, and expect you to write it that way. And too often you do. For those of you with a strong journalism background, go back to your roots. Ask tough questions inside your agency. Do a little internal investigative reporting. Exercise your journalistic shit-detector. Push your sources to tell you the whole story.
  • Once you have the whole story, then you need a different skill set: a risk communication skill set rather than a journalistic skill set. Over the past two decades, PR people in the corporate world have made a transition from a stenographic role to a policy role. They advise CEOs that “if we do that, people are going to get angry” and “if we say that, the opposition will be all over it.” Sometimes they misuse their policy role, helping their company be more deceptive more successfully (for a little while). But increasingly, corporate PR people are counseling something closer to candor. Realizing that bad news almost always gets out eventually, realizing that mistrust is high, they urge their leadership to trust the public with more of the truth. Public health communicators are behind in this transition from stenographer to policy-maker, but they’re getting there. Get there as quickly as you can. The biggest service you can render to the public health enterprise is to persuade health officials to trust the public with more of the truth.

Copyright © 2009 by Peter M. Sandman
