
Dr. Peter M. Sandman
Precaution Advocacy
(High Hazard, Low Outrage)

I got my start in risk communication – before the phrase was coined – as someone who helped environmental groups arouse public concern about the need for recycling, the dangers of factory emissions, and the like. My earliest publications, most of which are not available on this site or anywhere on the web, dealt with how to design and carry out environmental advocacy campaigns. For a list of these early publications, check out the entries for the 1970s in my C.V.

In the terminology I now use, my focus was on high-hazard low-outrage situations. The job was precaution advocacy: to arouse some healthy outrage and use it to mobilize people to take precautions or demand precautions.

I never abandoned this focus, but starting in the 1980s more of my time was spent on the opposite problem: reducing outrage in low-hazard high-outrage controversies. (See Outrage Management (Low Hazard, High Outrage).) There were two main exceptions.

One exception was work with Neil Weinstein and others on radon. Radon is an odorless, colorless gas and a naturally occurring carcinogen; it accumulates in buildings (especially well insulated buildings) and ends up in the lungs of the residents. When I first got involved in radon communication, authorities were afraid people would panic; Neil and I assured them the dominant problem would be apathy, not panic – and we launched a research program to figure out how to persuade people to test their homes for radon and, if high levels were found, to mitigate the problem.

The other major exception was employee safety. From time to time a client that had first employed me to help figure out how to reduce community outrage about one set of hazards came back for help when employees were ignoring or mishandling a more serious set of hazards. A mining company in Australia, for example, needed to persuade employees to obey safety regulations. The work I had done for environmental activists on arousing outrage was helpful here, but other approaches were needed too – sometimes the problem wasn’t really insufficient employee outrage at the hazard, but rather excessive employee outrage at the precaution: safety rules and safety equipment that chafed. I haven’t done any radon work in a long time, but I continue to consult and write on environmental activism and on risk communication aspects of employee safety.

Much more recently I have become heavily involved in a third venue for precaution advocacy: mobilizing people to take action with regard to infectious disease outbreaks, especially the risk of an influenza pandemic. My writing on pandemic communication is indexed in the “Pandemic Flu and Other Infectious Diseases Index.” But some articles focusing on arousing apathetic audiences are also listed here, while some that focus on helping people cope when the time comes are also listed in the “Crisis Communication Index.”

So the “Precaution Advocacy” articles listed below are derived from four main sources – work with environmental activists from the early 1970s through the present; work on radon communication in the 1980s and 1990s; work on employee safety from the 1990s through the present; and work on pandemic preparedness in the 2000s.


Topical Sections in Precaution Advocacy


On Precaution Advocacy Generally

  • When Residents Say ‘No’ to Aerial Mosquito Spraying

    by Michael Schulson

    Posted on the Undark Magazine website, October 25, 2019

    Mosquito Spraying versus Eastern Equine Encephalitis: Two Small Risks Compete for Public Outrage

    Email from Peter M. Sandman to Michael Schulson, October 16, 2019

    On October 16, 2019, I received an email from Michael Schulson, "a journalist for Undark Magazine, which covers science and society for a national audience from its headquarters at MIT." He wanted to interview me by phone on a controversy in southwestern Michigan about aerial mosquito spraying against Eastern Equine Encephalitis (EEE). Instead of arranging a call, I sent him a brief email outlining my view: that both aerial spraying and EEE are small risks; that for understandable reasons spraying tends to arouse more public outrage than mosquito-borne diseases like EEE; and that in their effort to counter anti-spraying outrage public health officials tend to hype the (small) risk of mosquito-borne diseases while playing down the (also small) risk of spraying. Michael used portions of my email at the end of his article, which quite rightly focused on the particulars of the Michigan controversy.

  • When It’s Okay for Health Officials to Panic, and When It’s Not

    by Faye Flam

    Posted on the Bloomberg Opinion website, October 6, 2019

    Honest versus Dishonest Teachable Moments in Public Health Warnings

    Interview with Peter Sandman by Faye Flam, September 24, 2019

    Dishonest E-Cig Warnings and the Ethics of Health Scares

    Two emails from Peter M. Sandman to Faye Flam, September 11 and September 13, 2019

    In May 2019, Bloomberg reporter Faye Flam interviewed me (link is to an audio MP3 file) for a story on e-cigarette risk communication – and my view that the public health establishment has been dishonestly alarmist about vaping, so much so that it risks scaring people into smoking instead. On September 10, 2019, she emailed me to ask about a follow-up interview. We exchanged a few emails about whether Bloomberg would let her write another “pro-vaping” article, given Michael Bloomberg’s fervent opposition. So we started emailing back and forth about “health scares” more generically. Two of my emails to Faye strike me as worth posting: one on September 11 about the recent spate of lung injuries linked to vaping (especially vaping illegal marijuana); the other on September 13 about when it is or isn’t appropriate for public health officials to try to frighten the public. Our eventual September 24 interview dealt largely with e-cigs but also addressed some other health scares (bird flu, equine encephalitis, red meat, climate change), as did her resulting October 6 article. The interview, of course, is a lot more detailed than the very brief article.

  • Are E-Cigs a Crisis? It’s Risky to Call Them ‘Unsafe’

    by Faye Flam

    Posted on the Bloomberg Opinion website, June 3, 2019

  • The U.S. Public Health Establishment Risks Scaring People into Smoking
    (Note: Link launches an MP3 audio file (41MB, 29 min.) on this site.)

    Interview with Peter Sandman by Faye Flam, May 28, 2019

    I am very nearly retired, but when Bloomberg commentator Faye Flam asked to interview me about e-cigarettes, I couldn’t resist saying yes. I have been highly critical of how the U.S. public health establishment smears e-cigs at least since 2015, when I posted “A Promising Candidate for Most Dangerously Dishonest Public Health News Release of the Year.” Precaution advocacy often exaggerates, and I am used to hyperbolic public health warnings about, say, the dangers of vaccine-preventable diseases. But such warnings can save lives even if they’re less than honest, which some say justifies the dishonesty. Warnings about e-cigs, on the other hand, could convince people that they might just as well smoke instead – a profound disservice if, as seems likely, vaping is an order of magnitude safer than smoking. Faye’s article is based on more than just her interview with me, and the interview has a lot of information she didn’t use in the article. So you might want to check out both.
    (In the same interview, Faye also asked me about the failed Dengvaxia vaccine campaign in the Philippines. I’ll post that part of the interview with a link to her Dengvaxia article if she writes one.)

  • False Sense of Security

    Website column

    Posted: May 25, 2018

    For decades I have maintained a file of articles in which experts claimed some precaution they disapproved of could give people a “false sense of security.” But until recently I didn’t focus enough on false sense of security as a genuine risk communication and risk management problem. Every precaution is also a communication; it tells people that they’re safer than they were previously, and thus implies that alternative precautions may be superfluous. Sometimes a precaution does a better job of making people feel safer than it does of making them actually safer – thus inculcating a false sense of security and potentially undermining other precautions. This column looks at both intentional and unintentional inculcation of a false sense of security. It tries to make a case for not overselling precautions. It also addresses some related phenomena: false sense of insecurity (precautions that work better than they seem); risk homeostasis and compensation; etc. The column makes substantial use of flu vaccination as an example, so I’ve indexed it under Infectious Diseases as well as Precaution Advocacy.

  • Why Do Risk Communication When Nobody’s Endangered and Nobody’s Upset (Yet)?

    Website column

    Posted: April 19, 2018

    Years ago I distinguished three paradigms of risk communication: precaution advocacy when hazard is high and outrage is low; outrage management when hazard is low and outrage is high; and crisis communication when both are high. But I have endlessly claimed that there’s no risk communication to be done when hazard and outrage are both low. That’s true when you’re pretty sure hazard and outrage will remain low. This column is a primer on what to do when one or the other is expected to climb: pre-precaution advocacy when hazard is likely to climb; pre-outrage management when outrage is likely to climb; and pre-crisis communication when both are likely to climb. There’s also an introductory section on how to surveil for increasing outrage. (Hazard surveillance isn’t my field.)

  • Confirmation Bias (Part One): How to Counter Your Audience’s Pre-Existing Beliefs

    Website column

    Posted: October 12, 2016

    Confirmation bias is our universal tendency to hang onto our beliefs in the face of evidence to the contrary. This column begins by describing the cognitive defenses that confirmation bias relies on: selective exposure, selective attention, selective perception, framing, selective interpretation, and selective retention. Then the column addresses strategies risk communicators can use to reduce their audience’s confirmation bias. The key is to avoid challenging the audience more than necessary by finding things (sometimes even irrelevant ones) to reinforce or agree with. The column closes with pointers on how to disagree when disagreeing is necessary. The entire column is about ways to overcome your audience’s confirmation bias; a sequel on ways to overcome your own confirmation bias is also on this site.

  • Why Most People Don’t Pay Much Attention to the Fact that Alcohol Is a Carcinogen

    by Peter M. Sandman

    Email in response to a query from Jennifer Chaussee, September 16, 2016

    On September 14, 2016, Jennifer Chaussee of Wired Magazine emailed me about a story she was writing on “why there is such low awareness around the link between alcohol consumption and increased cancer risk.” She said she was particularly interested in how my hazard-versus-outrage concept might be used to explain public apathy about the drinking-and-cancer connection. My emailed response two days later did cover the ground she wanted me to cover: the ways in which drinking-and-cancer is low-outrage. But I also discussed some other risk perception aspects of the issue. Jennifer’s excellent September 22 story was called “The Muddled Link between Booze and Cancer,” and focused on controversies over research results about the link. Her planned sidebar on outrage and public perception was dropped for space reasons. I’m posting the email I sent her anyway. I’m posting it under precaution advocacy even though it doesn’t really address how to get drinkers more worried about alcohol-and-cancer; it’s a good explanation of why that’s a tough assignment.

  • Scaring People: The Uses and Limitations of Fear Appeals
    (Note: Link launches an on-site audio file (50MB, 53 min.).)

    Part One of a two-part interview with Peter M. Sandman by George Whitney of Complete EM, July 22, 2016.

    George Whitney runs an emergency management consulting company called Complete EM. His website features a blog and a podcast series. On July 22, 2016 he interviewed me by phone for nearly two hours. He edited the interview into two podcasts, which he entitled “Dr. Peter Sandman – Risk Communication” and “Dr. Peter Sandman – Crisis Communication.” I have given them new titles.

    This interview segment isn’t really about emergency management or crisis communication at all. It’s about pre-crisis communication – a part of what I call precaution advocacy. When he briefed me for the interview, George had told me he wanted to focus on fear appeals. He thought emergency management professionals relied too much on fear in their warnings about earthquakes and other natural disasters, and he wanted to know whether I agreed. So for the first 45 minutes or so we talked about the uses and limitations of fear appeals. At the end of what became Part One of George’s two-part podcast, he asked me to reflect on what had changed in my 40+ years as a risk communication consultant. I cited two big changes: the slow migration from craft to science, and the growing understanding of what it takes to calm people who are more upset about some risk than you think they should be. (Part Two is “Crisis Communication for Emergency Managers.”)

  • High-profile cancer reviews trigger controversy

    by Kai Kupferschmidt

    Published in Science [subscription required], June 24, 2016

  • How IARC Talks about Coffee and Very Hot Beverage Carcinogenicity: “How Bad Is It” vs. “How Sure Are You”

    by Kai Kupferschmidt and Peter M. Sandman

    Email exchange with Kai Kupferschmidt of Science magazine, June 16–17, 2016

    Processed Meats, Cancer, and Risk Communication: The World Health Organization’s Non-Warning Warning

    by Peter M. Sandman

    October–November 2015 incomplete draft article sent as an attachment to Kai Kupferschmidt, June 17, 2016

    On June 16, 2016, Science reporter Kai Kupferschmidt sent me an email asking me to comment for a story he was writing on a June 15 announcement by the International Agency for Research on Cancer (IARC) about the possible carcinogenicity of coffee and very hot beverages. The issues this announcement raised were very similar to the issues raised by an earlier IARC announcement about the carcinogenicity of processed and red meats. The issue of greatest interest to me: the distinction between the seriousness of a risk (how bad is it?) and the quality of the evidence about that risk (how sure are you?). In my June 17 response to Kai I referenced, and attached, a draft article I never finished about that earlier IARC announcement. Kai’s story had room for one quote from my email, but nothing from my unfinished article.

  • Car Crashes and Mass Extinction Events: Communicating about High-Magnitude Low-Probability Risks

    by Peter M. Sandman

    Email in response to a query from Faye Flam, May 10, 2016

    An article in the April 29, 2016 issue of The Atlantic focused on a study claiming that the average person is likelier to die in a mass extinction event than in a car accident. On May 4 Faye Flam asked me to comment for an article she wanted to write for Bloomberg News about the resulting controversy, noting: “I think there’s probably a bigger story about misleading use of statistics and confusion about risk.” The “bigger story” I saw was a bit different: how to communicate about high-magnitude low-probability risks – the sorts of risks that people either exaggerate (if the risk arouses a lot of outrage and they focus on its high magnitude) or shrug off (if the risk arouses very little outrage and they focus on its low probability). On May 10 I emailed Faye this response. She wrote her story, but on May 17 the Bloomberg News editors decided not to run it, judging that the news peg – the Atlantic mass extinction article – was no longer of much interest to their readers.

  • A Must-Read Treatment of the CDC’s Campaign of Deception Regarding E-Cigarettes

    by Michael Siegel

    Posted on “The Rest of the Story: Tobacco News Analysis and Commentary,” June 24, 2015

    Dr. Michael Siegel teaches in the Department of Community Health Sciences, Boston University School of Public Health. He’s an M.D. with a longtime interest in public health communication, whose writing has migrated from an anti-smoking focus to an anti-anti-vaping focus. If you’re interested in the possibility that electronic cigarettes might not be as bad as public health’s anti-e-cig vendetta claims, Dr. Siegel’s blog is must reading. So of course I am pleased that he describes as must reading my May 2015 column about a dishonest news release from the U.S. Centers for Disease Control and Prevention about teenage use of tobacco. His article is a solid summary of my column’s critique of how the CDC willfully misinterpreted the newest data on teenage smoking (down) and vaping (up), intentionally fostering the misimpression that the news was bad in order to nurture its anti-vaping campaign.

  • CDC, Sandman, and finding an “honest” appraisal of e-cigarettes

    by Andrew Holtz

    Posted on the HealthNewsReview website, June 12, 2015

    HealthNewsReview does wonderful online assessments of medical reporting and medical public relations, analyzing both exemplary and substandard work. Though its focus is mostly on drugs and treatments, not public health, I thought it might be interested in my May 2015 column excoriating a very dishonest news release from the U.S. Centers for Disease Control and Prevention about teenage use of tobacco. In my view, the release was way too alarmist about an increase in teenage vaping, failing to pay appropriate attention to the decrease in teenage smoking, to the near-certainty that vaping is orders of magnitude less dangerous than smoking, and to the possibility that vaping might turn out to be a replacement for smoking rather than a gateway to smoking. I sent HealthNewsReview a quick note about the column, and was delighted when it asked medical journalist Andrew Holtz to write something about it. Holtz summarized my critique, did his own assessment of the CDC release, and went on to offer some generic advice (and useful sources) for reporters trying to cover the e-cigarette controversy.

  • A Promising Candidate for Most Dangerously Dishonest Public Health News Release of the Year

    Website column

    Posted: May 27, 2015

    This column is about electronic cigarettes (e-cigs) – specifically about an April 2015 CDC survey report, news release, and press briefing announcing a substantial increase in teenage vaping (use of e-cigs) in the United States. What infuriated me – and led me to entitle this column “A Promising Candidate for Most Dangerously Dishonest Public Health News Release of the Year” – was the failure to pay appropriate attention to the decrease in teenage smoking, to the near-certainty that vaping is orders of magnitude less dangerous than smoking, and to the possibility that vaping might turn out to be a replacement for smoking rather than a gateway to smoking. The column documents these dishonesties not just in the release but in the report and briefing as well. In fact, the briefing is by a considerable margin the most misleading of the three. But the release undoubtedly had the most influence on how the media covered the story.

  • Talking to the Public about Emergency Preparedness
    Link launches an on-site audio file (79.5MB, 49 min.)

    Interview with Peter M. Sandman by Marisa Raphael, New York City Department of Health and Mental Hygiene, February 27, 2014

    Marisa Raphael is Deputy Commissioner at the Office of Emergency Preparedness and Response of the New York City Department of Health and Mental Hygiene. She is also participating in the National Preparedness Leadership Institute (NPLI) at Harvard University. On February 27, 2014, she interviewed me for an hour by telephone on behalf of an NPLI project on ways to improve emergency preparedness communications with the general public. Although we spent a little time at the end of the interview covering some basics for communicating mid-crisis, we stuck mostly to pre-crisis communication, a kind of precaution advocacy. We covered two main topics. First we talked about why it’s so hard to build citizen support for government emergency preparedness expenditures, and what kind of messaging strategies are likeliest to lead to such support. Then we switched to a more conventional topic: how to motivate people to do their own personal, family, or neighborhood emergency preparedness.

  • Terrorists vs. Bathtubs
    Link goes off-site to a page with a link to this 10-min. audio.

    (Edited) interview with Peter Sandman by Brooke Gladstone, June 20, 2013

    Aired on National Public Radio’s “On the Media” and posted on its website, June 21, 2013

    Risk Communication in Practice
    Link launches an on-site audio file (79.5MB, 49 min.)

    (Complete) interview with Peter Sandman by Brooke Gladstone, June 21, 2013

    Brooke Gladstone of “On the Media” interviewed me in my home for 49 minutes. We started out talking about claims by opponents of NSA telephone and email surveillance (in the wake of the Edward Snowden leaks) that “more people have died from [whatever] than from terrorism” – and why these sorts of risk comparisons are unlikely to be convincing. That soon got me to the distinction between hazard and outrage. But Brooke didn’t let me do my usual hazard-versus-outrage introductory shtick. Instead, she kept asking for specifics – examples of how precaution advocacy and outrage management strategies work in practice. Toward the end of the interview, she pushed me to shoot from the hip about applications I hadn’t thought through: How would I use risk communication to defend government surveillance? To oppose it? To defend shale gas “fracking”? To oppose that? The interview that resulted is a different sort of introduction to risk communication than the one I usually give. The 10-minute broadcast segment is nicely edited; it’s very smooth and covers most of my main points. But I prefer the roughness and detail of the complete interview.

  • Managing Risk Familiarity

    Website column

    Posted: November 3, 2012

    Familiar risks lose their capacity to provoke outrage, and people get careless. Unfamiliar risks, on the other hand, are likelier to be upsetting. So if you’re doing outrage management – if you’re a factory manager trying to keep your neighbors calm, for example – familiarity is your ally. But if you’re doing precaution advocacy – an activist trying to arouse public concern, or a safety professional trying to motivate employees to wear their hardhats – familiarity is your enemy. Either way, managing familiarity is a significant part of the risk communication job. Those are the basics. But this column goes beyond the basics, getting down in the weeds of managing risk familiarity. It focuses especially on two distinctions: the distinction between familiarity and perceived familiarity (fluency); and the distinction among familiarity with the overall situation, familiarity with the risk, and familiarity with the bad outcome (memorability).

  • Misoversimplification: The Communicative Accuracy Standard Distinguishes Simplifying from Misleading

    Website column

    Posted: June 5, 2012

    The need to simplify technical content is not an acceptable excuse for “simplifying out” information that is essential to credibility – especially information that seems to contradict your message, and that will therefore undermine your credibility if you leave it out and your audience learns it elsewhere. The obligation to include that sort of information is called the communicative accuracy standard; the failure to include it might appropriately be called “misoversimplification.” The column distinguishes three levels of misoversimplification, depending partly on how controversial the issue is and partly on whether you’re on the warning (precaution advocacy) or reassuring (outrage management) side. The three levels are illustrated with infectious disease examples: whooping cough, bird flu, and polio.

  • Motivating Attention: Why People Learn about Risk … or Anything Else

    Website column

    Posted: March 31, 2012

    The audience of precaution advocacy messages is quite likely to be apathetic, to find the information (safety information, for example) boring. This column outlines the only four ways I know to get people to learn risk information or any information. The first answer, learning without involvement, requires more budget than precaution advocacy campaigns usually have. The second answer, interest/entertainment, is also tough to achieve, though it’s always worth trying. So the column focuses mostly on the remaining two options. Giving people a “need to know,” such as a pending decision that requires the information, is a powerful tool of precaution advocacy. Also powerful, and psychologically much more complex, is getting people to see the information as ammunition – for example, motivating them to do something they’ve never done before, and then offering the information as a rationale that helps them make sense of the new behavior.

  • Explaining and Proclaiming Uncertainty: Risk Communication Lessons from Germany’s Deadly E. coli Outbreak

    Website column by Peter M. Sandman and Jody Lanard

    Posted: August 14, 2011

    Together with my wife and colleague Jody Lanard, I have long advised clients to release risk information early – and since early information is almost always uncertain, to acknowledge the uncertainty. But even when clients (and non-clients) do what we consider a pretty decent job of acknowledging uncertainty, they often end up in reputational trouble when they turn out wrong, largely because journalists and the public misperceive and misremember their statements as having been far more confident than they actually were. So we have come to believe that it’s not enough to acknowledge uncertainty; you have to proclaim uncertainty, repeatedly and emphatically. This long column uses a severe German E. coli food poisoning outbreak in 2011 to explore the complexities of proclaiming uncertainty: the myriad ways government agencies and industry spokespeople get it wrong, and some recommendations for getting it right … or at least righter. Proclaiming uncertainty is important in all kinds of risk communication – outrage management as much as precaution advocacy and crisis communication. But our focus here is mostly on how to warn people about an imminent, uncertain risk: in this case, how to tell people which foods not to eat because you think they might be contaminated and deadly.

  • The Law of Conservation of Outrage: Outrage Is Limited – Do You Need More or Less?

    Website column

    Posted: April 14, 2011

    I speak and write endlessly about ways to increase people’s outrage when you think they’re insufficiently upset about a serious risk and ways to decrease their outrage when you think they’re excessively upset about a not-so-serious risk. I call these two kinds of risk communication “precaution advocacy” and “outrage management” respectively. This column makes a point I too often forget to mention: Except in emergencies (real or imagined), it’s impossible to get people more or less outraged. Mostly what we do is reallocate their outrage. The column calls this “the Law of Conservation of Outrage,” and discusses six corollaries that are fundamental to risk communication: the natural state of humankind vis-à-vis any specific risk is apathy; outrage is a competition; there’s no reason to worry about turning people into scaredy-cats; if people are more outraged at you than the situation justifies, you’re doing something wrong; excessive outrage aimed at you isn’t your critics’ fault; and outrage causes hazard perception – and we know what causes outrage.

  • Three Paradigms of Radiological Risk Communication

    Presented to the National Public Health Information Coalition, Miami Beach FL, October 21, 2009

    Posted: January 2, 2010

    Although this six-hour seminar was entitled “Three Paradigms of Radiological Risk Communication,” NPHIC asked me to go easy on the “radiological” part and give participants a broad introduction to my approach to risk communication, mentioning radiation issues from time to time. So that’s what I did.

    Fair warning: These are not professional videos. NPHIC member Joe Rebele put a camera in the back of the room and let it run. You won’t lose much listening to the MP3 audio files on this site instead.

    Part One (90 min.)

    Part One is an introduction to the hazard-versus-outrage distinction and the three paradigms of risk communication.

    Part Two (155 min.)

    Part Two discusses the seesaw and other risk communication games (thus completing the introductory segment), then spends a little over an hour each on some key strategies of precaution advocacy and outrage management.

    Part Three (72 min.)

    Part Three is a rundown on some key crisis communication strategies.

    See especially Part Two.

  • The Precaution Adoption Process Model
    (Note: Link is to a PDF file.)

    by Neil D. Weinstein, Peter M. Sandman, and Susan J. Blalock

    Health Behavior and Health Education, 4th ed., edited by Karen Glanz, Barbara K. Rimer, and K. Viswanath (San Francisco: Jossey-Bass, 2008), pp. 123–147.

    The Precaution Adoption Process Model (PAPM) was developed mostly by Neil Weinstein, with some help from me. It is an attempt to identify the stages people must pass through on the way to adopting a new precaution: unaware, uninvolved, undecided, decided to act, acting, and maintaining action. It also tries to identify the interventions most likely to move people from one stage to the next. This 2008 book chapter summarizes the PAPM – how it differs from non-stage theories and competing stage theories of health-protective behavior; the justification for the stages specified; the advantages of stage-matched interventions; research testing the PAPM and research using it; etc. Two earlier articles applying the PAPM to a specific example, radon risk, are also on this website: “A Model of the Precaution Adoption Process: Evidence From Home Radon Testing” and “Experimental Evidence for Stages of Health Behavior Change: The Precaution Adoption Process Model Applied to Home Radon Testing.” This chapter is a better introduction: more recent, broader, and less quantitative.

  • “Watch Out!” – How to Warn Apathetic People

    Website column

    Posted: November 9, 2007

    This column is a primer on precaution advocacy – that is, on high-hazard low-outrage risk communication, where the job is to increase outrage and thus to motivate apathetic people to take precautions (or demand precautions). Apathy isn’t always the problem when people are ignoring a serious risk – they could be in denial, for example, or they could have reasons to dislike the recommended precautions. But when apathy is the problem, this column is a good place to start. It’s a quick rundown of twenty basic precaution advocacy principles.

  • When to Release Risk Information: Early – But Expect Criticism Anyway

    Website column by Peter M. Sandman and Jody Lanard

    Posted: April 16, 2005

    In February 2005, the New York City health department issued a warning about a possibly disastrous new strain of AIDS. It was widely criticized for alarming people before it had solid evidence that the strain was spreading. Also in February 2005, the United Kingdom’s Food Standards Agency held off announcing that many prepared foods were contaminated with tiny amounts of the banned red dye Sudan 1, because it wanted to prepare a list of affected products first. It was widely criticized for the delay. Obviously, when to release risk information is a tough call. In this column, Jody Lanard and I lay out the pros and cons, and conclude that early is almost always better than late. We also analyze the New York City decision in detail, and offer some ways to reduce the downsides of early release.

  • Worst Case Scenarios

    Website column

    Posted: August 28, 2004

    Most of this long column is addressed to risk communicators whose goal is to keep their audience unconcerned. So naturally they’d rather not talk about awful but unlikely worst case scenarios. The column details their reluctance even to mention worst case scenarios, and their tendency, when they finally get around to discussing them, to do so over-reassuringly. It explains why this is unwise – why people (especially outraged people) tend to overreact to worst case scenarios when the available information is scanty or over-reassuring. Then the column lists 25 guidelines for explaining worst case scenarios properly. Finally, a postscript addresses the opposite problem. Suppose you’re not trying to reassure people about worst case scenarios; you’re trying to warn them. How can you do that more effectively?

  • When People Are “Under-Reacting” to Risk

    Website column

    Posted: July 14, 2004

    When you think people are under-reacting to a risk, the usual diagnosis is “apathy” and the usual prescription is some sort of precaution advocacy: “This could kill you! Here’s how to protect yourself.” This short column is a checklist of questions to consider – in sequence – before jumping to the conclusion that apathy is the right diagnosis and precaution advocacy is the right prescription. Some of the alternatives (not paying attention, for example) are very familiar to safety professionals. Others (such as problems with self-efficacy and fatalism) are often missed.

  • Communications to Reduce Risk Underestimation and Overestimation

    by Peter M. Sandman, Neil D. Weinstein, and William K. Hallman

    Risk Decision and Policy 3 (2), 93–108 (1998)

    The experiment reported in this article deals with ways of depicting risk when you’re trying to get people to realize how serious the risk is ... or how serious it isn’t. In other words, how do you explain risk data so your audience will neither underestimate nor overestimate seriousness? The study shows some strategies that help, even in the face of outrage. The study also documents – for readers who need it documented – that outrage does make people consider a risk more serious.

  • Scared stiff – or scared into action

    by Peter M. Sandman and JoAnn M. Valenti

    Bulletin of the Atomic Scientists, Jan. 1986, pp. 12–16

    This 1986 article aimed at helping peace activists develop communication strategies that wouldn’t deepen people’s “psychic numbing” about nuclear weapons. Though its political content is out of date, its prescription – anger, love, hope, and action – is relevant today to coping with public denial about terrorism. (For terrorism I would want to add to the prescription the need to acknowledge and share the underlying fears – what people are “really” afraid of.)

  • Medicine and Mass Communication: An Agenda for Physicians

    Annals of Internal Medicine 85:378–383, 1976

    This discussion of how the media influence their audience and how doctors can influence the media was written 25 years ago – before managed care and before cable. Despite the outdated specifics about both medicine and media, the principles have held up well. This is still a pretty good primer on media impact and how to horn in.


Handout Set


On Environmental Activism

  • Editorial: Standing on the edge of time

    by Milan Rai and Emily Johns

    Published in Peace News, June 2014

    This short editorial from “the newspaper for the UK grassroots peace and justice movement” draws heavily on “Scared stiff – or scared into action,” an article I coauthored with JoAnn M. Valenti and published in the January 1986 issue of the Bulletin of the Atomic Scientists. That article focused on the ways in which activists opposing nuclear weapons might be frightening people into numbness, paralysis, or denial instead of inspiring them to join the cause. The Rai and Johns editorial applies some of our points to climate change activism, noting that “the disciplines of risk communication and disaster psychology may help us [climate change campaigners] to think new thoughts, and find new ways forward.” I have also written about this connection, in a 2009 column on “Climate Change Risk Communication: The Problem of Psychological Denial.”

  • Despite near certainty in new UN report, a climate of denial persists
    (Note: Link goes off-site to a page with a link to this 5-min. audio.)

    Interview with Peter M. Sandman by Marco Werman, aired on “The World” on PRI (Public Radio International) and posted on its website, September 27, 2013

    When the Intergovernmental Panel on Climate Change released its new report – claiming more certainty than ever before that the global warming threat is dire – Marco Werman of PRI’s “The World” interviewed me about why I thought many people might find the report’s conclusions hard to accept, and might go into a kind of psychological denial instead. The interview lasted about ten minutes, but was cut to less than five for airing. Too many of the minor points I made got used, albeit in abbreviated form, so my main point got almost completely lost – that climate change activists were their own worst enemies because they kept saying things that were likely to provoke or deepen people’s denial instead of things that could help people overcome their denial. For example, I told Marco, too many environmentalists were greeting the IPCC’s bad news triumphantly, almost gleefully – sounding more pleased that they were being proved right than devastated that the world’s in deep trouble. People who like their SUVs and are having a hard time accepting that they may have to give up their SUVs (that’s a kind of denial) may just barely be able to believe it if a fellow SUV fan sadly tells them so. They’re not about to believe it if it’s exultantly announced by someone who has hated the internal combustion engine since before global climate change was even an issue. For several better explanations of my thinking about climate change denial, see any of the other entries with “climate” and/or “denial” in their titles in the “On Environmental Activism” section of my Precaution Advocacy index.

  • Climate Change Risk Communication: Outrage Management, Not Just Precaution Advocacy
    Link launches an on-site audio file (17MB, 48 min.)

    Taped for Freakonomics Radio, July 25, 2011

    This was a 48-minute telephone interview with Stephen Dubner, for a Freakonomics Radio program (and podcast) on climate change. The interview never made it into the program/podcast, but excerpts were added to the Freakonomics website on November 29, 2011. The first 17 minutes of the interview are generic – Risk Communication 101, basically. The rest is grounded mostly in my 2009 column on “Climate Change Risk Communication: The Problem of Psychological Denial,” though Dubner periodically pushed me to speculate on new aspects of the topic. My main argument: Climate change risk communicators are good at informing and scaring apathetic people, but need an entirely different strategy – something more like outrage management – for people who are in denial about climate change.

  • Climate Change Risk Communication Dialogue

    by Stephen L. Brown, Peter M. Sandman, Steve Long, Betty K. Jensen, and Ron Law

    Excerpts from the RISKANAL listserv, March 24–25, 2009 (plus some follow-up offline correspondence)

    In late March of 2009, discussion on the RISKANAL (risk analysis) listserv turned to the psychology of people – including people on the listserv – who are skeptical about global climate change. I had recently dealt with this question in a column for this website on “Climate Change Risk Communication: The Problem of Psychological Denial.” So I posted a comment on the listserv referencing and summarizing the column. The resulting brief dialogue dealt with the motives not just of global warming skeptics but also of global warming supporters. And it led to a further discussion of whether strategic persuasion (on behalf of global warming or any topic) is antithetical to sincerity. I thought it was a good, thoughtful and respectful discussion – worth reprinting here (with the permission of all the participants). After the RISKANAL discussion petered out, I continued to exchange emails (also posted here) with one participant in the dialogue, Stephen L. Brown. Our focus slowly shifted from climate change risk communication to outrage and outrage management – and led to some observations on Steve’s part about outrage that I think are well worth reading, whether you’re interested in global warming or not.

    To join RISKANAL, send the following email message to lyris@lyris.pnl.gov:
    SUBSCRIBE RISKANAL First_Name Last_Name

  • Climate Change Risk Communication: The Problem of Psychological Denial

    Website column

    Posted: February 11, 2009

    Arousing apathetic people to care enough about global warming that they’re actually willing to do something about it is a difficult precaution advocacy challenge. Activists are chipping away at that task with slow but significant success. But there’s another audience for climate change risk communication that I think activists aren’t paying nearly enough attention to: people who are in denial about the crisis because it threatens the way they see the world or because it arouses intolerable levels of fear, guilt, sadness, hopelessness, or other emotions. For people in or near denial, outrage is high, not low; the risk communication paradigm is crisis communication, not precaution advocacy. This long column builds a case that global warming denial is a growing problem, and that messaging designed to work on apathetic audiences can easily backfire on audiences in denial. The column focuses on six common activist messages that need to be rethought in terms of their likely negative impact on people who are in or near global warming denial: fear-mongering; guilt-tripping; excessive hostility to narrow technological solutions; unwillingness to pay attention to climate change adaptation; over-reliance on depressing information and imagery; and one-sided contempt for contrarian arguments.

  • Broadcast on PRI’s “The World,” November 21, 2008

    Radio reporter Jason Margolis of “The World” attended a conference of global climate change skeptics, decided they were more deniers than actual skeptics, and ended up with a 10-minute story on climate change denial. I was one of several experts he quoted to explore the reasons why so many people have trouble facing the threat of global warming. In our interview, I focused on some ways activist communications may unwittingly encourage audience denial. Jason used the part on guilt – on why telling people their lifestyle is destroying the earth may not be the best way to inspire them to action. My views are elaborated further in a 2009 column on “Climate Change Risk Communication: The Problem of Psychological Denial.”

  • Chapter 11, “Media Campaigns” in Environmental Education & Communication for a Sustainable World

    Edited by Brian A. Day and Martha C. Monroe

    Published by the Academy for Educational Development, 2000

    Brian Day was my graduate student before going on to do communications for Environmental Defense Fund, GreenCOM, and other environmental advocacy efforts. In this chapter from a book he co-edited with Martha Monroe, Brian outlines a persuasion theory I taught him back in the 1970s – an approach to precaution advocacy that uses both an information-based component and a need-based component.


On Radon Testing and Mitigation

  • Experimental Evidence for Stages of Health Behavior Change: The Precaution Adoption Process Model Applied to Home Radon Testing

    by Neil D. Weinstein, Judith E. Lyon, Peter M. Sandman, and Cara L. Cuite

    Health Psychology, 1998, Vol. 17, No. 5, pp. 445–453

    This is one of two articles I have posted dealing with the Precaution Adoption Process Model, developed mostly by Neil Weinstein and tested by Neil and me (and colleagues) using radon as the test case. The other article, A Model of the Precaution Adoption Process: Evidence From Home Radon Testing, is statistically heavier going and methodologically less rigorous, but covers more ground: It says more about how people decide to test their homes for radon, and contains a more detailed description of the model itself. This one has more convincing evidence that people decide to take precautions – in this case to test for radon – in stages, and that different interventions work best at different stages. See also “The Precaution Adoption Process Model” (link is to a PDF file), a 2008 book chapter that overviews the PAPM more generally.

  • A Model of the Precaution Adoption Process: Evidence From Home Radon Testing

    by Neil D. Weinstein and Peter M. Sandman

    Health Psychology, 1992, 11(3), pp. 170–180

    For about a decade, Neil Weinstein and I (with colleagues) did research on radon – a high-hazard low-outrage risk that first became important in the mid-1980s. This article uses several of our radon data sets to illustrate Neil’s Precaution Adoption Process Model (PAPM). The PAPM is one of several contending models of how people actually decide whether or not to protect themselves from risks. Different models lead to different interventions, so the competition over which model best explains people’s behavior is important for those trying to persuade publics to take precautions about serious hazards.

    (I’ve also posted another article, Experimental Evidence for Stages of Health Behavior Change: The Precaution Adoption Process Model Applied to Home Radon Testing, on the PAPM and radon, this one reporting a later experiment demonstrating that the decision to test does happen in separate stages. See also “The Precaution Adoption Process Model” (link is to a PDF file), a 2008 book chapter that overviews the PAPM more generally.)

  • Promoting Remedial Response to the Risk of Radon: Are Information Campaigns Enough?

    Science, Technology, & Human Values, Vol. 14 No. 4, Autumn 1989, pp. 360–379

    Most of the research on radon risk response – mine as well as others’ – has focused on how to persuade people to test. This article shows that even after people have tested and found a high radon level, persuading them to do something about the problem isn’t easy ... and mere information isn’t what does the trick. When this research was done, radon was a new issue; the findings reported here may be more useful for those working on other new issues than for those working on the now-familiar radon problem.


On Employee Safety

  • Strategic Safety Communication: The GAAMM Model to Inform People about Serious Risks

    Website column

    Posted: October 16, 2015

    This column outlines a precaution advocacy message design strategy I have been teaching and using since the 1970s: GAAMM. Instead of starting by deciding what you want to say, you start with what you want to accomplish, your goals. Then you figure out whom you need to reach to achieve your goals; those are your audiences. Then comes the most complicated step: deciding what preexisting appeals you can harness to steer your audiences toward your goals. (Preexisting barriers are also worth considering, but they’re usually secondary.) Then you can choose media and messengers that are compatible with your audiences and appeals. Finally, based on your appeals, media, and messengers, you can draft your messages. As the column keeps stressing, GAAMM is a model that works only for precaution advocacy – trying to arouse concern in apathetic people. The strategy for reducing concern in overly upset people is completely different.

  • Warning, or False Alarm: Why Safety Professionals See Near Misses Differently than Everybody Else

    Website column

    Posted: January 4, 2015

    When you do something that might have caused an accident but doesn’t, there are two ways to interpret the near miss: as a warning that you should be more careful, or as evidence that the behavior in question is actually pretty safe. Safety professionals tend toward the first interpretation; everybody else favors the second. This column discusses several factors that affect near miss perception: hindsight bias, the gambler’s fallacy, learned overconfidence, “resilient near misses” versus “vulnerable near misses,” and vividness. It hypothesizes that a crucial distinction is whether you know the behavior in question often causes an accident (so you see the near miss as a warning) or you don’t know how dangerous the behavior is (so you see the near miss as evidence the risk is low). The column ends with advice for safety communicators trying to use near misses as warnings.

  • Seven Sources of Your Safety Problem: Where Does Risk Communication Fit?

    Website column

    Posted: September 5, 2012

    I started my career doing what I now call “precaution advocacy,” helping activist groups figure out how to arouse environmental concern in apathetic publics. But the bulk of my corporate consulting over the past 40 years has focused instead on “outrage management,” figuring out how to calm people who are excessively concerned. Most of my outrage management clients have never asked my advice on precaution advocacy, not even in its most obviously relevant manifestation: improving worker safety. This column addresses seven sorts of safety problems, and outlines where I think risk communication can help with each. I’m particularly interested in the last of the seven: whether my signature concept of outrage might conceivably be the next new thing in occupational safety.

  • Recession Risk Communication: How to Focus on Safety When Employees Are Demoralized

    Website column

    Posted: April 19, 2009

    Tough economic times are tough on safety. Workers may be distracted or distressed, while safety budgets (like all budgets) may be reduced. Tough economic times are also tough on safety controversies. Not only do workers have real reasons to suspect that they might be more endangered than usual; they also have less patience and forbearance, and perhaps more motivation to project their economic worries onto an on-the-job safety situation. This short column for industrial hygienists offers some tips on ways to adjust safety risk communication when the economic situation is bad.

  • The Boss’s Outrage (Part I): Talking with Top Management about Safety

    Website column

    Posted: January 7, 2007

    I have long been interested in why corporate managements reject safety improvements that look eminently cost-effective – in some cases, improvements that have a better return-on-investment than the company’s principal product line. This short column explores some outrage-grounded reasons why senior managers might shy away from sensible safety investments. Among them: guilt/responsibility, ego/stature, hostility/contempt, fear/denial, and performance anxiety. The column suggests some ways safety professionals can break the logjam when factors like these are keeping their companies from making safety progress.

  • Selling Safety: Business Case or Values Case

    Published in The Synergist, December 2005, pp. 30–35

    I have long argued that corporate environmental performance is better “sold” to stakeholders as a response to pressure than as a self-motivated commitment to the environment; I think claiming to be responsive is both truer and more credible than claiming to be responsible. In this article I make the same case about “selling” safety to employees. When management says it cares more about safety than productivity or profit, I argue, employees are likely to conclude that safety rules have more to do with company PR than company policy, and may “loyally” rather than rebelliously disobey. The article also discusses why both safety professionals and top corporate managers enjoy making a values case for safety, and resist making the business case I think they should make.

  • Getting Workers to Wear PPE: Communication Is Key

    by Jennifer Busick, MPH

    Published in Safety Compliance Letter, September 2005, pp. 7, 10

    Employees may resist wearing personal protective equipment (PPE) for all sorts of reasons: It’s uncomfortable; it interferes with productivity; it’s not the macho thing to do; management doesn’t really mean it; the safety person’s warnings sound a lot like my mother. This article discusses some of my ideas about how to be convincing in the face of these reasons.

  • by Dave Johnson

    ISHN E-News, July–September 2003

    This three-part interview was published in ISHN E-News. Excerpts were also published in the September 2003 issue of ISHN (Industrial Safety & Hygiene News) – the paper version, which is also on-line – under the title “Charting your course: 25 keys to safety success – Advice from Dan Petersen, Peter Sandman & John Henshaw.”

  • Beyond Duct Tape

    by Dave Johnson

    Published in ISHN (Industrial Safety & Hygiene News), March 28, 2003

    Jody Lanard and I wrote “Duct Tape Risk Communication” to analyze the weird public response to the U.S. Government advice to stockpile duct tape for use against some kinds of terrorist attacks. Dave Johnson saw an analogy to the weird way employees sometimes respond to safety messaging, and went with it.

  • Motivated Inattention and Safety Management

    Published in safety AT WORK, 30 October 2001

    This interview focuses on the “other side” of risk communication – how to persuade people to take risk more seriously. It deals mostly with two problems: employees who ignore safety procedures even though they have been well trained, and employers who ignore safety opportunities even though they are cost-effective. Both problems have their roots in outrage.


On Infectious Diseases and
Pandemic Preparedness

  • Flu Preparedness: An Even Tougher Sell than Usual

    Website column

    Posted: September 9, 2009

    I wrote this short column in early June 2009 for The Synergist, a magazine for industrial hygienists, on some ways of communicating about flu – seasonal and pandemic – in the workplace. When I wrote it, most people had “recovered” from what they considered the spring “swine flu scare,” and they were in no mood to listen to any more influenza warnings. By the time the column was published in September 2009, some of the complacency had waned, and people were actually girding up (a bit) for another wave of mild-but-pervasive pandemic illness. So the column’s claim that flu preparedness is a tough sell needs to be modified somewhat. But its actual recommendations still stand.

  • Convincing Health Care Workers to Get a Flu Shot … Without the Hype

    Website column by Peter M. Sandman and Jody Lanard

    Posted: January 10, 2009

    Convincing health care workers to get a flu shot might normally be seen as a straightforward problem in precaution advocacy, but this column focuses on an aspect of the problem that’s grounded in outrage management: flu protection hype. By means of three case studies, Jody Lanard and I document that hype – misleading, one-sided messaging on behalf of vaccination and other flu precautions – does in fact characterize much of what’s produced by flu prevention campaigners. We also argue, with much less evidence, that the hype leads health care workers to mistrust what the campaigners are telling them, and that the mistrust probably reduces their willingness to get vaccinated. The column ends with a list of less tendentious recommendations for convincing health care workers to get a flu shot.

  • The Fear Factor: Preparing the public for a major disaster like pandemic flu without inciting panic is tricky. But the truth goes a long way.

    by Nancy Shute

    Published in U.S. News and World Report, November 21, 2005; online November 13, 2005

    This is an excellent summary of the dilemma authorities face when trying to alert the public to the risk of pandemic flu – a risk that could be severe or mild, imminent or far into the future. Despite its title, the article does point out that the risk of inciting panic isn’t a major problem, although the (unjustified) fear of inciting panic is. It offers justified praise to the U.S. government and the World Health Organization for their increasing willingness to sound the alarm.

  • Bird Flu: Communicating the Risk
    (Note: Link is to a PDF file.)

    by Peter M. Sandman and Jody Lanard

    Published in Perspectives in Health (Pan American Health Organization), vol. 10, no. 2, 2005, pp. 2–9

    PAHO asked us to combine a primer on risk communication with a primer on avian influenza. The resulting article talks about the challenge of alerting the public to bird flu risks, then offers ten risk communication principles, each illustrated with bird flu examples. The PDF file also includes the cover, an editor’s note entitled “Communication: risky business,” and the contents page. (Note the confusion of “bird flu” with pandemic flu in this 2005 article – and this blurb, also written in 2005.)

    (There is an online version (same text, but easier to read than a PDF file) posted on the PAHO website. The entire issue is also there.)

    Spanish translation available

    La gripe aviar: cómo comunicar el riesgo (link is to a PDF file)

  • Pandemic Influenza Risk Communication: The Teachable Moment

    Website column by Peter M. Sandman and Jody Lanard

    Posted: December 4, 2004

    This is the first column Jody Lanard and I wrote about pandemic preparedness. We wrote it when many experts believed a devastating H5N1 flu pandemic might be just around the corner – and we thought so too. (We still think the risk is serious, but there’s much less sense of imminence as I write this blurb in mid-2008.) The thrust of this long column is how to sound the alarm. After a primer on why H5N1 is “not your garden variety flu,” the column proposes a list of pre-crisis pandemic talking points. Then it assesses how well experts and officials were addressing those points as of late 2004. The experts, we wrote, were doing their best to arouse the public. But governments and international agencies were undermining the sense of urgency with grossly over-optimistic claims about pharmaceutical solutions.


Selected Guestbook
Comments and Responses


Copyright © 2020 by Peter M. Sandman


