A Blind Spot for Bad Guys (Peter Sandman column)
Posted: June 16, 2005
Article Summary: This column argues that western society has a blind spot for bad guys – that our vision of an actionable emergency is an accident, not an attack. It discusses several examples, from the resistance to evidence that the 1984 Bhopal “accident” was probably sabotage to the opposition of the U.S. public health profession to the possibility that smallpox might constitute a weapon of mass destruction that could justify a vaccination program. The best example – detailed in the column – happened in April 2005, when it was learned that an infectious disease testing company had mistakenly sent samples of a potentially pandemic strain of influenza to labs all over the world. So a fax went out to all the labs telling them so, and asking them to destroy the sample – thus converting a small accident risk into a much larger terrorism risk. The facts were public at the time, but a society with a blind spot for bad guys simply ignored their implications.

A Blind Spot for Bad Guys

The most serious industrial accident in history – some 2,000 dead, up to 300,000 injured – probably wasn’t an accident. There is no definitive proof the December 1984 Union Carbide disaster in Bhopal, India, was caused by sabotage; certainly no disaffected employee has ever been charged with the crime. But as the company argued for years, what happened at Carbide’s factory was either a really weird confluence of independent human errors and equipment malfunctions (and maybe some money-saving corner-cutting) or pretty much what a well-trained worker would do if s/he wanted to ruin a batch of methyl isocyanate in the most dramatic fashion possible. In logic, the likeliest explanation is the one that requires the fewest assumptions, and in the case of Bhopal that’s sabotage.

Whatever the truth about Bhopal, what’s interesting is the nearly universal rejection of Carbide’s sabotage hypothesis. A Google search today for “Bhopal Carbide accident” yields 41,100 hits, compared to only 18,200 for “Bhopal Carbide sabotage.” From the outset, journalists and the public found two scenarios believable: that Bhopal was simply an accident or that it was a stunning example of corporate malfeasance. Nobody did it or Carbide did it. The possibility that a bad guy did it ran a very distant third.

Of course Union Carbide’s management didn’t help matters by implying that if an angry employee or former employee was responsible, then the company wasn’t – as if companies were under no particular obligation to keep their factories from being used as weapons of mass destruction. Since Carbide seemed to think that sabotage would let the company off the hook, activists (and plaintiffs’ attorneys) felt obliged to say it couldn’t possibly be sabotage. And just about everybody found that easy to accept.

When Outrage Is a Hazard

It isn’t just activists, journalists, and the public that tend to ignore the bad guy scenario. Companies do too. In my presentations to corporate environment, health, and safety (EH&S) groups, I often ask how many people have heard that the Bhopal disaster might have been sabotage. I get a smattering of hands. And when I ask what priority EH&S management is giving to sabotage prevention, I get mostly blank stares. In 1995 I published a short article on this topic in The Synergist, the monthly journal of the American Industrial Hygiene Association. Entitled “When Outrage Is a Hazard,” the article argued that unhappy, angry, or crazy employees pose a significant risk to industrial facilities and their neighbors. And the problem is getting worse, I said:

A company that carefully trains its workers in safety procedures may also be teaching them how to wreak havoc: “Don’t ever turn valve X and engage switch Y at the same time or else.…” The same company may be downsizing aggressively and letting morale decay.

At some point, the depressed employee becomes as likely a worst-case scenario as the malfunctioning apparatus – and “lean and mean” takes on an ominous sound.

Attention increased a little after 9/11 – but only a little. Most facilities can make a credible case that they’re unlikely to be high on Al Qaeda’s list. They don’t seem to be thinking much about an Al Qaeda wannabe on the night shift (or maybe just an angry guy whose wife has left him, whose boss has dissed him, and who has poor impulse control). Of course it’s possible that corporate managers are quietly thinking about sabotage and planning to prevent it – reframing morale as a safety issue, making sure the human relations people and the EH&S people talk to each other, telling supervisors to temporarily reassign a worker in a high-risk job who’s muttering over his lunch that he’ll get even with those SOBs. But I’ve seen no sign of it, and I don’t believe it’s happening.

Nor are regulators making it happen. One of the most effective environmental regulations in the United States today is the Environmental Protection Agency’s Risk Management Plan (RMP). The RMP rule requires facilities that manufacture or use dangerous chemicals to figure out how they could kill the largest number of their neighbors – and then call in the neighbors periodically to tell them about it. There is no requirement to reduce inventories or improve precautions, but the obligation to reveal their worst case scenarios puts pressure on companies to find ways to mitigate them. RMP, unfortunately, explicitly exempts sabotage, terrorism, and anything intentional. It applies only to accidents.

In May 2004, I spoke at an EPA-sponsored national conference on water security risk communication. As the conference proceeded, it became clearer and clearer that water utilities had no intention of warning their customers how vulnerable the water supply is to sabotage, and very little intention of making a serious stab at prevention. (There was a lot of interest in improving emergency response capability, however.) I proposed that amending the RMP regulation to remove the exemption for intentional acts would do a lot to build a fire under the utilities. The EPA official who organized the conference took the suggestion back to Washington. In May 2005 he wrote me that he couldn’t find anyone at EPA interested in considering this change.

A Public Health Miscalculation: Smallpox Vaccination

The best examples of a blind spot for bad guys come from the public health profession. Public health people are wonderful people, by and large – earnest, dedicated and idealistic. But they do tend to see the public as patients.

One characteristic of patients, in the mind of the typical public health practitioner, is passivity. People aren’t supposed to have their own opinions about how to protect their health. They are supposed to do (and feel) nothing until they’re told what to do (and feel), and then they’re supposed to do (and feel) it, period. And so it’s hard to persuade health departments to treat the public as a partner and decision-maker – to consult widely about their plans for health emergencies, to make provision for the thousands of would-be volunteers who show up during such emergencies, to offer people choices and thus a feeling of control, etc. And when people do take any kind of autonomous action (such as seeking a Cipro prescription during the 2001 anthrax attacks, or wearing a mask during the 2003 SARS outbreaks, or standing in line for a flu shot during the 2004–2005 vaccine shortfall), the public health establishment is likely to characterize their initiative as panic.

Another characteristic of patients, apparently, is goodness. Most public health professionals seem deeply committed to a benign (passive, but benign) vision of the public. There are no bad guys. There are only prospective victims in need of help.

This first became clear to me during the 2002–2003 smallpox vaccination program. In January 2003, well into the program’s planning but before the start of its implementation, I wrote a column on “Public Health Outrage and Smallpox Vaccination.” I argued that nearly all the public health professionals I met were viscerally opposed to the program they had been charged with carrying out. Unless something was done about their outrage, I said, they would find a way to undermine it. The program’s failure to come anywhere near the target number of vaccinations has since been thoroughly evaluated by the Institute of Medicine. (See http://www.nap.edu/books/0309095921/html/ for the IOM assessment.) The IOM found many problems related to inadequate communication, distrust, and lack of buy-in by the public health professionals tasked with implementing the program. Maybe the program would have failed even if public health people had wanted it to succeed. But it can’t be irrelevant that they wanted it to fail, and they got what they wanted.

Why did they want it to fail? There were many reasons, from resentment that President Bush had rejected their counsel to injured self-esteem that public health’s most shining accomplishment, the worldwide elimination of smallpox, was under challenge; from reluctance to divert resources from other programs to fear that a smallpox vaccination controversy might compromise other vaccination efforts. But the most fundamental reason why nearly everyone in public health opposed the vaccination program was simply that they considered the threat of the smallpox vaccine more serious than the threat of a smallpox attack.

As vaccines go, the smallpox vaccine is a risky one. Serious side effects (“adverse events” in public health lingo) occur roughly 15 times for every million inoculations. There was some question about this number before the program was launched, but the experts’ estimates of vaccination risk didn’t actually vary very widely. What varied widely were the estimates of the risk of a smallpox attack. Intelligence professionals, citing evidence they said was secret, argued that a smallpox attack was a real possibility, well worth preparing for by vaccinating at least health care workers, emergency responders, and soldiers – if not the rest of us as well. Public health professionals, citing no data whatever, disagreed. They just knew that smallpox had been successfully eradicated. The intelligence “chatter” about possible remnants from prior germ warfare programs getting into the hands of terrorists, they said, wasn’t real data; it certainly hadn’t been published in peer-reviewed journals. This wasn’t so much a difference of opinion as a conflict of cultures: the paranoid worldview of spies versus the Pollyanna worldview of public health people.

Run the numbers. Smallpox kills about 1-in-3 of the people it infects. The smallpox vaccine injures or kills about 15-in-a-million of the people who roll up their sleeves. So it’s 15-in-a-million versus 333,333-in-a-million – a factor of 22,222. Round off to 20,000. If you think you stand less than one chance in 20,000 of getting smallpox, you’re better off without the vaccine. If you think the odds are greater than one in 20,000, you’re better off with the vaccine (especially if you’re a bit skeptical about standing in line for a shot after the attack begins). You don’t have to believe a smallpox attack is likely to think vaccination is a wise investment. You just have to believe it’s not impossible.

And that’s if you’re simply calculating the personal benefit versus the personal risk of vaccination. Presumably many health care workers and emergency responders want to be ready to help in the event of an attack – ready to vaccinate others, for example. For the inoculation to make sense to them, they don’t need to think there’s more than a 1-in-20,000 chance of their getting smallpox themselves; they only need to think there’s more than a 1-in-20,000 chance of a smallpox attack nearby.
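The break-even arithmetic above can be sketched as a quick calculation. This is a minimal illustration using the column’s own rough figures (15-in-a-million vaccine risk, 1-in-3 smallpox fatality), not authoritative epidemiological data:

```python
# Break-even probability for smallpox vaccination, using the column's
# rough estimates. These numbers are illustrative, not authoritative.

vaccine_risk = 15 / 1_000_000   # serious adverse events per inoculation
smallpox_risk = 1 / 3           # roughly 1-in-3 of those infected die

# Vaccination pays off if your chance of getting smallpox exceeds the
# ratio of the vaccine's risk to the disease's risk.
break_even = vaccine_risk / smallpox_risk

print(f"Break-even chance of infection: about 1 in {round(1 / break_even):,}")
# about 1 in 22,222 -- which the column rounds off to 1 in 20,000
```

If your estimated probability of exposure is above that threshold, the expected harm of skipping the vaccine exceeds the expected harm of taking it; the point of the calculation is that the threshold is very low.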

Imagining that a low probability is a zero probability is a common mistake in the assessment of unlikely but horrific scenarios. It’s hard to get people to buy insurance against hundred-year floods. But you’d expect public health professionals to do better. They routinely calculate odds ratios to decide whether a medication is likely to do more good than harm. What threw them off this time? More than anything else, I think, it was their blind spot for bad guys.

Another Public Health Miscalculation: H2N2

I have saved the best example for last.

The announcement came in early April 2005. A U.S. company, Meridian Biosciences, had been hired by the College of American Pathologists (CAP) to provide samples for a proficiency test kit, used to assess whether medical lab workers know how to test for influenza A (but not how to subtype it). Somehow Meridian or its supplier picked H2N2, the flu strain that caused the 1957 pandemic, as one of its “unknowns,” labeled only with batch numbers like “Specimen VR1-05.” The result: Tucked into refrigerators in thousands of labs all over the world were freeze-dried samples of a flu strain that disappeared from natural circulation in the 1960s, a strain to which people under 40 have little if any immunity. A simple accident in any of those labs could have launched another influenza pandemic.

You may have read about the bird flu virus H5N1, now killing millions of birds and dozens of humans in Asia; H5N1 has the experts worried because if the virus learns how to spread efficiently from human to human, it could easily ignite a worldwide pandemic. H2N2 is in principle more dangerous still; unlike H5N1, it already does human-to-human transmission just fine. At least it did in 1957. Some experts hypothesize that the virus, if it has been propagating for several decades in eggs or cell cultures (the laboratory growth media for influenza), may have mutated, getting better and better at living in laboratory cultures, and worse and worse at living in people. So maybe lab samples of H2N2 have lost much of their pandemic potential. Or maybe not.

Laboratory accidents aren’t rare. In fact, it took one to discover the risk of a “Meridian Flu” pandemic – a hospital lab in Vancouver, Canada somehow cross-contaminated the Meridian sample with a patient’s sample, launching the investigation that uncovered the problem. Until then, for months, the world may have been one minor accident away from infecting a lab worker with H2N2; if that worker got the flu and gave it to friends and coworkers and people sitting nearby on the bus, an accidental influenza pandemic wasn’t all that unlikely. There’s precedent. The H1N1 influenza strain that caused the low-mortality 1977 “Russian Flu” epidemic was virtually identical to the 1950 H1N1 strain, which was no longer circulating by the 1970s. Most experts believe the 1977 epidemic arose from an accidental laboratory release somewhere in Asia.

Until thousands of labs were told about Meridian’s mistake on April 8, the risk of intentional H2N2 release was tiny; nobody or almost nobody in those thousands of labs knew they had a potential weapon of mass destruction on the premises. But on April 8 – late Friday afternoon in the U.S.; early Saturday morning in half the world – CAP sent a fax to all the labs on its list. Headlined “URGENT,” it warned them about the problem, told them the identification number on the labels of the dangerous samples, and urged them to find the H2N2 and destroy it. CAP’s fax drew a bulls-eye around the H2N2 samples.

That’s when the accident risk went down, and the terrorism risk went way up. All weekend long, in many cases, this URGENT fax sat in the in-boxes of roughly four thousand hospital and other medical lab fax machines, most of them in North America but some in places like Lebanon and Saudi Arabia. Ask yourself who might have passed by one of the machines and read the instructions on how to obtain their very own weapon of mass destruction. Taking a subsample to culture before destroying the CAP sample would be technically simple for an experienced lab technician. Pocketing a sample in some ice would be even simpler. Who on earth would want to do that? A laboratory employee somewhere who is affiliated with a terrorist organization, or who wishes s/he were. A mentally disturbed person. Someone desperate for cash, looking for a chance to sell the sample to the highest bidder. Someone with a radical political ideology or a burning grievance. There’s no reason to believe that happened, and no way to prove it didn’t.

Among the recipients of the CAP fax was the Colorado Department of Public Health & Environment, which operates the Colorado Laboratory Forum Web Portal. Within hours, the fax was posted on the Portal’s login page (the part that doesn’t require a password), with a “scrolling marquee” to draw attention to its urgency. My wife and colleague Dr. Jody Lanard asked Suzanne Jackson of the Department about the decision to put it there, or to post it at all. Ms. Jackson wrote back:

The Director of Infectious Diseases and Preparedness at the Association of Public Health Laboratories contacted all Public Health Laboratory Directors late on Friday … with this information and the recommendation that we contact the laboratories in our jurisdiction.

Our PHL director sent the message to our chief medical officer (I received a copy of the message) with a request to alert/inform possible clinical laboratories within our state. I forwarded the message to our Health Alert Network (HAN) for statewide distribution to the clinical lab audience and I updated the Colorado Laboratory Forum Web Portal page as well….

I designed and maintain the Colorado Laboratory Forum Web Portal site to serve laboratories in our state for public health preparedness. Our members have Internet access 24/7 and are encouraged to keep this web portal login page open 24/7 for any announcements on the scrolling marquee.

In other words, lab employees in Colorado didn’t have to walk past the fax machine to see the CAP announcement; walking past the computer would have done the job. (The fax is no longer on the Portal login page, but Jody copied it while it still was. See the box at the end of this column.) Suzanne Jackson’s title, by the way, is Bioterrorism Preparedness and State Training Coordinator. Even the Colorado Health Department’s bioterrorism preparedness specialist, it would seem, had a blind spot for bad guys.

At least one lab director in Ohio was more skeptical – but not about possible evildoers in his lab. He told me the fax arrived on plain paper – no letterhead, nothing to authenticate it. He suspected a hoax and took no action until he saw some media coverage.

The Colorado Health Department made its own decisions, but CAP did not act alone. The U.S. Centers for Disease Control and Prevention approved the fax. Health Canada and the World Health Organization took part in the planning. Clearly they were rightly worried about the continuing risk of a lab accident. Just as clearly, they weren’t at all worried about telling thousands of miscellaneous lab workers on weekend shifts what they had on the premises.

I have no idea if anyone contacted the Department of Homeland Security. I have to guess not, or somebody would have proposed a different protocol for retrieving the samples. One reasonable possibility: Schedule a conference call. Ask the lab manager and two or three employees to stand by at a specified time, without telling them why. Once they’re on the call, explain the situation. Instruct them to stay together and on the phone as they locate and autoclave the sample.

Presumably DHS found out after the fact, and presumably it is doing what it can do post-hoc to minimize the risk – cross-tabulating laboratory rosters with terrorism files, for example. (Of course that won’t help locate labs with disgruntled or disturbed employees.) But maybe I’m wrong. At meetings of terrorism and bioterrorism professionals, concern has invariably focused on comparatively rare organisms like anthrax, smallpox, tularemia, and plague. It’s hard to arouse much interest in flu. So maybe the health profession’s blind spot for bad guys is matched by the counter-terrorism profession’s blind spot for influenza.

Amazingly, the media seem to share these blind spots. All the facts about the H2N2 fax were public from the outset. The story was fairly big for a few days, but reporters focused on Meridian’s mistake, on the laws and policies that permitted that mistake, and on the progress toward getting all the samples destroyed – that is, progress toward getting faxes from the labs claiming they had destroyed the samples.

Reporters also focused on their sources’ endless reassurances that lab errors are rare. Actually, the H2N2 story includes at least the following errors, aside from the recall protocol that ignored the existence of bad guys:

1. Meridian selected H2N2 (knowingly or unknowingly – two different possible errors).
2. Meridian sent the H2N2 strains to Canadian biosafety level 2 labs, which were not licensed for H2N2 (Canada requires that pandemic influenza strains be handled in level 3 labs).
3. Meridian listed the samples on Canadian customs forms as “H3N2” (an appropriate, nonpandemic strain of influenza), according to Canadian National Laboratory Director Frank Plummer.
4. The Vancouver hospital lab cross-contaminated the virus, leading to the discovery that a pandemic strain had been used in the test kits.

Despite all these errors, public health officials around the world repeatedly reassured the public that the risk of a pandemic had been virtually nil, because lab errors are so rare, and because the people who work in labs are so professional and well-trained. Actually, most published surveys of lab accidents and lab-acquired infections assert that they are far from rare; many, moreover, are thought to go unrecognized, or if recognized, unreported.

With all those news stories suggesting that the risk of a lab accident was negligible, it’s a wonder nobody noticed the other H2N2 risk, the one created in the process of reducing that “negligible” accident risk. I haven’t seen a single news story about the foolishness of the sample recovery protocol and the resulting unknown, unknowable risk that an H2N2 culture now resides in a terrorist’s or extortionist’s or nut’s refrigerator.

I also haven’t seen any indication that health authorities are even thinking about manufacturing and stockpiling some H2 antigen for a flu vaccine, just in case there is an H2N2 attack. It’s that blind spot for bad guys again.

The H2N2 section of this column was written with the help of my wife and colleague Jody Lanard.

Here is the text of the CAP fax, as reproduced on Colorado’s Laboratory Forum Web Portal:

4-8-05: URGENT INFORMATION regarding Influenza in CAP Proficiency Testing Survey

Urgent Information regarding CAP Proficiency Testing Survey VR1-A 2005

For the VR1-A 2005 Survey, Specimen VR1-05 was identified as Influenza A. The strain used for this specimen was ATCC VR100. This is an influenza A strain of the H2N2 subtype. Viruses of this subtype have not been seen in the North American population since 1967, and as such are not addressed in the formulation of the current influenza vaccine approved by the U.S. Food and Drug Administration.

Because of this, we are asking that you IMMEDIATELY destroy any materials you may have retained or derived from this Survey.

If you have any questions, please contact the CAP Customer Contact Center at 1-800-323-4040, Option 1.

Copyright © 2005 by Peter M. Sandman

