Sandman Outrage software
It is a zipped compressed file. If you have a PowerPoint-capable viewer, you need only download the presentation. However, it has not been tested thoroughly on newer versions of Microsoft Windows. If you have any concerns, consult your System Administrator before installing this software.
An IT colleague has offered to try to produce an upgrade that will work on the newest versions of Windows if I can get him the original source code. Users of the software should read the Agreement carefully before installing the software, and must agree to be bound by the Agreement. If you do not wish to be bound by the Agreement, do not install the software. Users should also note that the software is made available free of charge, and as a result no maintenance or ongoing support will be provided.
Threats to reputation are more costly than they used to be, but they are also more avoidable. Should you acknowledge the accurate bits of a rumor? This short column covers all that, but it also addresses a less sexy but ultimately more important topic: the importance of tracking down rumors that may be true. It then tries to assess the proper role of outrage management in public participation.
Even assuming your worried stakeholder is wrong about X, he or she may not be irrational — but rather mistrustful, postmodernist, cautious, uninformed, misinformed, intuitive, emotionally upset, motivated by personal or social values, or pursuing a different agenda. When we ignore these possibilities and assume our risk-averse stakeholders are irrational, the column suggests, we raise questions about our own rationality.
Month after month, this is one of the least often read of my major columns. This column dissects an issue — one of the few — on which I disagree with most risk communication and crisis communication professionals: what to do when there are differences of opinion within your organization.
I urge my clients to let the disagreements show. Perhaps most importantly, the column details what tends to go wrong when organizations muzzle their staff in order to speak with one voice. Another short column has two goals; it champions a strategy that is fundamental to both crisis communication and outrage management but is seldom utilized, largely because it threatens management egos. A third column describes the battles that ensue when activists or journalists are trying to arouse stakeholder outrage about some situation while companies or agencies are trying to reduce that outrage.
Some of what goes on in these battles is symmetrical; some is not. Yet another short column considers the four possibilities when you are trying to convince me of X: I could have no prior opinion about X; I could believe X already; I could believe Y instead; or I could be ambivalent, torn between X and Y. Each of these four situations has its own risk communication game, described in the column: follow-the-leader, echo, donkey, and seesaw.
Good risk communicators need to master all four games. Release risk information early and you may be criticized for alarming people before you have solid evidence that a dangerous strain is spreading; release it late and you may be criticized for the delay. Obviously, when to release risk information is a tough call. In this column, Jody Lanard and I lay out the pros and cons, and conclude that early is almost always better than late. We also analyze the New York City decision in detail, and offer some ways to reduce the downsides of early release.
This column is in two parts. Part Two goes into detail on the toughest part of acknowledging uncertainty: deciding just how uncertain you ought to sound, and then coming up with words or numbers that capture the right level of uncertainty.
It assesses five biases that tend to distort our judgments about how uncertain to sound, even after we have accepted the principle that we should acknowledge our uncertainty. Which of the two (sounding too certain or not certain enough) is likelier to get said when the other would have been closer to the truth? Most of this long column is addressed to risk communicators whose goal is to keep their audience unconcerned.
The column details their reluctance even to mention worst case scenarios, and their tendency, when they finally get around to discussing them, to do so over-reassuringly. It explains why this is unwise — why people, especially outraged people, tend to overreact to worst case scenarios when the available information is scanty or over-reassuring. Then the column lists 25 guidelines for explaining worst case scenarios properly. Finally, a postscript addresses the opposite problem.
How can you do that more effectively? Misleading toward the Truth: Mad cow disease has never been a serious threat to human health in the United States. But when it tries to convince people of this truth, the U.S. Department of Agriculture misleads. In this long column, Jody Lanard and I painstakingly dissect nine instances of misleading USDA mad cow risk communication in the wake of the December discovery of the first known mad cow in the U.S. Not that the USDA was unusually dishonest.
This sort of dishonesty is routine in risk communication, especially when its perpetrators know they are in the right. I was commissioned by Vodafone Group Services Limited to think through and write up my opinion on the following question: Assume that a particular risk is probably not serious from a technical perspective, but some people are worried or upset.
Should governments impose more stringent precautions in such a situation than they would impose if people were calm or apathetic? The resulting essay turned out more nuanced than Vodafone probably expected. In general, I did reach the conclusion Vodafone was presumably looking for — that government precautions and government warnings are not reliable ways to reduce outrage, and probably should not be deployed for that purpose.
I found surprisingly little research on point, but lots of theoretically interesting arguments in both directions to dissect. There is a certain irony in the fact that the most thoughtful, tentative, balanced, academic writing I have done in years was done for a corporate client.
Which analytic scheme works best depends on the situation. One of the core outrage management recommendations on my shortlist is accountability. I see it both as a replacement for trust and as a step in the direction of sharing control.
The column starts by acknowledging that a legally ill-advised outrage management strategy can have disastrous legal repercussions. That said, it addresses a variety of reasons why most lawyers dislike outrage management even in situations where there are unlikely to be any legal ill effects.
After a section on what outrage management can offer the legal process — that is, how lawyers might actually benefit from paying attention to outrage issues — the column zeros in on five genuine areas of conflict between law and outrage management: ignorance, silence, candor, apology, and tone.
These are the areas where wise clients force their legal and communication advisors to find a middle path. When things go badly wrong for a company or government agency, there have usually been precursors, and the failure to heed these warnings is a familiar feature of post-disaster recriminations.
The column focuses on the last of these choices, arguing that transparency about yellow flags is not just the best way to get them investigated properly; it is also the only way to prevent people from imagining afterwards that they were red flags.
If you want to know how apology and forgiveness work, ask a Catholic. Today, even medical malpractice lawyers routinely urge their clients to apologize.
Organizations are comfortable correcting the problem and compensating the victims — which rarely does much good without the other, more humiliating steps. Whenever a company does something wrong, the public wants to know why.
The two contending explanations are stupidity and evil — you made a dumb mistake or you did it on purpose. Companies are usually suspected of the latter. Government agencies are different; people believe governments make stupid mistakes all the time. What follows from this reasoning is what I call the stupidity defense. As this column argues, when a company makes a stupid mistake, it needs to say so — early, often, and penitently.
Implementing Risk Communication: Overcoming the Barriers. This video went out of print in January. After a six-minute introduction, it is devoted to three kinds of barriers to implementation … and ways to overcome them.
This short article starts with the assumption that coercion is an unreliable way to site controversial facilities, and tries to offer some better answers grounded in risk communication. This manual on how to use risk comparisons and risk statistics was commissioned to help chemical plant managers explain air emissions to their neighbors. Chapter III on risk comparisons, especially, is still relevant. The other chapters are also useful and not really outdated, I think.
The appendices are both outdated and all too likely to be misused. Vincent Covello, Paul Slovic, and I wrote the rest of the manual to soften them. Postscript: Masks as Virtue-Signaling. I sent her a list of suggestions on whether and how to make patients wear masks. The version of the email posted here has been modestly revised. The column focuses on a specific example: labeling foods that contain genetically modified ingredients.
Usually the outrage reduction effect is stronger and longer-lasting than the hazard salience effect. And the available evidence suggests that this is indeed the case for GM food labels, which turn out more calming than alarming. The column then broadens the discussion to informed consent more generally.
Relying in part on the example of the Dengvaxia vaccine, it builds a case that it is wiser to provide potentially scary information about small risks than to withhold this information. Even when people overreact — that is, even when the hazard salience effect overwhelms the outrage reduction effect — the crucial need to build and sustain trust makes honesty nonetheless the best policy. Interview with Peter M. Sandman. The interviewer said an industry client had recommended me.
I accepted. Along the way I told a few stories from my consulting without naming the clients, of course. Could It Happen Here? Your options: Duck the teachable moment and keep mum. Misuse the teachable moment by telling a one-sided, over-reassuring story. Or seize the teachable moment and launch a candid dialogue about the risk. This column concedes the several persuasive reasons for keeping mum, and then builds a case for talking and listening instead.
The same case applies to misbehaviors as well as to accidents; and to earlier times at your own facility as well as to similar facilities elsewhere. Terry Sim is editor of Sheep Central, an Australian online sheep industry news service. On May 19, Sim posted an article about the strategic thinking of Marius Cuming, the corporate communications manager of trade group Australian Wool Innovation. The controversy specifically referenced was mulesing — removing strips of wool-bearing skin from around the buttocks of sheep in order to reduce the number of flies that lay their eggs in the urine- and feces-contaminated wool.
My reply email agreed with Cuming that fighting with critics is a losing proposition. Amalgam of two emails in response to a query from Ken Silverstein, March 4 and 5. Forbes decided not to use his story, but it was cached while it was briefly in the Forbes system.
Confirmation bias is our universal tendency to hang onto our beliefs in the face of evidence to the contrary. This column begins by describing the cognitive defenses that confirmation bias relies on: selective exposure, selective attention, selective perception, framing, selective interpretation, and selective retention.
The key is to avoid challenging the audience more than necessary by finding things (sometimes even irrelevant ones) to reinforce or agree with. The column closes with pointers on how to disagree when disagreeing is necessary. An article in the April 29 issue of The Atlantic focused on a study claiming that the average person is likelier to die in a mass extinction event than in a car accident.
On May 10 I emailed Faye this response. She wrote her story, but on May 17 the Bloomberg News editors decided not to run it, judging that the news peg — the Atlantic mass extinction article — was no longer of much interest to their readers.
Near the end (link provided on Vimeo) I was asked about something completely different: the risk communication challenges of the Zika epidemic. On March 10, the top public relations executive of the massive J. Walter Thompson advertising agency made accusations against the agency's leadership, and David Gianatasio asked how I thought the agency should respond.
My brief response emphasized that the agency had to know whether the accusations were basically true or not, and that its strategy should depend on that. A fair amount of my email was included in a March 15 article Gianatasio coauthored with Patrick Coffee and Katie Richards.
I have posted the whole email. Posted on The Conversation, November 18. Simon Chapman is a public health professor at the University of Sydney in Australia. He and Sonia Wutzke published an analysis of Australian media coverage of the controversy over whether mobile telephone towers were a threat to health.
The poaching of a Zimbabwean lion by Minnesota dentist and recreational big-game hunter Walter Palmer provoked a powerful outburst of public outrage in late July. Ultimately the newspaper decided not to do a story on why people were so outraged at Dr. Palmer.
The editor of Public Utilities Fortnightly invited me to write an article — no pay, but any utilities-related topic I wanted, at any length I wanted, with the final edit up to me, and with the okay to post the final product on my website as soon as it came out.
But for a variety of reasons, hype from the coal industry backfires much more badly than hype from, for example, the solar or wind power industries. Jeremy Story Carter interviewed me on September 3, halfway through a three-day risk communication seminar I ran in Melbourne, Australia. He let me start with the three paradigms of risk communication, and I got to squeeze in a few minutes halfway through on crisis communication, using the Ebola epidemic in West Africa as an example.
But mostly Jeremy was interested in how farmers and farm industries should handle criticism, such as the recent attacks on the Australian wool industry by People for the Ethical Treatment of Animals (PETA). The edited interview is posted on the ABC website. Data Breaches: Managing Reputational Impact. Email to David Gianatasio, March 4, with two March 16 emails interpolated.
My answers stressed the importance of addressing the concerns of affected stakeholders as opposed to the general public; and of focusing on negative reputation as opposed to positive reputation. The reputational impact of a data breach, I argued, depends mostly on two factors: how competently a company was protecting customer data before the breach, and how empathically it responded after the breach. Very little in my answers is unique to data breaches. Dave ended up focusing more on the specifics of the Target breach than on what companies should do about breaches, but he did find room in the last half of his March 23 article for several snippets from my answers.
The battle over fracking of shale gas and shale oil keeps getting hotter, in large part because the fracking industry does such a lousy job of communicating with its stakeholders. This column lays out some of the pluses (mostly economic) and minuses (mostly environmental) of fracking, and builds a case that the fracking industry should acknowledge the genuine minuses far more than it does, rather than focusing so much on selling the pluses of fracking and rebutting its not-so-genuine minuses.
But while it urges the fracking industry to become more trustworthy, the column puts more faith in accountability as a replacement for trust — accountability to neutral third parties, to governments (that is, regulation), and especially to neighbors and activists.
The column ends with a list of eight additional recommendations for reducing stakeholder outrage about fracking. An illegal and potentially dangerous PCB storage facility had gone unnoticed for years until a leak brought it to official attention in March.
Monique wanted me to comment on the pros and cons of the decision to keep the information secret. She used a few quotes from the email in her story, along with some excellent quotes from my Canadian colleague Bill Leiss. After the Disaster: Communicating with the Public. Posted on the CropLife website, July 1. When companies believe that the environmental hazards of their activities are low, they tend to think that they can ignore or deflect public outrage.
Many companies still underestimate the effect of public anger, which tends to make the public angrier and stokes the controversy further. Eventually pressure builds to the point where the company cannot ignore public opinion any longer, and it ends up going too far to reduce the hazard, perhaps spending millions on a piece of equipment that it had already decided was not necessary. Sandman has clearly identified the fact that public outrage can be as real as any other hazard and can have a direct effect on share price.
Now Sandman has distilled and repackaged his fire-fighting experience into a new reputation risk-management software program to help companies, institutions and government agencies to predict, quantify and manage public reaction to their plans. The program, called Outrage, is interactive and designed to be used in a workshop, tapping into operational, legal, public affairs and other areas of a company where an impact might be felt, building up a picture of external stakeholders and their likely views in specific situations.
The program is aimed at the oil and gas, mining, pharmaceuticals and chemicals industries, the insurance and financial sector, and the regulatory sector, where the public has high expectations of performance. In the case of Shell and Brent Spar, Sandman believes Shell had consulted correctly with everyone it was legally required to consult, but had done little with other interest groups such as non-governmental organizations. In the case of having a wind farm in your backyard, there are many emotional, cultural, and personal considerations at play.
A shift in the way we view risk is necessary because when people are outraged they tend to think the hazard is more serious. Trying to convince them that the hazard is not serious is unlikely to do much good until steps are taken to reduce the outrage. The job of risk communication when the hazard is low and the outrage is high is to reduce the outrage.
The following advice is designed to help people cope empathetically with outraged stakeholders in situations where the stress is very high. Think about your projects or work: are there any examples where the above process could be applied to manage or reduce public outrage?