Lapidarium notes

Amira Skomorowska's notes

"Everything you can imagine is real." — Pablo Picasso




Jan 13th, Fri

Risk perception: What You Don’t Know Can Kill You

"Humans have a perplexing tendency to fear rare threats such as shark attacks while blithely ignoring far greater risks like unsafe sex and an unhealthy diet. Those illusions are not just silly—they make the world a more dangerous place. (…)

We like to think that humans are supremely logical, making decisions on the basis of hard data and not on whim. For a good part of the 19th and 20th centuries, economists and social scientists assumed this was true too. The public, they believed, would make rational decisions if only it had the right pie chart or statistical table. But in the late 1960s and early 1970s, that vision of homo economicus—a person who acts in his or her best interest when given accurate information—was kneecapped by researchers investigating the emerging field of risk perception. What they found, and what they have continued teasing out since the early 1970s, is that humans have a hell of a time accurately gauging risk. Not only do we have two different systems—logic and instinct, or the head and the gut—that sometimes give us conflicting advice, but we are also at the mercy of deep-seated emotional associations and mental shortcuts. (…)

Our hardwired gut reactions developed in a world full of hungry beasts and warring clans, where they served important functions. Letting the amygdala (part of the brain’s emotional core) take over at the first sign of danger, milliseconds before the neocortex (the thinking part of the brain) was aware a spear was headed for our chest, was probably a very useful adaptation. Even today those nano-pauses and gut responses save us from getting flattened by buses or dropping a brick on our toes. But in a world where risks are presented in parts-per-billion statistics or as clicks on a Geiger counter, our amygdala is out of its depth.

A risk-perception apparatus permanently tuned for avoiding mountain lions makes it unlikely that we will ever run screaming from a plate of fatty mac ’n’ cheese. “People are likely to react with little fear to certain types of objectively dangerous risk that evolution has not prepared them for, such as guns, hamburgers, automobiles, smoking, and unsafe sex, even when they recognize the threat at a cognitive level,” says Carnegie Mellon University researcher George Loewenstein, whose seminal 2001 paper, “Risk as Feelings,” debunked theories that decision making in the face of risk or uncertainty relies largely on reason. “Types of stimuli that people are evolutionarily prepared to fear, such as caged spiders, snakes, or heights, evoke a visceral response even when, at a cognitive level, they are recognized to be harmless,” he says. Even Charles Darwin failed to break the amygdala’s iron grip on risk perception. As an experiment, he placed his face up against the puff adder enclosure at the London Zoo and tried to keep himself from flinching when the snake struck the plate glass. He failed.

The result is that we focus on the one-in-a-million bogeyman while virtually ignoring the true risks that inhabit our world. News coverage of a shark attack can clear beaches all over the country, even though sharks kill a grand total of about one American annually, on average. That is less than the death count from cattle, which gore or stomp 20 Americans per year. Drowning, on the other hand, takes 3,400 lives a year, without a single frenzied call for mandatory life vests to stop the carnage. A whole industry has boomed around conquering the fear of flying, but while we down beta-blockers in coach, praying not to be one of the 48 average annual airline casualties, we typically give little thought to driving to the grocery store, even though there are more than 30,000 automobile fatalities each year.

In short, our risk perception is often at direct odds with reality. All those people bidding up the cost of iodide? They would have been better off spending $10 on a radon testing kit. The colorless, odorless, radioactive gas, which forms as a by-product of natural uranium decay in rocks, builds up in homes, causing lung cancer. According to the Environmental Protection Agency, radon exposure kills 21,000 Americans annually.

David Ropeik, a consultant in risk communication and the author of How Risky Is It, Really? Why Our Fears Don’t Always Match the Facts, has dubbed this disconnect the perception gap. “Even perfect information perfectly provided that addresses people’s concerns will not convince everyone that vaccines don’t cause autism, or that global warming is real, or that fluoride in the drinking water is not a Commie plot,” he says. “Risk communication can’t totally close the perception gap, the difference between our fears and the facts.”

In the early 1970s, psychologists Daniel Kahneman, now at Princeton University, and Amos Tversky, who passed away in 1996, began investigating the way people make decisions, identifying a number of biases and mental shortcuts, or heuristics, on which the brain relies to make choices. Later, Paul Slovic and his colleagues Baruch Fischhoff, now a professor of social sciences at Carnegie Mellon University, and psychologist Sarah Lichtenstein began investigating how these leaps of logic come into play when people face risk. They developed a tool, called the psychometric paradigm, that describes all the little tricks our brain uses when staring down a bear or deciding to finish the 18th hole in a lightning storm.

Many of our personal biases are unsurprising. For instance, the optimism bias gives us a rosier view of the future than current facts might suggest. We assume we will be richer 10 years from now, so it is fine to blow our savings on a boat—we’ll pay it off then. Confirmation bias leads us to prefer information that backs up our current opinions and feelings and to discount information contradictory to those opinions. We also have tendencies to conform our opinions to those of the groups we identify with, to fear man-made risks more than we fear natural ones, and to believe that events causing dread—the technical term for risks that could result in particularly painful or gruesome deaths, like plane crashes and radiation burns—are inherently more risky than other events.

But it is heuristics—the subtle mental strategies that often give rise to such biases—that do much of the heavy lifting in risk perception. The “availability” heuristic says that the easier a scenario is to conjure, the more common it must be. It is easy to imagine a tornado ripping through a house; that is a scene we see every spring on the news, and all the time on reality TV and in movies. Now try imagining someone dying of heart disease. You probably cannot conjure many breaking-news images for that one, and the drawn-out process of atherosclerosis will most likely never be the subject of a summer thriller. The effect? Twisters feel like an immediate threat, although we have only a 1-in-46,000 chance of being killed by a cataclysmic storm. Even a terrible tornado season like the one last spring typically yields fewer than 500 tornado fatalities. Heart disease, on the other hand, which eventually kills 1 in every 6 people in this country, and 800,000 annually, hardly even rates with our gut. (…)

Of all the mental rules of thumb and biases banging around in our brain, the most influential in assessing risk is the “affect” heuristic. Slovic calls affect a “faint whisper of emotion” that creeps into our decisions. Simply put, positive feelings associated with a choice tend to make us think it has more benefits. Negative correlations make us think an action is riskier. One study by Slovic showed that when people decide to start smoking despite years of exposure to antismoking campaigns, they hardly ever think about the risks. Instead, it’s all about the short-term “hedonic” pleasure. The good outweighs the bad, which they never fully expect to experience.

Our fixation on illusory threats at the expense of real ones influences more than just our personal lifestyle choices. Public policy and mass action are also at stake. The Office of National Drug Control Policy reports that prescription drug overdoses have killed more people than crack and heroin combined did in the 1970s and 1980s. Law enforcement and the media were obsessed with crack, yet it was only recently that prescription drug abuse merited even an after-school special.

Despite the many obviously irrational ways we behave, social scientists have only just begun to systematically document and understand this central aspect of our nature. In the 1960s and 1970s, many still clung to the homo economicus model. They argued that releasing detailed information about nuclear power and pesticides would convince the public that these industries were safe. But the information drop was an epic backfire and helped spawn opposition groups that exist to this day. Part of the resistance stemmed from a reasonable mistrust of industry spin. Horrific incidents like those at Love Canal and Three Mile Island did not help. Yet one of the biggest obstacles was that industry tried to frame risk purely in terms of data, without addressing the fear that is an instinctual reaction to their technologies.

The strategy persists even today. In the aftermath of Japan’s nuclear crisis, many nuclear-energy boosters were quick to cite a study commissioned by the Boston-based nonprofit Clean Air Task Force. The study showed that pollution from coal plants is responsible for 13,000 premature deaths and 20,000 heart attacks in the United States each year, while nuclear power has never been implicated in a single death in this country. True as that may be, numbers alone cannot explain away the cold dread caused by the specter of radiation. Just think of all those alarming images of workers clad in radiation suits waving Geiger counters over the anxious citizens of Japan. Seaweed, anyone? (…)

All that media created a sort of feedback loop. Because people were seeing so many sharks on television and reading about them, the “availability” heuristic was screaming at them that sharks were an imminent threat.

“Certainly anytime we have a situation like that where there’s such overwhelming media attention, it’s going to leave a memory in the population,” says George Burgess, curator of the International Shark Attack File at the Florida Museum of Natural History, who fielded 30 to 40 media calls a day that summer. “Perception problems have always been there with sharks, and there’s a continued media interest in vilifying them. It makes a situation where the risk perceptions of the populace have to be continually worked on to break down stereotypes. Anytime there’s a big shark event, you take a couple steps backward, which requires scientists and conservationists to get the real word out.”

Then again, getting out the real word comes with its own risks—like the risk of getting the real word wrong. Misinformation is especially toxic to risk perception because it can reinforce generalized confirmation biases and erode public trust in scientific data. As scientists studying the societal impact of the Chernobyl meltdown have learned, doubt is difficult to undo. In 2006, 20 years after reactor number 4 at the Chernobyl nuclear power plant was encased in cement, the World Health Organization (WHO) and the International Atomic Energy Agency released a report compiled by a panel of 100 scientists on the long-term health effects of the level 7 nuclear disaster and future risks for those exposed. Among the 600,000 recovery workers and local residents who received a significant dose of radiation, the WHO estimates that up to 4,000 of them, or 0.7 percent, will develop a fatal cancer related to Chernobyl. For the 5 million people living in less contaminated areas of Ukraine, Russia, and Belarus, radiation from the meltdown is expected to increase cancer rates less than 1 percent. (…)

During the year following the September 11 attacks, millions of Americans opted out of air travel and slipped behind the wheel instead. While they crisscrossed the country, listening to breathless news coverage of anthrax attacks, extremists, and Homeland Security, they faced a much more concrete risk. All those extra cars on the road increased traffic fatalities by nearly 1,600. Airlines, on the other hand, recorded no fatalities.

It is unlikely that our intellect can ever paper over our gut reactions to risk. But a fuller understanding of the science is beginning to percolate into society. Earlier this year, David Ropeik and others hosted a conference on risk in Washington, D.C., bringing together scientists, policy makers, and others to discuss how risk perception and communication impact society. “Risk perception is not emotion and reason, or facts and feelings. It’s both, inescapably, down at the very wiring of our brain,” says Ropeik. “We can’t undo this. What I heard at that meeting was people beginning to accept this and to realize that society needs to think more holistically about what risk means.”

Ropeik says policy makers need to stop issuing reams of statistics and start making policies that manipulate our risk perception system instead of trying to reason with it. Cass Sunstein, a Harvard law professor who is now the administrator of the White House Office of Information and Regulatory Affairs, suggests a few ways to do this in his book Nudge: Improving Decisions About Health, Wealth, and Happiness, published in 2008. He points to the organ donor crisis in which thousands of people die each year because others are too fearful or uncertain to donate organs. People tend to believe that doctors won’t work as hard to save them, or that they won’t be able to have an open-casket funeral (both false). And the gory mental images of organs being harvested from a body give a definite negative affect to the exchange. As a result, too few people focus on the lives that could be saved. Sunstein suggests—controversially—“mandated choice,” in which people must check “yes” or “no” to organ donation on their driver’s license application. Those with strong feelings can decline. Some lawmakers propose going one step further and presuming that people want to donate their organs unless they opt out.

In the end, Sunstein argues, by normalizing organ donation as a routine medical practice instead of a rare, important, and gruesome event, the policy would short-circuit our fear reactions and nudge us toward a positive societal goal. It is this type of policy that Ropeik is trying to get the administration to think about, and that is the next step in risk perception and risk communication. “Our risk perception is flawed enough to create harm,” he says, “but it’s something society can do something about.””

Jason Daley, What You Don’t Know Can Kill You, Discover Magazine, Oct 3, 2011. (Illustration: Steve Carroll, The Economist)

See also:

Daniel Kahneman on thinking ‘Fast And Slow’: How We Aren’t Made For Making Decisions
Daniel Kahneman: The Marvels and the Flaws of Intuitive Thinking
Dean Buonomano on ‘Brain Bugs’ - Cognitive Flaws That ‘Shape Our Lives’
Daniel Kahneman: How cognitive illusions blind us to reason, The Observer, 30 October 2011 
Daniel Kahneman on the riddle of experience vs. memory
The irrational mind - David Brooks on the role of emotions in politics, policy, and life