1. Why Smart People Believe Strange Things

    (via Psychology – From Inquiry to Understanding)

    As Keith Stanovich (2009) observed, IQ tests do a good job of assessing how efficiently we process information, but they don’t assess the ability to think scientifically. For example, measures of confirmation bias, like the Wason selection task (see Chapter 2), are barely correlated, if at all, with IQ (Stanovich & West, 2008). Indeed, high levels of intelligence afford no guarantee against beliefs for which there’s scant evidence (Hyman, 2002). People with high IQs are at least as prone as other people to beliefs in conspiracy theories, such as the belief that President Kennedy’s assassination was the result of a coordinated plot within the U.S. government (Goertzel, 1994) or that the Bush administration orchestrated the September 11 attacks (Molé, 2006). Moreover, the history of science is replete with examples of brilliant individuals holding strange beliefs. Two-time Nobel Prize–winning chemist Linus Pauling insisted that high levels of vitamin C could cure cancer, despite overwhelming evidence to the contrary.

    In many cases, smart people embrace odd beliefs because they’re adept at finding plausible-sounding reasons to bolster their opinions (Shermer, 2002). IQ is correlated positively with the ability to defend our positions effectively, but correlated negatively with the ability to consider alternative positions (Perkins, 1981). High IQ may be related to the strength of the ideological immune system: our defenses against evidence that contradicts our views (Shermer, 2002; Snelson, 1993). We’ve all felt our ideological immune systems kicking into high gear when a friend challenges our political beliefs (say, about capital punishment) with evidence we’d prefer not to hear. First we feel defensive, and then we frantically search our mental knowledge banks to find arguments that could refute our friend’s irksome evidence. Our knack for defending our positions against competing viewpoints can sometimes lead to confirmation bias, blinding us to information we should take seriously.

    Robert Sternberg (2002) suggested that people with high IQs are especially vulnerable to the sense of omniscience (knowing everything). Because intelligent people know many things, they frequently make the mistake of thinking they know just about everything. For example, the brilliant writer Sir Arthur Conan Doyle, who invented the character Sherlock Holmes, got taken in by an embarrassingly obvious photographic prank (Hines, 2003). In the 1917 “Cottingley fairies” hoax, two young British girls insisted that they’d photographed themselves along with dancing fairies. Brushing aside the criticisms of doubters, Conan Doyle wrote a book about the Cottingley fairies and defended the girls against accusations of trickery. He’d forgotten the basic principle that extraordinary claims require extraordinary evidence. The girls eventually confessed to doctoring the photographs after someone discovered they’d cut the fairies out of a book (Randi, 1982). Conan Doyle, who had a remarkably sharp mind, may have assumed that he couldn’t be duped. Yet none of us is immune from errors in thinking. When intelligent people neglect the safeguards afforded by the scientific method, they’ll often be fooled.

  2. The Belief in a Just World: A Fundamental Delusion

    xxzulaxx:

    Does what goes around come around? Do you get what’s coming to you? Do you reap what you sow?
    ______________________________________________________

    Children are often heard whining to their parents, “But that’s not fair!”, to which their agitated parents reply, “Tough, life’s not fair.”

    With age you hear people express less and less surprise at life’s unfairness. We still whine about it, but we’re less surprised.

    Still, there’s some part of us that likes to believe the world should be fair. Psychologists call this kernel of teenage righteousness ‘the just-world hypothesis’. Here it is stated by Lerner and Miller (1978):

    “Individuals have a need to believe that they live in a world where people generally get what they deserve”

    This simple statement has all sorts of strange effects. Here’s a depressing one from Hafer and Begue (2005):

    “A woman is raped by a stranger who sneaks into her apartment while she takes out the garbage […] The rape victim described how several people (even one close friend) suggested that she was partly to blame, in one case because of her “negative attitude” that might have ‘attracted’ more ‘negativity’; in another, by choosing to live in that particular neighborhood.” (referring to: After Silence: Rape & My Journey Back)

    Clearly these are terrible, terrible judgements to make about someone who has been raped. But people still make these sorts of attributions in all sorts of situations. They think that ill people deserve their illness, that poor people deserve their poverty and so on.

    But why? What does the just-world belief do for people? Here’s what:

    “The belief that the world is just enables the individual to confront his physical and social environment as though they were stable and orderly. Without such a belief it would be difficult for the individual to commit himself to the pursuit of long-range goals or even to the socially regulated behavior of day-to-day life.” (Lerner & Miller, 1978)

    We naturally vary in the amount we believe in the just-world hypothesis, so not all of us are under the same delusion. But the bias does help to explain why some people continue to attribute blame where there is none.


  3. A friend asked me to put some common fallacies in context. Here goes:

    themurderofcrows:

    Anchoring- ‘When I’ve had cabbage on food before, it always tasted good. Therefore, if I always get food with cabbage on it, it will always taste good.’

    Gambler’s Fallacy- ‘Every time I’ve bought a cabbage, it’s always been fresh. So the next cabbage I buy will HAVE to be a bad cabbage, even though the chance of getting a fresh or bad cabbage is 50/50.’ (A quick simulation after this post shows why the streak doesn’t change those odds.)

    Reactivity - ‘I know my mother-in-law hates cabbage, so when she comes over, I’ll make it look like I hate cabbage.’

    Self-fulfilling Prophecy- ‘God, Maji made me this stupid cabbage soup for dinner, but I knew she was a horrible cook, so I knew the soup was going to taste horrible. And what do you know, it did, even though everybody else liked it!’

    Halo effect- ‘Maji likes cabbage, so obviously she eats healthy. I bet she exercises a lot too!’

    Mob mentality- ‘Oh man, it seems like everybody’s eating cabbage these days. I should probably have some too.’

    Reactance- ‘Maji’s such a bitch! She told me to eat all this cabbage, but I’m just gonna sit here and not eat any of it! That’ll show her!’

    Confirmation bias- ‘The last two times I ate cabbage, I found money in the laundry. I bet if I keep eating cabbage, I’ll keep finding money in the laundry! Screw further research!’

    Planning fallacy- ‘Yeah, I can totally eat this cabbage in five minutes… oh man, has it been an hour already?’

    Just-World Phenomenon- ‘Are you saying that bimbo Stacy choked on a cabbage last night at the party? Uggh, that slut knew she couldn’t eat a whole cabbage… she totally deserved it.’

    Self-Serving Bias- ‘When I make a really good cabbage soup, I bet it’s because I’m so damn good at cooking. When it’s bad, though, it’s just because the cabbage wasn’t fresh, or the stove is broken.’

    Bias blind spot - ‘Yeah, I’m a perfectly sane human being. Now eat my cabbage soup: it’s fantastic. Freshest ingredients possible; after all, I’ve been having nothing but spoiled stuff for the past couple of weeks, so according to the odds…’

    (Source: darktypes, via psychcomedy)
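    As a quick aside on the gambler’s fallacy item above: the joke is right that a run of fresh cabbages doesn’t make the next one any more likely to be bad. The short Python sketch below is just an illustration of that point (it isn’t from the quoted post; the 50/50 fresh-vs-bad odds are taken from the joke). It simulates a long series of independent 50/50 cabbages and checks how often the cabbage bought right after a streak of fresh ones turns out bad.

    ```python
    import random

    random.seed(42)          # fixed seed so the illustration is reproducible

    TRIALS = 100_000         # total cabbages bought
    STREAK = 5               # condition on having just seen this many fresh ones in a row

    after_streak = []        # outcomes of the cabbage bought right after a fresh streak
    run = 0                  # current streak of fresh cabbages

    for _ in range(TRIALS):
        fresh = random.random() < 0.5      # each cabbage is independently fresh with p = 0.5
        if run >= STREAK:                  # the previous STREAK cabbages were all fresh
            after_streak.append(fresh)     # record what happens on this "due for a bad one" draw
        run = run + 1 if fresh else 0      # extend or reset the streak

    share_bad = 1 - sum(after_streak) / len(after_streak)
    print(f"P(bad cabbage | {STREAK} fresh in a row) ~ {share_bad:.3f}")
    # Prints a value close to 0.5: the streak doesn't make a bad cabbage "due".
    ```

    Because each cabbage is an independent draw, the conditional share of bad cabbages after a streak comes out at roughly one half, which is exactly what the fallacy denies.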

  4. The psychology of gullibility

    Cutting down the dissonance: the psychology of gullibility

    Two years ago, 14-year-old Nathan Zohner, a student at Eagle Rock Junior High in Idaho Falls, announced on the Internet that he had circulated a petition demanding strict control of a chemical known as dihydrogen monoxide. This substance, he wrote, causes excessive sweating and vomiting, can be lethal if accidentally inhaled, contributes to erosion, and has been found in tumors of cancer patients. The student asked 50 people whether they supported the ban. Forty-three said yes, six were undecided, and only one knew that dihydrogen monoxide was… water.

    While embracing a ban on H2O seems more foolish than dangerous, this anecdote shows how quickly people embrace some kinds of ideas without subjecting them to critical scrutiny. The human propensity to accept ideas at face value—no matter how illogical—is the fertile soil in which pseudoscience grows. Beliefs in UFOs, astrology, extrasensory perception, palm reading, crystal therapy, or guardian angels do not meet scientific criteria for rational plausibility (such as experimental reproducibility or Karl Popper’s idea of falsifiability) and generally rely on anecdotes instead of hard evidence for support, though they may partake of scientific-sounding terms or rationales; all such concepts can be safely described as pseudoscience. Why do people embrace irrational belief systems even after repeated disconfirmation by scientists?

    It is easy to dismiss these ideas as amusing and eccentric, but in some situations they pose concrete dangers to individuals; they occasionally even affect society. Former First Lady Nancy Reagan revealed in her autobiography that she employed a psychic for seven years to schedule dates for important meetings; more recently, Hillary Rodham Clinton admitted to having imaginary conversations with Eleanor Roosevelt on the advice of New Age guru Jean Houston. These public figures are hardly alone in seeking answers from the stars and soothsayers; the persistence and popularity of such beliefs reflect the many perceived benefits in pseudoscience. Psychologists agree that all belief systems — astrology, Objectivism, religion — ease anxiety about the human condition, and provide the illusion of security, predictability, control, and hope in an otherwise chaotic world.

    Scott Lilienfeld, assistant professor of psychology at Emory University and consulting editor at the Skeptical Inquirer, identifies two major catalysts for the prevalence of pseudoscientific beliefs: the information explosion (often a misinformation explosion) and the low level of scientific literacy in the general population. He cites poll data indicating that only 7 percent of the population can answer basic scientific questions like “What is DNA?” or “What is a molecule?” And when science cannot provide answers, or when people refuse to accept a scientific explanation (such as when fertility treatments don’t work), pseudoscience often provides highly individualized explanations. “People believe in things like astrology because it works for them better than anything else,” says Herbert Gans, the Robert S. Lynd professor of sociology at Columbia. “Your own system is the most efficient one, whether it’s a guardian angel, a rabbit’s foot, or a God watching over you. And if it doesn’t work, there’s always an excuse for it.”

    Another reason people find pseudoscience plausible is a cognitive ability to “see” relationships that don’t exist. “We have an adaptive reflex to make sense of the world, and there is a strong motivation to do this,” says Lilienfeld. “We need this ability, because the world is such a complex and chaotic place, but sometimes it can backfire.” This outgrowth of our normal capacity for pattern recognition accounts for the "face on Mars" (a group of rocks that allegedly resembles a face) or the belief that a full moon causes an increase in the crime rate. When people believe in something strongly — whether it is an image on Mars or a causal interpretation of a chronological association — they are unlikely to let it go, even if it has been repeatedly discounted.

    In some cases, contradictory evidence can even strengthen the belief. As Leon Festinger and colleagues discussed in When Prophecy Fails, holding two contradictory beliefs leads to cognitive dissonance, a state few minds find tolerable. A believer may then selectively reinterpret data, reinforcing one of the beliefs regardless of the strength of the contradictory case. Festinger infiltrated a doomsday cult whose members were convinced the world was about to end; when the appointed date passed and the world went on as before, the cult attributed the planet’s survival to the power of their prayers. “When people can’t reconcile scientific data with their own beliefs, they minimize one of them—science—and escape into mysticism, which is more reliable to them,” says Dr. Jeffrey Schaler, adjunct professor of psychology at American University.

    Belief systems tend to respond to challenges according to this pattern, says Lilienfeld. When researching a cherished belief or coming across information about it, a person may process the data as if wearing blinders, registering only the affirming information. The malleability of memory compounds this effect. “Once you have a belief, the way you look at evidence changes,” says Tory Higgins, chair of the psychology department at Columbia, whose research specialty is mechanisms of cognition. “When you search your memory, you are more likely to retrieve information that will support it and avoid exposure to information that will disconfirm it. If you fail to avoid it, you attack the validity and credibility of the source, or categorize it as an exception.”

    Dr. Robert Glick, head of the Columbia Center for Psychoanalytic Training and Research, calls belief systems “societal pain relievers.” “People will recruit anything from their environment that will ensure and protect their safety,” he says. “It gives you a sense that you’re not alone, and helps ease feelings of being powerless.” Power—whether an increase in a person’s perceived power or an abdication of it—is a major component of pseudoscience, and Glick explains people’s relations to power in Freudian terms. He describes belief systems as a metaphoric representation of our parents, providing a release from authority and responsibility. “People have a built-in predilection that wishes for assistance and support. This is an extension of childhood, where there were always people around us who control our life. Beliefs like astrology and even religion are a projection that there are forces in the heavens that are like your parents.”

    While it may be fun to read horoscopes in the newspaper, can real harm come from believing strongly in pseudoscience? Lilienfeld advises citizens to consider how pseudosciences pose concrete threats by weakening critical thinking and minimizing a person’s sense of control and responsibility. For individuals, this phenomenon can translate into thousands of dollars wasted on quack remedies—not to mention the medical danger to patients who forgo more reliable treatments. The risks extend to the societal level. “We need to be able to sift through the information overload we’re presented with each day and make sound judgments on everything from advertising to voting for politicians,” Lilienfeld says.


    Gans offers a more forgiving point of view. “If someone believes strongly in something like guardian angels, and they’re not in a mental hospital, and we haven’t provided a better answer, why not?” says Gans. “But if you’re just sitting inside your house all day and say, ‘Well, my guardian angel is going to take care of everything,’ then that’s bad.” And perhaps not too far from supporting a ban on dihydrogen monoxide.

  5. The Last is Liked Best

    If it’s the last, you’ll like it the best. That is the finding of a new study published in Psychological Science, a journal of the Association for Psychological Science.

    "Endings affect us in lots of ways, and one is this ‘positivity effect,’" says University of Michigan psychologist Ed O’Brien, who conducted the study with colleague Phoebe C. Ellsworth.

    Graduation from college, the last kiss before going off to war: we experience these “lasts” with deep pleasure and affection - in fact, more than we may have felt about those places or people the day before.

    Even long painful experiences that end pleasantly are rated more highly than short ones ending painfully. 

    But does the last-is-best bias obtain in everyday life, with insignificant events? It does, the study found. Moreover, says O’Brien, it doesn’t even have to be a real last one to be experienced as best. “When you simply tell people something is the last, they may like that thing more.”
    (More on the study)


    Why is this so? The authors have a few theories.

    Among these: “It’s something motivational,” says O’Brien. “You think: ‘I might as well reap the benefits of this experience even though it’s going to end,’ or ‘I want to get something good out of this while I still can.’”

    Another, says O’Brien: “Many experiences have happy endings - from the movies and shows we watch to dessert at the end of a meal - and so people may have a general expectation that things end well, which could bleed over into these insignificant or unrelated judgments.” 

    The findings of what O’Brien humbly calls “our little chocolate test” could have serious implications. Professors marking the last exam may give it the best grade even if it’s not objectively better than the preceding ones. Employers may be inclined to hire the last-interviewed job applicant. Awareness of this bias could make such subjective judgments fairer. 

    Of course, endings don’t bring up only positive emotions, O’Brien notes. Often there’s also sadness about loss - that bittersweet feeling.

    If it’s bittersweet chocolate and the last one you think you’ll eat, however, chances are the taste will be sweet.

    "When The Last Is Best", Medical News Today

  6. Self-serving bias

    “We don’t see things as they are,” says a proverb. “We see things as we are.”

    Dawes (1990) proposes that this false consensus effect (our tendency to overestimate how widely others share our opinions and behaviors) may occur because we generalize from a limited sample, which prominently includes ourselves. Lacking other information, why not “project” ourselves; why not impute our own knowledge to others and use our responses as a clue to their likely responses?

    Most people are in the majority; so when people assume they are in the majority they are usually right.

    Also, we’re more likely to spend time with people who share our attitudes and behaviors and, consequently, to judge the world from the people we know.

    On matters of ability or when we behave well or successfully, however, a false uniqueness effect more often occurs (Goethals & others, 1991). We serve our self-image by seeing our talents and moral behaviors as relatively unusual. For example, those who use marijuana but use seat belts will overestimate (false consensus) the number of other marijuana users and underestimate (false uniqueness) the number of other seat belt users (Suls & others, 1988). Thus, we may see our failings as relatively normal and our virtues as relatively exceptional.

    To sum up, self-serving bias appears as self-serving attributions, self-congratulatory comparisons, illusory optimism, and false consensus for one’s failings.

    from: Social Psychology - David G. Myers

  7. Illusory Optimism…Ignorant Bliss will keep you Sane and Happy

    brainmtters:

    Your brain won’t allow you to believe the apocalypse could actually happen


    by Annalee Newitz

    You may love stories about the end of the world, but that’s probably because, deep down, you don’t believe it could ever happen. But that’s not because you’re realistic. It’s actually a quirk of the human brain, recently explored by a group of neuroscientists, which prevents us from adjusting our expectations about the future — even if there’s good evidence that bad things are about to happen.

    A group of researchers from Germany and the UK designed a fairly complex psychological test to determine how people planned for negative events in the future. First, they asked subjects about the likelihood of 80 different disturbing events happening, such as contracting a fatal disease or being attacked. After they’d recorded people’s responses, researchers told each subject the actual, statistical likelihood of such events happening. In some cases, people had overestimated the likelihood and in some cases they’d underestimated it.

    Read More

    (Source: io9.com, via psychologybits)

  8. We seek what confirms our ideas and avoid what doesn’t

    Selective exposure theory

    This theory of communication posits that individuals prefer exposure to arguments supporting their position over those supporting other positions.

    As media consumers have more choices about which outlets and content to expose themselves to, they tend to select content that confirms their own ideas and to avoid information that argues against their opinions.

    People don’t want to be told that they are wrong, and they don’t want their ideas challenged either. They therefore gravitate toward media outlets that agree with their opinions, so they do not come into contact with this form of dissonance, and then follow only the sources that match their attitudes on different subjects.

  9. We think we understand others better than they understand us

    Illusion of Asymmetric Insight

    We commonly believe that we understand others better than they understand us.

    The rationale for this stems from our external, objective viewpoint and the assumption that the other person has a significant blind self, whilst our own blind self is small.

    There is also asymmetry in the reverse situation — we believe we understand ourselves better than others understand us and may feel insulted if they try to show they understand us more than we do.

    The same effect happens for groups, where the in-group believes they understand out-groups better than out-groups understand them.

    Overall, this is a position where we generally assume we know more than others, perhaps because we know more about what we know.

    Research:
    Pronin et al. (2001) found that college roommates believed that they knew themselves better than their roommates knew themselves.

    Example:
    In an argument with another person you tell them what they are like in great detail because clearly they have very little self-knowledge. They argue back telling you things about yourself that are clearly wrong or that you knew anyway. How can people be so stupid?

    Using it:
    Be cautious about judging others and assumptions that they do not know themselves. When others try to read your mind, forgive them their foolishness. Do not be drawn into slanging matches.

    References: Pronin, Kruger, Savitsky, and Ross (2001)

  10. We attribute others’ behavior to dispositions rather than situations

    The tendency to attribute others’ behavior to enduring dispositions (e.g., attitudes, personality traits) stems from both:

    • Underestimation of the influence of situational factors
    • Overestimation of the influence of dispositional factors


    Possible explanations:

    • Behavior is more noticeable than situational factors
    • Too little weight is assigned to situational factors
    • People are cognitive misers
    • Richer trait-like language to explain behavior
