We like to think we're rational human beings.
In fact, we are prone to hundreds of proven biases that cause us to think and act irrationally, and even thinking we're rational despite evidence of irrationality in others is known as blind spot bias.
The study of how often human beings do irrational things was enough for psychologist Daniel Kahneman to win the Nobel Prize in Economics, and it opened the rapidly expanding field of behavioral economics. Similar insights are also reshaping everything from marketing to criminology.

Hoping to clue you - and ourselves - into the biases that frame our decisions, we've collected a long list of the most notable ones.
Affect heuristic
The way you feel filters the way you interpret the world.
Imagine, for instance, that the words rake, take, and cake blinked on a computer screen for 1/30 of a second.
Which would you recognize?
If you're hungry, research suggests that
all you see is cake.
Anchoring bias
People are overreliant on the first piece of information they hear.
In a salary negotiation, for instance, whoever makes the first offer establishes a range of reasonable possibilities in each person's mind. Any counteroffer will naturally react to or be anchored by that opening offer.
"Most people come with the very strong belief they should never make an opening offer," says Leigh Thompson, a professor at Northwestern University's Kellogg School of Management. "Our research and lots of corroborating research shows that's completely backwards. The guy or gal who makes a first offer is better off."
Confirmation bias
We
tend to listen only to the information that confirms our preconceptions - one of the many reasons it's so hard to have an intelligent conversation about climate change.
Observer-expectancy effect
A cousin of confirmation bias, here our
expectations unconsciously influence how we perceive an outcome. Researchers looking for a certain result in an experiment, for example, may inadvertently manipulate or interpret the results to reveal their expectations. That's why the "double-blind" experimental design was created for the field of scientific research.
Bandwagon effect
The probability of one person adopting a belief increases based on the number of people who hold that belief. This is a powerful form of groupthink - and it's
a reason meetings are so unproductive.
Bias blind spot
Failing to recognize your own cognitive biases is a bias in itself.
Notably, Princeton psychologist Emily Pronin has found that "
individuals see the existence and operation of cognitive and motivational biases much more in others than in themselves."
Choice-supportive bias
When you choose something, you tend to feel positive about it,
even if the choice has flaws. You think that your dog is awesome - even if it bites people every once in a while - and that other dogs are stupid, since they're not yours.
Clustering illusion
This is the tendency to see patterns in random events. It is central to various gambling fallacies, like the idea that red is more or less likely to turn up on a roulette table after a string of reds.
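The independence behind the roulette example is easy to check with a quick simulation - a hypothetical sketch, assuming an American wheel with 18 red pockets out of 38; the streak length and trial count are arbitrary choices for illustration:

```python
import random

def red_frequencies(trials=200_000, streak=3, p_red=18/38, seed=1):
    """Simulate roulette spins and compare the overall frequency of red
    with the frequency of red immediately after `streak` reds in a row."""
    rng = random.Random(seed)
    spins = [rng.random() < p_red for _ in range(trials)]
    # Collect only the spins that directly follow a full streak of reds.
    after_streak = [spins[i] for i in range(streak, trials)
                    if all(spins[i - streak:i])]
    return sum(spins) / len(spins), sum(after_streak) / len(after_streak)
```

Both frequencies come out near 18/38 ≈ 0.474: a run of reds tells you nothing about the next spin, which is exactly why the perceived pattern is an illusion.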
Conservatism bias
Where people believe prior evidence more than new evidence or information that has emerged. People were slow to accept the fact that the Earth was round because they maintained their earlier understanding that the planet was flat.
Conformity
This is the tendency of people to conform with other people. It is so powerful that it may lead people to do ridiculous things, as shown by the following experiment by Solomon Asch.
Ask one subject and several fake subjects (who are really working with the experimenter) which of lines B, C, D, and E is the same length as A. If all of the fake subjects say that D is the same length as A, the real subject will agree with this objectively false answer a shocking three-quarters of the time.
"That we have found the tendency to conformity in our society so strong that reasonably intelligent and well-meaning young people are willing to call white black is a matter of concern,"
Asch wrote. "It raises questions about our ways of education and about the values that guide our conduct."
Curse of knowledge
When people who are more well-informed cannot understand the common man. For instance, in the TV show "The Big Bang Theory," it's difficult for scientist Sheldon Cooper to understand his waitress neighbor Penny.
Decoy effect
A phenomenon in marketing where consumers have a specific change in preference between two choices after being presented with a third choice. Offer two sizes of soda and people may choose the smaller one; but offer a third, even larger size, and people may choose what is now the medium option.
Denomination effect
People are
less likely to spend large bills than their equivalent value in small bills or coins.
Duration neglect
When the duration of an event doesn't factor enough into the way we consider it. For instance,
we remember momentary pain just as strongly as long-term pain.
Availability heuristic
When people overestimate the importance of information that is available to them.
For instance, a person might argue that smoking is not unhealthy on the basis that his grandfather lived to 100 and smoked three packs a day, an argument that ignores the possibility that his grandfather was an outlier.
Empathy gap
Where people in one state of mind fail to understand people in another state of mind. If you are happy, you can't imagine why people would be unhappy. When you are not sexually aroused, you can't understand how you act when you are sexually aroused.
Frequency illusion
Where a word, name or thing you just learned about
suddenly appears everywhere. Now that you know what that SAT word means, you see it in so many places!
Fundamental attribution error
This is where you attribute a person's behavior to an intrinsic quality of her identity rather than the situation she's in. For instance, you might think your colleague is an angry person, when she is really just upset because she stubbed her toe.
Galatea effect
Where people succeed - or underperform -
because they think they should.
Halo effect
Where we take one positive attribute of someone and
associate it with everything else about that person or thing.
Hard-easy bias
Where people are overconfident on hard problems and not confident enough on easy problems.
Herding
People tend to flock together,
especially in difficult or uncertain times.
Hindsight bias
The tendency to see past events as having been predictable all along. Of course Apple and Google would become the two most important companies in phones - tell that to Nokia, circa 2003.
Hyperbolic discounting
The tendency for people to
want an immediate payoff rather than a larger gain later on.
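Behavioral economists often model this with Mazur's hyperbolic discount function, V = A / (1 + kD): a payoff A that arrives after delay D feels worth less the longer you have to wait. A minimal sketch - the impatience parameter k and the dollar figures below are illustrative, not from the article:

```python
def hyperbolic_value(amount, delay_days, k=0.05):
    """Perceived present value of `amount` received after `delay_days`,
    discounted hyperbolically with impatience parameter k (per day)."""
    return amount / (1 + k * delay_days)

# With k = 0.05, $100 in 30 days feels worth 100 / (1 + 1.5) = $40,
# which is why many people take $50 now over $100 later.
```

The steep early drop-off (relative to exponential discounting) is what makes the immediate payoff so hard to resist.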
Ideomotor effect
Where
an idea causes you to have an unconscious physical reaction, like a sad thought that makes your eyes tear up. This is also how Ouija boards seem to have minds of their own.
Illusion of control
The tendency for people to
overestimate their ability to control events, like when a sports fan thinks his thoughts or actions had an effect on the game.
Information bias
The tendency
to seek information when it does not affect action. More information is not always better. Indeed, with less information,
people can often make more accurate predictions.
Inter-group bias
We view people in our group differently from how we see someone in another group.
Irrational escalation
When people make irrational decisions based on past rational decisions. It may happen in an auction, when a bidding war spurs two bidders to offer more than they would otherwise be willing to pay.
Negativity bias
The tendency to put more emphasis on negative experiences rather than positive ones. People with this bias feel that "bad is stronger than good" and will perceive threats more than opportunities in a given situation.
Psychologists
argue it's an evolutionary adaptation - it's better to mistake a rock for a bear than a bear for a rock.
Omission bias
The tendency to prefer inaction to action, in ourselves and even in politics.
Psychologist Art Markman
gave a great example back in 2010:
The omission bias creeps into our judgment calls on domestic arguments, work mishaps, and even national policy discussions. In March, President Obama pushed Congress to enact sweeping health care reforms. Republicans hope that voters will blame Democrats for any problems that arise after the law is enacted. But since there were problems with health care already, can they really expect that future outcomes will be blamed on Democrats, who passed new laws, rather than Republicans, who opposed them? Yes, they can - the omission bias is on their side.

Ostrich effect
The decision to ignore dangerous or negative information by "burying" one's head in the sand, like an ostrich.
Outcome bias
Judging a decision based on the outcome - rather than on how exactly the decision was made in the moment. Just because you won a lot in Vegas doesn't mean gambling your money was a smart decision.
Overconfidence
Some of us are
too confident about our abilities, and this causes us to take greater risks in our daily lives.
Overoptimism
When we believe the world is a better place than it is,
we aren't prepared for the danger and violence we may encounter. The inability to accept the full breadth of human nature leaves us vulnerable.
Pessimism bias
This is the opposite of the overoptimism bias. Pessimists over-weigh the negative consequences of their own and others' actions.
Placebo effect
Where believing that something is happening helps cause it to happen. This is a basic principle of stock market cycles, as well as a supporting feature of medical treatment in general.
Planning fallacy
The tendency to underestimate
how much time it will take to complete a task.
Post-purchase rationalization
Making ourselves believe that
a purchase was worth the value after the fact.
Priming
Where being introduced to one idea makes you more readily identify related ideas.
Let's take an experiment as an example, again from
Less Wrong:
Suppose you ask subjects to press one button if a string of letters forms a word, and another button if the string does not form a word. (E.g., "banack" vs. "banner".) Then you show them the string "water". Later, they will more quickly identify the string "drink" as a word. This is known as "cognitive priming."

Priming also reveals the massive parallelism of spreading activation: if seeing "water" activates the word "drink", it probably also activates "river", or "cup", or "splash".
Pro-innovation bias
When a proponent of an innovation
tends to overvalue its usefulness and undervalue its limitations. Sound familiar, Silicon Valley?
Procrastination
Deciding to act in favor of the
present moment over investing in the future.
Reactance
The desire to
do the opposite of what someone wants you to do, in order to prove your freedom of choice.
Recency
The tendency to weigh the
latest information more heavily than older data.
Reciprocity
The belief that
fairness should trump other values, even when it's not in our economic or other interests.
Regression bias
People take action in response to extreme situations. Then when the situations become less extreme, they take credit for causing the change,
when a more likely explanation is that the situation was reverting to the mean.
Restraint bias
Overestimating one's
ability to show restraint in the face of temptation.
Salience
Our tendency to focus on the
most easily-recognizable features of a person or concept.
Scope insensitivity
This is where your willingness to pay for something doesn't correlate with the scale of the outcome.
From
Less Wrong:
Once upon a time, three groups of subjects were asked how much they would pay to save 2,000 / 20,000 / 200,000 migrating birds from drowning in uncovered oil ponds. The groups respectively answered $80, $78, and $88. This is scope insensitivity or scope neglect: the number of birds saved - the scope of the altruistic action - had little effect on willingness to pay.

Seersucker illusion
Over-reliance on expert advice. This has to do with the avoidance of responsibility. We call in "experts" to forecast when typically they have no greater chance of predicting an outcome than the rest of the population. In other words, "
for every seer there's a sucker."
Selective perception
Allowing our expectations
to influence how we perceive the world.
Self-enhancing transmission bias
Everyone shares their successes more than their failures. This leads to a false perception of reality and an inability to accurately assess situations.
Status quo bias
The tendency
to prefer things to stay the same. This is similar to loss-aversion bias, where people prefer to avoid losses instead of acquiring gains.
Stereotyping
Expecting a group or person to have certain qualities
without having real information about the individual. This explains the snap judgments
Malcolm Gladwell refers to in "Blink." While there may be some value to stereotyping, people tend to overuse it.
Survivorship bias
An error that comes from focusing only on surviving examples,
causing us to misjudge a situation. For instance, we might think that being an entrepreneur is easy because we haven't heard of all of the entrepreneurs who have failed.
It can also cause us to assume that survivors are inordinately better than failures, without regard for the importance of luck or other factors.
Tragedy of the commons
We overuse common resources because it's not in any individual's interest to conserve them. This explains the overuse of natural resources, opportunism, and any acts of self-interest over collective interest.
Unit bias
We believe that there is an optimal unit size, or a universally acknowledged amount of a given item that is perceived as appropriate. This explains why, when served larger portions, we eat more.
Zero-risk bias
The preference to reduce a small risk to zero rather than achieve a greater reduction in a larger risk.
This plays to our desire for complete control over a single, more minor outcome, over greater - but incomplete - control over a larger, more unpredictable outcome.
A lot of people should reread this regularly for a few months while trying to be aware of their own actions.
People make incredible amounts of bad decisions and do stupid things because of these biases.
We could also bring up the Milgram Experiment - people are more likely to do something stupid/wrong/bad when a figure of authority tells them to.
"The tendency for people to want an immediate payoff rather than a larger gain later on."
I don't get why people would do this. It's so obviously not the winning choice. But this shortsightedness - aiming for instant gratification, impatience, and lack of strategy - is why most people I know have less money than me even though their income is 2-4 times higher.
"We overuse common resources because it's not in any individual's interest to conserve them."
What I find mind-boggling is how people do this knowingly and how it's impossible to explain to them that this attitude is why our world is in such shit. I tend to conserve and save things whether they're mine or not, and people's reactions are very often something like, "Why do you do that? It's not you who's paying for it." I feel like punching them in the face because explanations just aren't making it through.
"This is also how Ouija boards seem to have minds of their own."
What are you trying to say? Is this some kind of bias against Ouija boards? On this website of all places...