Tversky & Kahneman (1974) emphasized the use of heuristics to make up for our limited memory, cognitive capacity, and time.
1.1 Strategies that can be applied easily to a wide variety of situations and often lead to reasonable conclusions.
1.1.1 They provide plausible conjectures, but not irrefutable conclusions.
1.2 Availability Heuristic
1.2.1 Decisions based on the most "available" memories
Judgments based on ease with which relevant instances can be retrieved from memory.
– E.g., estimate in seven seconds how many flowers, or how many Russian novelists, you could name in two minutes.
1.2.2 Preference for recent anecdotal evidence
1.2.3 Is the letter 'r' more commonly the first or the third letter in words? (It is actually more common in third position, but words beginning with 'r' are easier to retrieve, so most people guess first.)
1.2.4 Media coverage makes certain causes of death seem more likely than others.
1.2.5 Generally, we become less confident in a decision when asked to produce more arguments in its support (Bless & Pham).
1.3 Representativeness Heuristic
1.3.1 If something or someone appears to fit a category, you will use what you know about that category to make judgments.
1.3.1.1 Coin flips and subset judgments
1.3.2 We should take statistics seriously
We aren’t good with probabilities.
– Overconfidence takes over and we tend to think we can
beat the odds
– “statistics happen to other people.”
In risky financial markets this can get people into a lot of trouble.
E.g., most people lose their money in futures markets
– but the spectacular profits that can be gained draw in people who believe they will be the ones to win.
2.1 Overconfidence
2.1.1 Confidence in decisions climbs as more information is obtained, even if the information is dubious.
This bias is greater in more difficult tasks.
– Estimating our potential productivity.
– “I can do the assigned paper in 3 hours, no problem”
However, an under-confidence bias may be even more problematic.
– We may never make any decisions.
2.2 Loss Aversion
2.2.1 We weigh the prospect of losses more heavily: sell gains, hold losses.
2.2.2 Kahneman and Tversky (1979) called this prospect theory.
2.2.3 The Endowment Effect
2.2.3.1 We place a higher value on what's ours. – The bias may be adaptive because losses could threaten survival.
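Loss aversion can be sketched with the value function from prospect theory. A minimal sketch, using the median parameter estimates Tversky & Kahneman reported in 1992 (alpha = beta = 0.88, lambda = 2.25); exact values vary by study:

```python
# Sketch of the prospect-theory value function.
# alpha/beta capture diminishing sensitivity; lam is the loss-aversion
# coefficient (losses loom roughly 2.25x larger than equal gains).

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of a gain or loss x, relative to the reference point."""
    if x >= 0:
        return x ** alpha              # gains: concave
    return -lam * ((-x) ** beta)       # losses: convex and steeper

print(value(100))    # ~57.5: subjective value of a $100 gain
print(value(-100))   # ~-129.5: the same-sized loss hurts over twice as much
```

The asymmetry is the sense in which we "weigh the prospect of losses more heavily": |value(-100)| > value(100).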
2.3 Framing of the Problem
2.3.1 Framing Effects
2.3.1.1 We judge choices by comparing them to others in the same category.
2.3.1.1.1 Marketers often use products that no one wants as decoys, so that the option they want to sell compares favorably.
2.3.1.2 We tend to ignore base rates, even when they are stated explicitly.
Implications of this analysis:
Testing the whole population for HIV may kill more people than it saves: at a low base rate, false positives vastly outnumber true positives.
Should you get a full-body scan that can randomly look for many diseases?
– Initial diagnosis effectively raises base-rate, thus makes specific tests more accurate.
Should we develop nation-wide databases for fingerprints and DNA?
– Only if we understand limitations.
– E.g., a man from the US state of Oregon whose fingerprints wrongly matched prints recovered after the 2004 Madrid train bombing.
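The base-rate point can be checked with Bayes' rule. A minimal sketch; the prevalence, sensitivity, and false-positive numbers are illustrative assumptions, not figures from these notes:

```python
# Posterior probability of having a condition after a positive test.
# All numbers below are illustrative, not real HIV-test statistics.

def posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test), via Bayes' rule."""
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# Screening an entire population (base rate 0.1%):
print(posterior(0.001, 0.99, 0.01))  # ~0.09: most positives are false alarms

# After an initial diagnosis raises the base rate to 30%:
print(posterior(0.30, 0.99, 0.01))   # ~0.98: the same test is now informative
```

This is the sense in which an initial diagnosis "effectively raises the base rate" and makes subsequent specific tests more accurate.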
2.3.2 We favor the guaranteed option when it is framed as a gain, and the risky option when it is framed as a loss.
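The gain/loss asymmetry is easiest to see in Tversky & Kahneman's (1981) "disease" problem, where every option has the same expected outcome and only the wording differs. A quick check using their 600-person scenario:

```python
# Expected lives saved under each framing of the classic "disease" problem.

def expected_value(outcomes):
    """outcomes: list of (probability, lives_saved) pairs."""
    return sum(p * n for p, n in outcomes)

sure_gain  = expected_value([(1.0, 200)])             # "200 will be saved"
risky_gain = expected_value([(1/3, 600), (2/3, 0)])   # "1/3 chance all 600 saved"
sure_loss  = expected_value([(1.0, 600 - 400)])       # "400 will die"
risky_loss = expected_value([(2/3, 0), (1/3, 600)])   # "2/3 chance all 600 die"

print(sure_gain, risky_gain, sure_loss, risky_loss)   # all ~200 lives saved
```

Most people pick the sure option in the gain frame and the gamble in the loss frame, even though all four expected values are identical.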
2.3.3 Framing affects important decisions, like organ donation (opt-in vs. opt-out).
2.3.3.1 Evidence of the effect of unimportant information
2.4 Influence of Faulty Information
2.5 Distortions in Judgments
2.6 Status Quo Bias
2.6.1 May be maintained by loss aversion
3.1 We have limited memory, cognitive capacity, and time, so we make the best decisions we can rather than the best that are possible.
3.1.1 We pick up a lot of valid information from the environment.
4 Problems with Expected Utility Theory
4.1 Often doesn't fit empirical data
– Leads to various paradoxes, e.g., the "sunk cost" fallacy.
4.2 Probabilities and utilities may be subjective, based on our own experience
– Could represent individual beliefs.
– Savage (1954) developed subjective expected utility theory.
4.3 Can think of expected utility theory as a normative theory
– What people should do, given certain assumptions.
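The normative computation itself is one line. A minimal sketch; the log utility function is a standard textbook choice for a risk-averse agent, not something specified in these notes:

```python
import math

def expected_utility(gamble, u=math.log):
    """Expected utility of a gamble: sum of p(outcome) * u(outcome)."""
    return sum(p * u(x) for p, x in gamble)

# A 50/50 gamble between $100 and $10,000 vs. its expected value for sure:
gamble = [(0.5, 100), (0.5, 10_000)]
ev = sum(p * x for p, x in gamble)            # $5,050

print(expected_utility(gamble))               # ~6.91
print(expected_utility([(1.0, ev)]))          # ~8.53: the sure amount wins
```

With any concave u, the sure amount beats a gamble of equal expected value; the paradoxes above arise because real choices (sunk costs, loss aversion, framing) do not follow this rule.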