ECO474 Behavioral Economics: Topics (with short discussions and linked works)

What follows is a list of behavioral topics (updated regularly) with studies, experiments, blogs, and related videos attached. Note that some of this information duplicates material found elsewhere on this site. Here is Wikipedia's List of Cognitive Biases; I will explore some of these below in more detail, with links to relevant studies. This is basically a central place to store everything, so you don't have to remember which topic was discussed during which week.

Books in the Popular Media Covering a Wide Range of BE Topics

Ariely, Dan. 2008. Predictably Irrational: The Hidden Forces That Shape Our Decisions. Harper Collins.

_____. 2010. The Upside of Irrationality: The Unexpected Benefits of Defying Logic at Work and at Home. Harper Collins.

_____. 2012. The (Honest) Truth About Dishonesty: How We Lie to Everyone, Especially Ourselves. Harper Collins.

Kahneman, Daniel. 2012. Thinking, Fast and Slow. Farrar, Straus and Giroux, New York.

The Chronicle of Higher Education on Kahneman's Influence 

Surowiecki, James. 2004. The Wisdom of Crowds: Why the Many Are Smarter Than the Few and How Collective Wisdom Shapes Business, Economies, Societies and Nations.

Taleb, Nassim Nicholas. 2007. The Black Swan: The Impact of the Highly Improbable. Random House.

Thaler, Richard and Cass Sunstein. 2008. Nudge: Improving Decisions about Health, Wealth, and Happiness. Yale University Press.

INDIVIDUAL DECISION MAKING

Rationality

Less Wrong Blog

Preference Reversal

Neuroscience, Cognitive Psychology

A page devoted to the mind

Our Fast and Slow Brains: The Automatic System versus the Reflective System

These systems are discussed at length in both Nudge and Thinking Fast and Slow. They relate to loads of stuff in Behavioral Economics including Evolution and Individual Decision Making -- Heuristics etc. See also Self Control as it relates to our ability to use our "slow brain" or System 2 or Reflective System.

Pupil Research: Kahneman and a graduate student of his (Jackson Beatty) ran experiments based on an article by Eckhard Hess in Scientific American, in which Hess found that pupil dilation is sensitive to mental effort. Here's a paper they did on Pupil Dilation Measures in Consumer Research: Applications and Limitations. What they were getting at was really the basis of humans having two "brain systems" -- what Kahneman calls System 1 and System 2. System 1 is our automatic system, and System 2 is the system we use when addressing more difficult problems. But the deal is that we are lazy. "A general 'law of least effort' applies to cognitive as well as physical exertion. The law asserts that if there are several ways of achieving the same goal, people will eventually gravitate to the least demanding course of action." As we become better at something -- as skill increases -- fewer brain regions are involved in the task.

Evolution as an Explanation for Our Actions and Beliefs

Paul Bloom, in a 2005 Atlantic Monthly article, discusses the evolutionary forces that shape humans' two concepts of causality. He presents an argument that "our inborn readiness to separate physical and intentional causality explains the near universality of religious beliefs" (Kahneman 2012).

Heuristics & Biases: According to Daniel Kahneman, "when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution."

Gigerenzer's work on "fast and frugal" heuristics

Anchoring: "The anchoring effect occurs when people consider a particular value for an unknown quantity before estimating that quantity. What happens is one of the most reliable and robust results of experimental psychology: the estimates stay close to the number that people considered -- hence the image of an anchor" (Kahneman, 2012).

Two mechanisms to produce anchoring:

Anchoring and Adjustment: A "strategy for estimating uncertain quantities: start from an anchoring number, assess whether it is too high or too low, and gradually adjust your estimate by mentally 'moving' from the anchor. The adjustment typically ends prematurely" (Kahneman 2012). (A toy simulation of this insufficient adjustment appears after the two mechanisms.)

For example, see papers by Robyn A. LeBoeuf and Eldar Shafir, such as "The long and short of it: physical anchoring effects," Journal of Behavioral Decision Making, Vol. 19(4): 393-406, 2006.

See also papers by Nick Epley and Tom Gilovich such as "The Anchoring-and-Adjustment Heuristic: Why the Adjustments are Insufficient" 2006 in Psychological Science.

Anchoring and Priming: Anchoring "is a case of suggestion. This is the word we use when someone causes us to see, hear, or feel something by merely bringing it to mind" (Kahneman 2012).

See papers by Thomas Mussweiler and Fritz Strack such as "Explaining the Enigmatic Anchoring Effect: Mechanisms of Selective Accessibility" 1997 in Journal of Personality and Social Psychology.
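
The "insufficient adjustment" idea can be illustrated with a toy simulation in Python (a sketch only -- the adjustment factor, the noise level, and the numbers are illustrative assumptions, not estimates from the papers above):

import random

def anchored_estimate(anchor, true_value, adjustment=0.6, noise=5.0):
    # Insufficient anchoring-and-adjustment: start at the anchor and move only
    # part of the way (adjustment < 1) toward one's own noisy, anchor-free guess.
    own_guess = true_value + random.gauss(0, noise)
    return anchor + adjustment * (own_guess - anchor)

random.seed(1)
true_value = 50
for anchor in (10, 90):  # a low and a high anchor for the same unknown quantity
    estimates = [anchored_estimate(anchor, true_value) for _ in range(10_000)]
    print(f"anchor = {anchor}: mean estimate = {sum(estimates) / len(estimates):.1f}")

Because the adjustment stops short, the two groups' mean estimates land around 34 and 66 rather than at the true value of 50 -- each is pulled toward the anchor it started from.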

Affect Heuristic: more from the Less Wrong blog, plus Paul Slovic's paper.

Availability: What do people do when they "wish to estimate the frequency of a category such as 'people who divorce after the age of 60' or 'dangerous plants'. . . Instances of the class will be retrieved from memory, and if retrieval is easy and fluent, the category will be judged to be large..." The availability heuristic is the "process of judging frequency by the 'ease with which instances come to mind'" (Kahneman, 2012). Here's a video example.

See papers by Norbert Schwarz et al., such as "Ease of retrieval as information: Another look at the availability heuristic" 1991 in Journal of Personality and Social Psychology.

Confirmation Bias: our automatic brain favors uncritical acceptance of suggestions and exaggerates the likelihood of extreme and improbable events.

Halo Effect (aka "exaggerated emotional coherence"): "The tendency to like (or dislike) everything about a person" (Kahneman 2012). Check out a great example here provided by Kahneman -- what do you think of Alan and Ben?

Alan: intelligent--industrious--impulsive--critical--stubborn--envious

Ben: envious--stubborn--critical--impulsive--industrious--intelligent

According to Kahneman, the halo effect is also an example of "suppressed ambiguity" -- the adjective stubborn is ambiguous and the context will determine how we interpret its meaning. Note that you probably thought it was a good characteristic in Alan, but a terrible characteristic in Ben.

Associations: Associative activation -- "ideas that have been evoked trigger many other ideas, in a spreading cascade of activity in your brain" (Kahneman, 2012).

Cognitive Ease: if we have seen something before, been exposed to it in the past (however briefly), encountered it repeatedly, or if it is made relatively easy to read or understand, we are more likely to remember it. No big deal, right? Well, it also tends to influence our decisions and preferences, making it less likely that we will make rational decisions. Want to sell some music on iTunes? Get the song on the radio. If it is played enough, people will like it, regardless of how good it is.

Robert Zajonc came up with the notion of the mere exposure effect. What is important about his research is that he showed that the effect of repetition on liking is a "profoundly important biological fact, and that it extends to all animals" (Kahneman, 2012).

Framing or Priming: Different ways of presenting the same information often evoke different emotions.

John Bargh's Experiment: Bargh asked students to assemble four-word sentences from a set of five words (e.g., "finds he it yellow instantly"). For one group of students, half of the scrambled sentences contained words associated with the elderly (Florida, forgetful, bald, gray, wrinkle). When they finished, participants were asked to complete another experiment down the hall, unaware that what was really being measured was the time it took them to get there. Those given the words that evoked old age walked more slowly. This notion that an idea can influence an action is called the "ideomotor effect."

Here are a bunch of videos of more of John Bargh's experiments on Priming.

Kathleen Vohs has run experiments that suggest that "living in a culture that surrounds us with reminders of money may shape our behavior and our attitudes in ways that we do not know about and of which we may not be proud" (Kahneman, 2012). Here is an example of one of her papers.

Melissa Bateson and Daniel Nettle conducted experiments on deterring litterbugs by posting a picture of staring eyes in a cafe. They show that when the eyes are displayed, people contribute more to the kitty and litter less!

Overconfidence: People's automatic responses don't care about the quantity or quality of the evidence available to them. Their level of confidence depends on the quality of the story they can tell about what they see (Kahneman 2012).

The Endowment Effect: In terms of economics, this is a critique of the standard assumption that all points on an indifference curve are equally attractive. In an indifference curve between time and money (labor v. leisure), what is missing is one's CURRENT income and leisure. That is the reference point, the status quo. Without pointing that out, we assume it doesn't matter. But, of course, it does.

Note that the loss-aversion part of prospect theory (see below) can explain the endowment effect.

While the mugs experiment is the typical example of the endowment effect, Knetsch shook it up a little: two classes filled out a questionnaire and were rewarded with a gift of either an expensive pen or a bar of Swiss chocolate. At the end of class, the experimenter showed each group the alternative gift and allowed anyone to trade... Only 10% did.

Another classic example of the endowment effect plays out in the housing market. In a normal housing market, people value their own houses more than is justified by what the market will pay (often about 12% over the market price). During a market downturn, the typical homeowner tries to sell their house for an average of 33% over market value (per Hersh Shefrin on NPR, March 30, 2008).

John List showed that the endowment effect goes away as trading experience increases.

Endowment Effects in Chimpanzees (Brosnan et al.).

Endowment Effect and Kids (Harbaugh et al.)

The Endowment Effect and Legal Issues

The Endowment Effect and Decision Making Under Poverty

Decision Making Under Uncertainty: What are people's attitudes toward risky alternatives? Kahneman wondered what rules govern people's choices between different simple gambles, and between gambles and sure things. "Simple gambles (such as '40% chance to win $300') are to students of decision making what the fruit fly is to geneticists" (Kahneman 2012).

Prospect Theory: This is Kahneman and Tversky's behavioral economic theory describing decisions between alternatives that involve risk, where the probabilities of outcomes are known. It modifies expected utility theory (Bernoulli figured out that people's choices are based not on dollar values but on the psychological values of outcomes -- their utilities; generally, people prefer to avoid risk and will discount the value of a gamble accordingly). The theory says that people make decisions based on the potential value of losses and gains rather than the final outcome, and that people evaluate these losses and gains using interesting heuristics. The model is descriptive: it tries to model real-life choices rather than optimal decisions.

St. Petersburg Paradox: Bernoulli was able to explain why people who are offered a gamble that has infinite expected value are willing to spend only a few dollars for it.

Problem with the solution to the St. Petersburg Paradox -- it does not take into consideration reference-dependence.
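
Here is a quick numerical sketch of the paradox and of Bernoulli's resolution, in Python. The payoff scheme (2^k dollars if the first heads arrives on flip k) is the standard textbook version, and for simplicity the log utility is applied to the payoff itself, ignoring initial wealth:

import math

# St. Petersburg gamble: flip a fair coin until the first heads; if it lands
# on flip k, the payoff is 2**k dollars.
# Expected dollar value: sum over k of (1/2**k) * 2**k = 1 + 1 + 1 + ... (diverges).
ev_30_terms = sum((0.5 ** k) * 2 ** k for k in range(1, 31))

# Bernoulli's resolution: value money with a concave (here, logarithmic) utility,
# so the expected utility of the gamble is finite.
expected_log_utility = sum((0.5 ** k) * math.log(2 ** k) for k in range(1, 200))
certainty_equivalent = math.exp(expected_log_utility)

print(f"expected value, first 30 terms: {ev_30_terms:.0f} (and still growing without bound)")
print(f"expected log utility: {expected_log_utility:.3f} (converges to 2 ln 2)")
print(f"certainty equivalent under log utility: about ${certainty_equivalent:.2f}")

The certainty equivalent works out to about $4 -- "only a few dollars," just as observed.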

Three cognitive features are at the heart of prospect theory (according to Kahneman 2012). They are 'operating characteristics' of our automatic brain:

1. Evaluation is relative to a neutral reference point, which is sometimes referred to as an "adaptation level."

2. A principle of diminishing sensitivity applies to both sensory dimensions and the evaluation of changes in wealth.

3. Loss aversion: when directly compared to each other, losses are felt more dramatically than gains. 

See the graph below (according to Kahneman, if prospect theory had a flag, this picture would be on it!). Note the features of the graph that relate to points 1 through 3 above: the graph has two distinct parts to the right and left of a reference point, it is S-shaped (which represents diminishing sensitivity for both gains and losses), and the two curves are not symmetrical.

 

Loss Aversion: To measure the extent of your aversion to loss, ask yourself the following question -- "What is the smallest gain that I need to balance an equal chance to lose $100?" This is called the "loss aversion ratio," and for many people, it is at least double. Check out some monkey research related to loss aversion!
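
To make the graph and the loss-aversion ratio concrete, here is a small Python sketch of a prospect-theory-style value function. The functional form and the parameters (alpha = beta = 0.88, lambda = 2.25) come from Tversky and Kahneman's 1992 follow-up estimates and are used here purely as illustrative numbers:

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    # Value of a gain or loss x, measured relative to a reference point of 0 (feature 1).
    if x >= 0:
        return x ** alpha            # concave for gains: diminishing sensitivity (feature 2)
    return -lam * (-x) ** beta       # convex and steeper for losses: loss aversion (feature 3)

for x in (-200, -100, 100, 200):
    print(f"v({x:+d}) = {value(x):+.1f}")

With alpha equal to beta, the magnitude of v(-x) is lambda (2.25) times v(x) at every x: losses are weighted a bit more than twice as heavily as equal-sized gains, which is exactly what the loss-aversion-ratio question above is measuring.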

There are many applications of prospect theory and loss aversion. Here are a few:

Bad is Stronger than Good by Baumeister, Bratslavsky, Finkenauer and Vohs discusses the salience of negative events over positive events.

Devin G. Pope and Maurice E. Schweitzer looked for loss aversion in professional golfers' performance on the PGA Tour. After examining over 2.5 million putts, they show that even the best golfers (including Tiger Woods) were more successful when putting for par than for a birdie!

Loss aversion and the law: David S. Cohen and Jack L. Knetsch, in "Judicial Choice and Disparities Between Measures of Economic Values," explore a wide range of entitlements in a variety of contexts and show that individuals value losses more than foregone gains.

Possibility versus Certainty: Even if people understand probability and/or can assign logical/reasonable weights to uncertain events in making decisions, they assign value in a non-linear way. If an event goes from impossible to possible (for example, a 0% chance to a 1% chance) people will over-value that possibility. If an event goes from possible to certain, people will also over-value that certainty (for example, from 98% chance to 100% chance).
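
One way to make this non-linearity concrete is the inverse-S-shaped probability weighting function from Tversky and Kahneman's later (1992) work. The Python sketch below uses their gamma = 0.61 estimate for gains; the exact numbers are for illustration only:

def decision_weight(p, gamma=0.61):
    # Inverse-S probability weighting: small probabilities are over-weighted,
    # near-certain probabilities are under-weighted.
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

for p in (0.00, 0.01, 0.02, 0.50, 0.98, 0.99, 1.00):
    print(f"p = {p:.2f} -> decision weight = {decision_weight(p):.3f}")

Going from impossible (0%) to possible (1%) moves the decision weight from 0 to about 0.055, and going from 99% to certain (100%) moves it from about 0.91 to 1 -- both jumps are far larger than the one-percentage-point change in actual probability (the possibility and certainty effects).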

Allais's Paradox

Allais's Paradox on the Less Wrong Blog.

Summary of Prospect Theory: The Fourfold Pattern of Preferences

Representativeness: first proposed by Amos Tversky and Daniel Kahneman, who defined representativeness as "the degree to which [an event] (i) is similar in essential characteristics to its parent population and (ii) reflects the salient features of the process by which it is generated" (Kahneman & Tversky, 1982, p. 33). When people rely on representativeness to make judgments, they are likely to judge wrongly, because the fact that something is more representative does not make it more likely (Tversky & Kahneman, 1982). Here's a video example. Also related is the Conjunction Fallacy.

Two sins of representativeness (Kahneman 2012):

1. an excessive willingness to predict the occurrence of unlikely (low base-rate) events

2. insensitivity to the quality of evidence.

Here are a couple of interesting real-world experiments testing people's representativeness fallacies (the problems are just like the famous "Linda Problem" that identifies the conjunction fallacy):

Hsee, Christopher: "Less is Better: When Low-value Options Are Valued More Highly than High-value Options" Journal of Behavioral Decision Making, Vol. 11, 1998.

List, John: "Preference Reversals of a Different Kind: The 'More is Less' Phenomenon."

Base-rate neglect: we ignore relevant background information -- the base rates -- when judging how likely a particular case is.
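
To see what the neglected information would have contributed, compare the intuitive answer with the one Bayes' rule gives. The Python sketch below uses numbers in the spirit of Kahneman and Tversky's famous cab problem; they are illustrative only:

def posterior(base_rate, hit_rate, false_alarm_rate):
    # Bayes' rule: P(hypothesis | evidence).
    p_evidence = hit_rate * base_rate + false_alarm_rate * (1 - base_rate)
    return hit_rate * base_rate / p_evidence

# 15% of the cabs in town are Blue (the base rate), and a witness who reports
# "Blue" is right 80% of the time (and wrong 20% of the time).
print(f"P(cab was Blue | witness says Blue) = {posterior(0.15, 0.80, 0.20):.2f}")

The answer is about 0.41 -- far below the roughly 0.80 that people tend to report when they go with the witness's reliability and neglect the base rate.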

Intertemporal Choice

Self Control & Willpower

There is a great new book out: Willpower: Rediscovering the Greatest Human Strength by Roy F. Baumeister and John Tierney. This stuff links with Kahneman's System 1 and System 2 stuff mentioned above and in his book Thinking, Fast and Slow. But one of the most interesting links is that "self-control and deliberate thought apparently draw on the same limited budget of effort" (Kahneman, 2012). This means that if I am trying to diet AND think hard about some problem at hand, I'm going to have a difficult time. My ability to choose the salad over the pizza (self control and willpower) will be diminished after I've spent some time working on difficult computer code (using my System 2 brain). According to Kahneman, "People who are cognitively busy are more likely to make selfish choices, use sexist language, and make superficial judgments in social situations." This holds for cognitive AND physical activity. It makes us tired.

Self Control and Cognitive Aptitude:

Walter Mischel and the Marshmallow Experiment

Shane Frederick and the Cognitive Reflection Test

Experienced Utility versus Decision Utility: On Memory, Happiness and Pain. Essential conclusions: people's evaluations of their lives and their actual experience may be related, but they are also different.

Kahneman on Youtube: the riddle of experience v. memory.

Study by Daniel Kahneman and Donald Redelmeier about how colonoscopy patients remember the pain of the procedure, and how that memory can be manipulated (to dim the memory of the pain) so that patients aren’t reluctant to return for their next colonoscopy.

Here's the Freakonomics Radio story about the study. And here's the Freakonomics Blog that has more of Redelmeier's research on PAIN (and lots of hockey references!)! The New York Times highlighted Redelmeier's cool research.

Here's the study itself: Memories of colonoscopy: A randomized trial.

Kahneman: experienced utility and objective happiness

Kahneman and Thaler: Anomalies: Utility Maximization and Experienced Utility

Experienced Well-Being

SOCIAL/COLLECTIVE DECISION MAKING

Decorrelate Error: One thing that groups can do that individuals cannot is "decorrelate error." Each individual's judgment contains error, but if you look at a group of individuals whose errors are independent of one another, the errors tend to cancel when the judgments are averaged, so the group's average judgment is more accurate than the typical individual's. This was one of the important ideas discussed in James Surowiecki's book The Wisdom of Crowds (see above).
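
A quick simulation of this claim, in Python (the true value, the error distribution, and the group sizes are arbitrary illustrative choices):

import random

random.seed(0)
true_value = 100.0

def individual_guess():
    # One person's judgment: the truth plus an independent, zero-mean error.
    return true_value + random.gauss(0, 20)

for group_size in (1, 10, 100, 1000):
    misses = [abs(sum(individual_guess() for _ in range(group_size)) / group_size - true_value)
              for _ in range(2000)]
    print(f"group of {group_size:4d}: average miss of the group mean = {sum(misses) / len(misses):5.2f}")

The group's average judgment misses by roughly 1/sqrt(n) as much as a lone individual's does: because the errors are independent and unbiased, they largely cancel when the judgments are averaged.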

Altruism, Fairness, Trust, Reciprocity, Punishment and Cooperation

PUBLIC POLICY, LAW AND BEHAVIORAL ECONOMICS