Behavioural economics integrates psychology and sociology into economics. It is based on empirical research into human behaviour rather than casual observation.
People frequently make poor decisions in situations that involve:
- complicated calculations
- risk and uncertainty
- tradeoffs between present and future
We use two distinct systems in decision making:
- the Automatic system: used for quick, simple decisions (thinking fast)
- the Reflective system: used for more complicated problems (thinking slow)
The problem is that we often rely on the wrong system for the situation at hand.
Status Quo Bias
We are likely to stay with our current option, even if it was chosen for us.
E.g. “I figured out this job wasn’t the best for me, but it pays the bills and I hate looking for a job”
- It is very important to define appropriate default options to ensure people are “sticking” with the best possible option.
- Once a position has been “stuck” to, we need to push that much harder to change it.
Loss Aversion (Endowment Effect)
Humans do not like losing things, even things they don’t hold dear. The perceived value of an item goes up the moment it is “owned”.
E.g. “I wasn’t overjoyed to get a work branded travel mug, but, no, I wouldn’t give it to you!”
- We need to frame decisions to minimize the perceived losses
- Our incentives must be set up so that a net gain is perceived
Anchoring: Given a point of reference, our guesses and estimates will be affected by it.
E.g. “The average car costs $45k, how much do you think car XYZ costs?”
- Anchor our propositions accordingly to give them the best shot at being perceived as good.
Framing: The way information is presented affects how it is perceived. We tend to question our impression of the information rather than the information itself.
E.g. “90% chance of survival after 5 years” and “10% chance of death within 5 years” say the same thing, but one looks considerably worse.
- Frame information to support your argument, whether to confirm or refute.
Availability: The most recent information we have, or the information most closely related to us, colours our perception.
E.g. “I just bought a Volvo, now I see them everywhere!”
- Make sure the information put forward includes nuggets that could be useful for future communication efforts.
- Relate to things really near and dear to people.
Salience: The information with the most impact is likely to stay with us, even if it’s about fringe occurrences.
E.g. “Driving on ice is dangerous” is not as impactful as “50% of accidents last winter were due to ice”
- Find a hook to support your argument, something that will stick
Choice overload: Given too many choices, we are more likely to give up choosing and move on to something else.
E.g. When choosing between 3 brands of ketchup, we know which we like best. If presented with 90, we may decide to buy some next time instead.
- Include a limited number of choices to avoid overwhelming
Feedback: Offering feedback as a decision is made makes us feel good (or bad) about our choices, and may inform repeated behaviours.
- Always include feedback after any interaction, to encourage future interactions.
Mapping: Presenting information in a way that is easier to understand makes it more likely to have an impact.
- “Map” information in easily understood units such as dollars
Present bias: We care much more about the present than we do about the future. Trading off short-term benefits for long-term ones is hard to do.
E.g. Eating junk food has an immediate positive impact and a long-term negative one. We often ignore the long-term one: “future us will take care of it”.
- Given asynchronous costs and benefits, we need to work harder to justify present-day effort for a long-term reward.
- We need to provide more encouragement to stick with such a decision, by clarifying long term impact or, even better, by finding a way to put attainable milestones along the way.
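This tension between present and future has a standard formal model in behavioural economics: quasi-hyperbolic (“beta-delta”) discounting. The sketch below is illustrative only; the parameter values (beta = 0.7, delta = 0.99) are assumptions, not figures from these notes. It shows the signature effect: we take the smaller, immediate reward today, yet prefer to wait for the larger reward when the same trade-off sits a month away.

```python
# A minimal sketch of quasi-hyperbolic ("beta-delta") discounting,
# a standard model of present bias. Parameter values are illustrative.

def discounted_value(reward, delay, beta=0.7, delta=0.99):
    """Perceived value today of `reward` received after `delay` periods.

    beta < 1 applies an extra penalty to ALL future rewards,
    on top of ordinary exponential discounting by delta.
    """
    if delay == 0:
        return reward  # the present is not discounted at all
    return beta * (delta ** delay) * reward

# Choice 1: $100 today vs. $110 tomorrow -> the immediate reward wins.
today = discounted_value(100, 0)        # 100.0
tomorrow = discounted_value(110, 1)     # ~76.2

# Choice 2: the same trade-off pushed 30 days out -> preferences reverse,
# even though nothing changed except the shared delay.
day30 = discounted_value(100, 30)
day31 = discounted_value(110, 31)

print(today > tomorrow)   # True: grab the smaller reward now
print(day30 < day31)      # True: once both are in the future, we wait
```

Attainable milestones along the way work, in this model’s terms, by converting part of the distant reward into near-term rewards that escape the beta penalty.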
Confirmation bias: We tend to focus on information that confirms our beliefs.
- Much more convincing information is needed to change people’s beliefs.
- In our own work, we should actively challenge our perception to make sure we’re not unreasonably set in our ways. Build a solid counter argument and go from there.
Social influence: We are affected by what other people think, or even more so, by what we think other people will think. We also tend to do what we think most people are doing.
- Present information in a way that makes it seem that most people are already displaying the desired behaviours
Rather than answering a “hard question”, people often think of an easier question and “adjust” its answer to answer the harder one.
Overconfidence and optimism
People tend to overestimate the likelihood of positive results in general, and to believe they will land in that positive slice of possibilities.
Choice architecture
In most situations, the presented choices must have some structure, and that structure matters a lot to people’s decisions.
By structuring the choices in a specific way to get specific results, policy makers act as choice architects.
Back to nudges
Nudges do not forbid options, they allow freedom of choice, but they point the way should the person making the decision not have a firm opinion.
Good nudges can have high benefits with low costs while simplifying policies and programs.
Behavioural economics supports, not supplants, conventional economics: it gives us additional tools.
Behavioural biases of policy makers
Let’s not rock the boat – Status quo bias
- Policy makers cling to the prevailing assumptions.
- They stick with the first idea that was found to “make sense”.
- Accepting new, conflicting information doesn’t come easy.
- Given the possibility of groupthink, the status quo may be defended even more bitterly.
I still think that my idea is better – Endowment effect
- Putting effort into something gives it a higher perceived value than is warranted.
- “It’s mine, I worked hard on it, therefore it is the best course of action”
Here is another fact that supports what I said – Confirmation bias
- Selectively focusing on information that validates the position the policy maker argues for.
- Unreasonable discounting of contrary information.
- Can lead to catastrophic failure as more and more contrary information piles up.
The last thing we need is another X situation – Availability and Salience biases
- High profile events have a disproportionate influence, skewing the perception of the real situation at hand.
My gut tells me to go with option A – Overconfidence and optimism
- People are more likely to remember when they were right than when they were wrong.
- They are overly optimistic about their abilities and policies.
Dealing with those biases
- Awareness of the biases is the first step.
- Build in a review process to look critically at the actual impacts.
- Consult regularly with stakeholders.
- Use cost-benefit analysis and other formal evaluation techniques.
Starting with a brain teaser to illustrate integrative thinking:
You are driving your 2-seater sports car in the pouring rain. Stopping at a red light, you notice three individuals at the bus stop:
- An old friend, who saved your life years ago;
- An elderly woman, who clearly needs to go to the hospital; and
- The person of your dreams
What do you do?
Integrative thinking is:
the ability to constructively face the tensions of opposing models and, instead of choosing one at the expense of the other, to generate a creative resolution of the tension in the form of a new model that contains elements of the individual models, but is superior to each.
It can be learned.
It’s an iterative process comprising 4 steps:
- Salience: More features of the problem are considered salient
- Causality: Multidirectional and non-linear
- Architecture: Whole problem is always kept in mind while working on the individual parts
- Resolution: Search for creative resolution of tensions
Integrative thinkers:
- Make problems “messier” by taking a broader view of what is salient
- Welcome complexity and are open to non-linear correlations
- Work on independent pieces, but never split off from the big picture
- Refuse to accept unpleasant trade-offs and look for other solutions (which may cause delays)
Don’t assume everyone shares your view of the world.
[I]s powered by a thorough understanding, through direct observation, of what people want and need in their lives …
– Tim Brown, CEO of IDEO
Three spaces of design thinking
- Inspiration: accumulate information
- Ideation: prototype many possibilities
- Implementation: implement the chosen design
- Starting with user needs
- Designing with data
- Focusing on innovation
- Fail fast, early, and cheap
Design thinking: Business innovation
It comprises four basic steps:
- Preliminary immersion: understanding problem, scope and boundaries
- In-depth immersion: better understand the needs
- Ideation: generate innovative ideas and select the ones that meet the goals
- Prototype: Fail early, fail often to succeed
Design thinking: Stanford d.school
An iterative process consisting of 5 steps:
- Empathize: We need to understand. Cross-examine filters used. Suspend judgment
- Define: Frame the problem statement. Words used matter, get to the core UX.
- Ideate: Create and consider many options
- Prototype: Refine, nurture, evolve, scrap/combine as needed
- Test: New ideas may be fragile. Design a good test. Commit to a choice
Wicked problems
- Have no definitive formulation.
- Hard, maybe impossible, to measure or claim success
- Solutions can only be good or bad, not true or false. There is no idealized end state to arrive at. The goal is improvement, not solution.
- There is no template to follow
- There is always more than one explanation. They vary by individual perspective and bias
- Every wicked problem is a symptom of another problem. Interconnectedness abounds
- No mitigation strategy has a definitive scientific test
- “Solutions” are frequently a “one shot” design effort as they may change the design space enough to limit trial and error
- They are all unique
- Designers must be fully responsible for their actions
Suggested readings
Akerlof, George and Rachel Kranton (2011) Identity economics: How our identities shape our work, wages and well-being
Ariely, Dan (2009) Predictably irrational: The hidden forces that shape our decisions
Berger, Jonah (2013) Contagious: Why things catch on
Cassidy, John (2010) How markets fail: The logic of economic calamities
Chabris, Christopher and Daniel Simons (2011) The invisible gorilla: How our intuition deceives us
Harford, Tim (2012) Adapt: Why success always starts with failure
Heath, Chip and Dan Heath (2007) Made to stick: Why some ideas survive and others die
Heffernan, Margaret (2011) Wilful blindness: Why we ignore the obvious at our peril
Kahneman, Daniel (2011) Thinking, fast and slow
Kelley, Tom (2005) The ten faces of innovation: IDEO’s strategies for driving creativity throughout your organizations
Kelley, Tom and David Kelley (2013) Creative confidence: Unleashing the creative potential within us all
Kieboom, Marlieke (2013) Lab matters: Challenging the practice of social innovation laboratories
Sunstein, Cass (2013) Simpler: The future of government
Surowiecki, James (2005) The wisdom of crowds
Thaler, Richard and Cass Sunstein (2009) Nudge: Improving decisions about health, wealth, and happiness
Martin, Roger (2009) The opposable mind: Winning through integrative thinking
Brown, Tim (2009) Change by design: How design thinking transforms organizations and inspires innovation
Liedtka, Jeanne, Andrew King and Kevin Bennett (2013) Solving problems with design thinking: Ten stories of what works
Martin, Roger and Karen Christensen (2013) Rotman on design: The best on design thinking from Rotman magazine
Vianna, Mauricio et al. (2012) Design thinking: Business Innovation