Debiasing Decisions: Combat Confirmation Bias, Overconfidence Bias

Cognitive and behavioral biases can contribute to “blind spots” in decision-making, leading to less effective outcomes.
To improve decision outcomes, University of Marburg’s Philip Meißner, Torsten Wulf of HHL Leipzig Graduate School of Management, and HEC’s Olivier Sibony proposed a systematic checklist to identify decisions at risk of derailment by bias, along with rapid remedies.

They argue that two types of bias contribute to most decisions that lead to undesirable results:

  • Confirmation bias, the unconscious tendency to believe new information that is consistent with existing beliefs and recent experiences, and to discount contradictory data,
  • Overconfidence bias, the unconscious tendency to overestimate one’s skills, insights, and judgment.
    This leads to increased risk-taking based on illusory certainty about the decision and about one’s ability to mitigate adverse outcomes.

Previously, Dan Lovallo and Sibony articulated four related decision biases:

  • Pattern-recognition biases, countered by changing the “angle of vision,”
  • Action-oriented biases, mitigated by recognizing uncertainty,
  • Interest biases, minimized by explicitly highlighting these interests,
  • Social biases, reduced by depersonalizing debate.

Debiasing techniques such as checklists can limit the negative effects of biases in decision-making by imposing a disciplined, comprehensive analysis of downside risks and by systematically considering multiple viewpoints.

However, effectively implementing checklists requires consistent discipline, noted Harvard’s Atul Gawande, who cited examples of partial adherence leading to costly oversights and failures.

One approach, suggested by Princeton’s Daniel Kahneman and Gary Klein of Applied Research Associates, is a “premortem.”
Decision makers imagine that the decision has failed and analyze sources and reasons for adverse outcomes, to more thoroughly assess points of failure and possible mitigation strategies.
Formal scenario-planning is another way to expose assumptions underlying a plan, as well as a competitor’s priorities and potential strategy.

Using a variety of debiasing techniques significantly increased Return on Investment (ROI) in a study by University of Sydney’s Massimo Garbuio and Dan Lovallo with Olivier Sibony of HEC.
As a result, Michael Birshan, Ishaan Nangia, and Felix Wenger of McKinsey argued that debiasing techniques should be embedded in formal organizational decision-making processes, particularly for high-impact, recurring decisions.

Decision biases may be out of awareness, or unconscious, so it is more effective to evaluate the process used to develop a proposal rather than focusing only on its content and merits.

Decision-making safeguards can be built into standard analysis processes by including questions that probe for:

  • Multiple data sources,
  • Diverse opinions and perspectives,
  • Downside risk,
  • Potential negative outcomes for company, industry, and broader ecosystem.

Proposals are considered ready for a decision only when multiple perspectives are available to mitigate confirmation bias and risk analysis is available to reduce overconfidence bias.
Responses to decision checklist questions can be quantified to indicate one of four action steps, according to Daniel Kahneman:

  • Decide, based on inclusion of robust safeguards against both confirmation bias and overconfidence bias,
  • Reach out, suggesting the need to gather additional perspectives and opinions to widen narrow assumptions and reduce confirmation bias.
    The Vanishing-Options Test, proposed by Stanford’s Chip Heath and Dan Heath of Duke University, can generate new ideas by imagining that none of the current proposals is available.
  • Stress-test, by conducting a premortem or an analysis by an external devil’s advocate or provocateur, to reduce overconfidence risk,
  • Reconsider, when both more perspectives and risk analysis are required to reduce both confirmation bias and overconfidence bias.
    This screening matrix helps reduce related decision-making biases:
  1. Self-interest Bias
    -To what extent is the proposal motivated by self-interest?

-Assess for over-optimism

  2. Affect Heuristic
    -How strong is the team’s emotional attachment to a specific proposal?
    -To what extent were risks and costs fully considered for both preferred and non-preferred options?

-Assess for strongly-preferred outcomes
-Reintroduce analysis of all options

  3. Groupthink
    -How many dissenting opinions were analyzed?
    -How adequately were all options explored?
    -Was dissent discouraged? 

-Encourage substantive disagreements as a valuable part of the decision process
-Solicit dissenting views from members of the recommending team, through private meetings

  4. Saliency Bias
    -To what extent is the decision based on a memorable but potentially incomparable success?
    -What about the proposed analogy is comparable to the current situation?
    -What are relevant examples from less successful companies? What happened in those cases?

-Carefully scrutinize analogies’ similarity to the current decision situation
-Solicit additional analogies using reference class forecasting:

  • Select a reference class,
  • Assess the distribution of outcomes,
  • Intuitively estimate the project’s position in the distribution,
  • Assess the estimate’s reliability,
  • Correct the intuitive estimate.
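The five steps above can be sketched as a simple calculation. This is only an illustrative sketch: the function name and the linear-correction form are assumptions, not taken from the sources cited here.

```python
# Illustrative sketch of reference class forecasting (assumed names and form).
from statistics import mean

def reference_class_forecast(reference_outcomes, intuitive_estimate, reliability):
    """Correct an intuitive "inside view" estimate toward the reference-class mean.

    reference_outcomes: outcomes of comparable past projects (the reference class)
    intuitive_estimate: the team's intuitive estimate for the current project
    reliability: judged accuracy of the intuition, from 0 (pure guess) to 1 (fully reliable)
    """
    class_mean = mean(reference_outcomes)  # step 2: summarize the distribution of outcomes
    # Steps 3-5: anchor on the class mean, then adjust toward the intuition
    # in proportion to how reliable that intuition is judged to be.
    return class_mean + reliability * (intuitive_estimate - class_mean)

# Example: comparable projects took 10-16 months; the team intuits 8 months.
# A reliability of 0.4 pulls the optimistic estimate back toward the class mean.
print(reference_class_forecast([10, 12, 14, 16], 8, reliability=0.4))
```

The design choice here is deliberately conservative: an unreliable intuition (reliability near 0) is almost fully replaced by the reference-class average, which is the core idea of the outside view.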

  5. Confirmation Bias
    -What viable alternatives were included with the preferred recommendation?
    -At what stage in the decision analysis were alternatives discarded?
    -What efforts were undertaken to seek information to disconfirm the main assumptions and hypotheses?

-Request two additional alternatives to the main recommendation, including analysis of benefits and drawbacks
-Acknowledge unknowns, risks

  6. Availability Bias

    -If you had more time to gather data, what information would you seek? asked Harvard’s Max Bazerman.
    -How can you access similar data now?

-Use checklists to ensure comprehensive analysis of data required for each decision type

  7. Anchoring Bias
    -What data sources were used to analyze the decision?
    -Which data are estimates? By whom? From which data were the estimates extrapolated?
    -To what extent could there be:
  • Unsubstantiated numbers?
  • Extrapolation from non-equivalent previous situations?
  • Attraction to specific anchors?

-Present data from other sources, benchmarks, or models
-Request new analysis

  8. Halo Effect
    -To what extent does the analysis team expect that a person, organization, or approach previously successful in one context will be equally effective in a different situation?

-Question potentially inaccurate inferences
-Solicit additional comparable examples
-Question attributions of success and failure to leaders’ personalities instead of chance factors, advised IMD’s Phil Rosenzweig.

  9. Sunk-Cost Fallacy, Endowment Effect
    -To what extent are recommenders attached to past decisions?

-Disregard past expenditures when considering future costs and revenues

  10. Overconfidence, Planning Fallacy, Optimistic Biases, Competitor Neglect
    -To what extent is the base case unwarrantedly optimistic?

-Adopt an outside view by using relevant simulations or war games

  11. Disaster Neglect
    -To what extent is the worst case scenario realistically and sufficiently negative?
    -How was the worst case generated?
    -To what extent does the worst case consider competitors’ likely responses?
    -What other scenarios could occur?

-Conduct a premortem, suggested by Gary Klein of Applied Research Associates: imagine that the worst-case scenario has occurred, then propose likely causes and mitigations

  12. Loss Aversion
    -To what extent is the evaluation and decision team risk averse?

-Realign incentives to share responsibility for the risk or to reduce risk

  13. Planning Fallacy, the focus on only the current case while ignoring similar projects’ histories and statistical generalization from related cases
    -To what extent does the analysis rely on “top-down, outside-view” comparisons to similar projects?
    -Did the evaluators use a “bottom-up, inside-view” estimate of the time required for each step?

-Statistically analyze a broad range of similar cases to avoid overestimates from “top-down, outside-view” approaches and underestimates from “bottom-up, inside-view” analyses
-Differentiate accurate forecasts from ambitious targets

  14. Loss Aversion
    -To what extent are evaluators more concerned with avoiding loss than achieving gains?
    -How concerned are evaluators with being held responsible for a failed project?
    -To what extent has the organization specified acceptable risk levels?

-Seek risk tolerance guidelines from organizational leaders.
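The quantified screen described earlier reduces each proposal to two questions and four actions (Decide, Reach out, Stress-test, Reconsider). As a sketch only, with hypothetical function and parameter names not taken from the sources above, the rule looks like this:

```python
# Hypothetical sketch of the four-way decision screen (names are illustrative).
def screen_proposal(has_multiple_perspectives: bool, has_risk_analysis: bool) -> str:
    """Map checklist safeguards to one of four action steps.

    has_multiple_perspectives: safeguards against confirmation bias are in place
    has_risk_analysis: safeguards against overconfidence bias are in place
    """
    if has_multiple_perspectives and has_risk_analysis:
        return "Decide"       # both biases addressed
    if has_risk_analysis:
        return "Reach out"    # gather more perspectives to counter confirmation bias
    if has_multiple_perspectives:
        return "Stress-test"  # premortem or devil's advocate to counter overconfidence
    return "Reconsider"       # both safeguards are missing

print(screen_proposal(True, False))   # Stress-test
print(screen_proposal(False, False))  # Reconsider
```

In practice each question would be answered by scoring the checklist items above, but the two-boolean version captures the structure of the matrix.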

Decision-making tools like checklists can significantly reduce unconscious biases, provided that they are consistently and systematically applied.

-*What strategies have you found most helpful in reducing biases in decision-making?

Follow-share-like http://www.kathrynwelds.com and @kathrynwelds

Twitter: @kathrynwelds
Google+ | LinkedIn Group: Psychology in Human Resources (Organisational Psychology)
Blog: Kathryn Welds | Curated Research and Commentary

©Kathryn Welds



Reduce “Affective Forecasting” Errors with a Geographic Cure?

People must often make “affective predictions” about choices of life partner, occupation, and residence, yet nearly everyone makes small but systematic errors in forecasting personal emotional responses.

These misjudgments can negatively affect personal health, happiness, financial well-being, and interpersonal relationships.

University of British Columbia’s Kostadin Kushlev and Elizabeth Dunn identified these decision biases, and noted that one of the most well-known and widely-occurring affective forecasting errors is impact bias, the tendency to overestimate the intensity of emotional responses to future positive and negative events.

In addition, Kushlev and Dunn reported that people tend to overestimate the duration of future emotional reactions, labeled durability bias.

Also known as “focalism,” durability bias can occur when people rely on the “rational system” for information processing, according to Seymour Epstein of University of Massachusetts.

His Cognitive-Experiential Self Theory proposes that the “rational system” is used to make affective forecasts, and typically processes information slowly, analytically and abstractly.

In contrast, the “experiential system” of information processing operates rapidly, associatively, holistically, and concretely.

Shifts between rational (“cold”) and experiential (“hot”) decision systems can cause another bias, the “empathy gap.”

Epstein posits that rational-system processing can lead to imagining an event in isolation from the broader context that might mitigate its emotional impact.
In this situation, it is easy to focus on distinctive, observable characteristics, and to overvalue these due to their availability rather than their actual future impact.

Relying on the rational system may lead to another error, immune neglect, when people underestimate their likelihood of later reinterpreting future events to reduce negative feelings.

Epstein refers to this self-care process as the “psychological immune system” that enables recovery from negatively-tinged emotional events.
This is a more positive reinterpretation of Anna Freud’s focus on defense mechanisms.

Another predictive error, underestimating the power of future affective states, can occur when people don’t consider the impact of physical states like hunger and thirst.

Habit-control programs like Alcoholics Anonymous implicitly recognize the tendency to underestimate future emotional states by urging participants to “HALT” while they consider whether problematic urges stem from being “Hungry, Angry, Lonely, Tired.”

Sometimes forecasting errors are based on inaccurate theories about the determinants of happiness, such as being able to reverse decisions or having more choices.
In addition, people often overlook the influence of their own dispositions, such as optimism, in predicting future feelings.
The result of this error, “personality neglect,” can lead to overestimates of future happiness by people who score high on the personality characteristic “neuroticism.”

Despite people’s imperfect ability to predict future emotions, whether happy or unhappy, people who expect positive emotions in the future report greater present satisfaction, according to Wilfrid Laurier University’s Roger Buehler, Vassili Spyropoulos and Kent C. H. Lam with Cathy McFarland of Simon Fraser University.

Even if fueled by another thinking error, optimism bias, positive anticipation improves the present moment and may play a central role in each individual’s psychological immune system.

Biases and thinking errors in considering future emotional reactions can be minimized by:

  • Defocusing attention away from the anticipated emotional occurrence
  • Considering emotional outcomes in similar previous experiences
  • Anticipating consequences of other simultaneous future events
  • Previewing the future state with feedback from others

Similarly Stanford’s Chip Heath and Dan Heath of Duke Corporate Education suggest mitigating decision bias with WRAP:

  • Widen your options
  • Reality-check your options
  • Attain distance before deciding
  • Prepare to be wrong.

University of Alabama in Huntsville’s Kristin Weger and Sandra Carpenter demonstrated that “guided reflexivity,” or “structured reflection,” can improve performance on simulation game tasks over multiple trials.
They found that volunteers reduced errors in predicting future emotions by evaluating expectations in comparison to actual experience during a “post-mortem” session to review “lessons learned.”

Weger and Carpenter found that guided reflexivity increased individuals’ awareness of their own roles as well as others’ expertise and responsibilities in the target situation.

Other strategies to improve performance and decisions require even more commitment, such as living in a more “interdependent” culture, like Japan, for as little as a year.

This type of “geographic intervention” increases people’s consideration of contextual factors in decision making and enhances creativity.

-*Time to book a flight?

-*How accurate are you in predicting your feelings about a specific choice or situation in the future?

-*How do you detect and mitigate bias in predicting your future emotional reactions?

-*What positive and negative impacts have you observed in affective forecasting errors?


Memorable Business Stories: Ideas and Numbers

Chip Heath of Stanford and Dan Heath, Senior Fellow at Duke University’s CASE center, distill principles that make messages memorable in Made to Stick: Why Some Ideas Survive and Others Die.

Citing urban legends and advertisements as examples of tenaciously “sticky” messages, they argue that unforgettable ideas can be recalled with the acronym SUCCESs:

  • Simplicity
  • Unexpectedness
  • Concreteness, with many details that act as “hooks” to “stick” to memory’s many “loops” (the Velcro theory of memory)
  • Credibility
  • Emotion
  • Stories.

The Heaths’ principle of credibility draws on the three elements of persuasive messages outlined by Robert Cialdini in his best-selling Influence: The Psychology of Persuasion.

Credibility is enhanced by liking, authority, and social proof in Cialdini’s model:

  • Liking – Appealing public figures or personal friends endorse the message
  • Authority – Well-respected role model or respected authority provides testimonial
  • Social proof – Others like me endorse it, and others provide a justification (“because…”), though the actual reason may be immaterial
  • Reciprocity – “I know you’d do the same for me,” recommended by Guy Kawasaki to convey that “You owe me…”
  • Scarcity – “While supplies last…”, “Limited time offer!”, “Act now, don’t wait!”
  • Commitment, consistency – Draws on people’s desire to appear consistent, and even trustworthy by following through on commitments: “I do what I say I will do…”
  • Contrast principle – Salespeople sell the most expensive item first so related items seem inexpensive by comparison: real estate transaction fees may appear minimal in contrast to the large investment in a house.

Both memorable messages and persuasive messages take advantage of habitual reactions to typical situations.

These automated and sometimes unconscious processes are a heuristic to help people to deal rapidly and efficiently with routine activities and tasks.
However, “auto-pilot” reactions  may lead to being persuaded to act in ways that might not be helpful, such as excessive eating, drinking, spending, or engaging in risky activities.

In Contagious, Wharton’s Jonah Berger formulated the acronym STEPPS to describe narrative elements that increase the likelihood that a story, idea, or product will spread like a contagious virus:

  • Social Currency – Passing along the information makes the sender appear “good” – knowledgeable, helpful, or otherwise admirable
  • Triggers – The message evokes a familiar, frequent situation
  • Emotion – The story evokes emotion, strengthening the emotional connection between sender and receiver
  • Public – The idea is visible and observable, so others can easily imitate and spread it
  • Practical Value – The sender provides actionable value in sharing the message
  • Stories –  Memorable, surprising elements increase the likelihood that others will convey the message

Finance executive Randall Bolten draws on similar observations about human cognitive and perceptual processing to recommend ways to tell a memorable and motivating quantitative story.

His Painting with Numbers: Presenting Financials and Other Numbers So People Will Understand You discusses “quantation” as another type of business storytelling that affects “personal brand image.”

Even more practical than Edward Tufte’s breathtaking examples of effective “information architecture” in The Visual Display of Quantitative Information and Envisioning Information, Bolten provides coaching on designing memorable, persuasive presentations and “pitches” featuring quantitative information as “proof points.”

His book demonstrates the Heaths’ principles of simplicity, concreteness, and credibility while drawing on Cialdini’s proven approaches of authority, commitment, consistency, and contrast.

-*What principles do you use to tell stories that motivate others to act as you hope?

©Kathryn Welds