
Performance Excellence linked to Preventing Failures, Corrective Coaching

Atul Gawande

Simple behavior changes, such as following a structured checklist, can prevent medical care mistakes and increase care quality, found Harvard’s Atul Gawande.

People who recognized fallibility in organizational processes, and took proactive steps to remedy these shortcomings, more effectively improved their performance.

Three elements of better performance can be applied to fields outside of medicine:

  • Diligence – Attending to details can prevent errors and overcome obstacles.
    Gawande’s The Checklist Manifesto: How to Get Things Right suggests ways to structure these memory aids,
  • Doing Right – Ensuring that skill, will, and incentives are aligned to drive excellent performance,
  • Ingenuity – Deliberately monitoring potential failures and continuously seeking innovative ways to improve performance.

These elements can be improved with attentive observation and feedback to prevent errors of omission when people don’t:

  • Know enough (ignorance),
  • Make proper use of what they know (ineptitude).

Ignorance occurs less frequently than ineptitude because relevant information is widely available, Gawande noted.
He argued that both types of omission errors can be reduced by systematic analysis and disciplined use of tools like checklists.

Geoffrey Smart

Checklist-based analysis was also linked to Internal Rate of Return (IRR) in Geoffrey Smart’s study of investments by Venture Capital (VC) firms.

He found a correlation between IRR and leadership effectiveness in new investment ventures.
Since selecting capable leaders is critical to business outcomes, Smart also evaluated VC firms’ typical approach to assessing potential leaders:

  • The Art Critic is the most frequently-used approach in which the VC assesses leadership talent at a glance, intuitively, based on extensive experience,
  • The Sponge conducts extensive due diligence, researching and assimilating information, then decides based on intuition,
  • The Prosecutor interrogates the candidate, tests with challenging questions and hypothetical situations,
  • The Suitor woos the candidate to accept the leadership role instead of analyzing capabilities and fit,
  • The Terminator eliminates the evaluation because the venture firm replaces the company’s originators,
  • The Infiltrator becomes a “participant-observer” in an immersive, time-consuming experientially-based assessment,
  • The Airline Captain uses a formal checklist to prevent past mistakes.
    This last approach was linked to the highest average IRR for the new ventures.
    In addition, this strategy was significantly less likely to result in later terminating senior managers.

Venture Capitalists in his studies reported that two of their most significant mistakes were:

  • Investing insufficient time in talent analysis,
  • Being influenced by the “halo effect” when evaluating candidates.

Systematic reminders to execute all elements required for expert performance can prevent failure and signal potential failure points.

-*How do you improve performance?
-*What value do you find in expert coaching?

Related Post:
Developing a SMARTER Mindset for Resilience, Emotional Intelligence – Part 2

Twitter: @kathrynwelds
Facebook Notes:

©Kathryn Welds

 


Debiasing Decisions: Combat Confirmation Bias, Overconfidence Bias

Philip Meißner

Cognitive and behavioral biases can contribute to “blind spots” in decision-making, leading to less effective outcomes.
To improve decision outcomes, University of Marburg’s Philip Meißner, Torsten Wulf of HHL Leipzig Graduate School of Management, and HEC’s Olivier Sibony proposed a systematic checklist to identify potential decision derailment due to bias, along with rapid remedies.

Torsten Wulf

They argued that two types of bias contribute to most decisions that lead to undesirable results:

  • Confirmation bias, the unconscious tendency to believe new information that is consistent with existing beliefs and recent experiences, and to discount contradictory data,
  • Overconfidence bias, the unconscious tendency to overestimate one’s skills, insights, and judgment.
    This leads to increased risk-taking based on illusory certainty about the decision and about one’s ability to mitigate adverse outcomes.
Olivier Sibony

Previously, Lovallo and Sibony articulated four related decision biases:

  • Pattern-recognition biases, countered by changing the “angle of vision,”
  • Action-oriented biases, mitigated by recognizing uncertainty,
  • Interest biases, minimized by explicitly highlighting these interests,
  • Social biases, reduced by depersonalizing debate.

Debiasing techniques such as checklists can limit the negative effects of biases in decision-making by offering a disciplined, comprehensive analysis of downside risks and by systematically considering multiple viewpoints.

Atul Gawande

However, effectively implementing checklists requires consistent discipline, noted Harvard’s Atul Gawande, who cited examples of partial adherence leading to costly oversights and failures.

One approach, suggested by Princeton’s Daniel Kahneman and Gary Klein of Applied Research Associates, is a “premortem.”
Decision makers imagine that the decision has failed and analyze sources and reasons for adverse outcomes, to more thoroughly assess points of failure and possible mitigation strategies.
Formal scenario-planning is another way to expose assumptions underlying a plan, as well as a competitor’s priorities and potential strategy.

Massimo Garbuio

Using a variety of debiasing techniques significantly increased Return on Investment (ROI) in a study by University of Sydney’s Massimo Garbuio and Dan Lovallo, and HEC’s Olivier Sibony.
Based on these findings, Michael Birshan, Ishaan Nangia, and Felix Wenger of McKinsey argued that debiasing techniques should be embedded in formal organizational decision-making processes, particularly for high-impact, repetitive decisions.

Michael Birshan

Because decision biases may be out of awareness, it is more effective to evaluate the process used to develop a proposal rather than focusing only on the proposal’s content and merits.

Decision-making safeguards can be built into standard analysis processes by including questions to expose:

  • Multiple data sources,
  • Diverse opinions and perspectives,
  • Downside risk,
  • Potential negative outcomes for company, industry, and broader ecosystem.
Daniel Kahneman

Proposals are considered ready for a decision only when multiple perspectives are available to mitigate confirmation bias and risk analysis is available to reduce overconfidence bias.
Responses to decision checklist questions can be quantified to indicate one of four action steps, according to Daniel Kahneman:

  • Decide, when robust safeguards against both confirmation bias and overconfidence bias are in place,
  • Reach out, when additional perspectives and opinions are needed to prevent narrow assumptions and reduce confirmation bias.
    The Vanishing-Options Test, proposed by Stanford’s Chip Heath and Duke University’s Dan Heath, can generate new ideas by imagining that none of the current proposals is available,
  • Stress-test, by conducting a premortem or an analysis by an external devil’s advocate or provocateur to reduce overconfidence bias,
  • Reconsider, when both more perspectives and deeper risk analysis are required to reduce both confirmation bias and overconfidence bias.
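These four action steps amount to a two-by-two matrix over Kahneman’s two safeguards. A minimal sketch of that mapping (the function and parameter names below are illustrative assumptions, not from the post):

```python
def screening_action(has_multiple_perspectives: bool, has_risk_analysis: bool) -> str:
    """Map the presence of the two safeguards to one of four action steps.

    Multiple perspectives guard against confirmation bias;
    risk analysis guards against overconfidence bias.
    Illustrative sketch only -- the names here are hypothetical.
    """
    if has_multiple_perspectives and has_risk_analysis:
        return "Decide"        # both safeguards present
    if has_risk_analysis:
        return "Reach out"     # gather more perspectives to reduce confirmation bias
    if has_multiple_perspectives:
        return "Stress-test"   # premortem / devil's advocate to reduce overconfidence bias
    return "Reconsider"        # both safeguards missing
```

For example, a proposal backed by a thorough risk analysis but drafted by a single like-minded team would score as “Reach out.”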
This screening matrix helps reduce related decision-making biases:
  1. Self-interest Bias
    -To what extent is the proposal motivated by self-interest?
Ishaan Nangia

Recommendation
-Assess for over-optimism

  2. Affect Heuristic
    -How strong is the team’s emotional attachment to a specific proposal?
    -To what extent were risks and costs fully considered for both preferred and non-preferred options?

Recommendations
-Assess for strongly-preferred outcomes
-Reintroduce analysis of all options

  3. Groupthink
    -How many dissenting opinions were analyzed?
    -How adequately were all options explored?
    -Was dissent discouraged? 
Felix Wenger

Recommendations
-Encourage substantive disagreements as a valuable part of the decision process
-Solicit dissenting views from members of the recommending team, through private meetings

  4. Saliency Bias
    -To what extent are decisions made based on a potentially incomparable, but memorable, success?
    -What about the proposed analogy is comparable to the current situation?
    -What are relevant examples from less successful companies? What happened in those cases?

Recommendations
-Carefully scrutinize analogies’ similarity to the current decision situation
-Solicit additional analogies using reference class forecasting:

  1. Select a reference class,
  2. Assess the distribution of outcomes,
  3. Intuitively estimate the project’s position in the distribution,
  4. Assess the estimate’s reliability,
  5. Correct the intuitive estimate.
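As a rough numeric illustration of these five steps, an intuitive estimate can be corrected by shrinking it toward the reference class average in proportion to its assessed reliability. The linear blend and all names below are illustrative assumptions, not a formula from the post:

```python
import statistics

def reference_class_forecast(reference_outcomes, intuitive_estimate, reliability):
    """Correct an intuitive estimate using a reference class of past outcomes.

    reliability: 0.0 (pure guess) to 1.0 (fully reliable intuition).
    Illustrative sketch only -- names and weighting are hypothetical.
    """
    class_mean = statistics.mean(reference_outcomes)  # step 2: outcome distribution
    # steps 4-5: weight intuition by its reliability, the base rate by the rest
    return reliability * intuitive_estimate + (1 - reliability) * class_mean

# Example: similar past projects took 10-16 months; the team intuits 8 months,
# and judges that intuition only 40% reliable.
past_durations = [10, 12, 14, 16, 13]
corrected = reference_class_forecast(past_durations, 8, 0.4)  # pulled toward ~13
```

The more unreliable the intuitive estimate, the more the correction defers to the statistical history of comparable projects.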

  5. Confirmation Bias
    -What viable alternatives were included with the preferred recommendation?
    -At what stage in the decision analysis were alternatives discarded?
    -What efforts were undertaken to seek information to disconfirm the main assumptions and hypotheses?

Recommendations
-Request two additional alternatives to the main recommendation, including analysis of benefits and drawbacks
-Acknowledge unknowns, risks

  6. Availability Bias

    Max Bazerman

    -“If you had more time to gather data, what information would you seek?” asked Harvard’s Max Bazerman.
    -How can you access similar data now?

Recommendation
-Use checklists to ensure comprehensive analysis of data required for each decision type

  7. Anchoring Bias
    -What data sources are used to analyze the decision?
    -Which data are estimates, and by whom? From which data were the estimates extrapolated?
    -To what extent could there be:
  • Unsubstantiated numbers?
  • Extrapolation from non-equivalent previous situations?
  • Attraction to specific anchors?

Recommendations
-Present data from other sources, benchmarks, or models
-Request new analysis

  8. Halo Effect
    -To what extent does the analysis team expect that a person, organization, or approach previously successful in one context will be equally effective in a different situation?

Phil Rosenzweig

Recommendations
-Question potentially inaccurate inferences
-Solicit additional comparable examples
-Question attributions of success and failure to leaders’ personalities instead of chance factors, advised IMD’s Phil Rosenzweig.

  9. Sunk-Cost Fallacy, Endowment Effect
    -To what extent are recommenders attached to past decisions?

Recommendation
-Disregard past expenditures when considering future costs and revenues

  10. Overconfidence, Planning Fallacy, Optimistic Biases, Competitor Neglect
    -To what extent is the comparison case unwarrantedly optimistic?

Recommendation
-Adopt an outside view by using relevant simulations or war games

  11. Disaster Neglect
    -To what extent is the worst case scenario realistically and sufficiently negative?
    -How was the worst case generated?
    -To what extent does the worst case consider competitors’ likely responses?
    -What other scenarios could occur?

Recommendation
-Conduct a premortem, suggested by Gary Klein of Applied Research Associates: imagine the worst-case scenario has occurred, then propose likely causes and mitigations

  12. Loss Aversion
    -To what extent is the evaluation and decision team risk averse?

Recommendation
-Realign incentives to share responsibility for the risk or to reduce risk

  13. Planning Fallacy – focusing only on the current case while ignoring similar projects’ history and statistical generalization from related cases.
    -To what extent does the analysis rely on “top-down, outside-view” comparisons to similar projects?
    -Did the evaluators use a “bottom-up, inside-view” to estimate time required for each step?

Recommendation
-Statistically analyze a broad range of similar cases to avoid over-estimates from “top-down, outside-view” approaches and underestimates from “bottom-up, inside-view”
-Differentiate accurate forecasts from ambitious targets

  14. Loss Aversion
    -To what extent are evaluators more concerned with avoiding loss than achieving gains?
    -How concerned are evaluators with being held responsible for a failed project?
    -To what extent has the organization specified acceptable risk levels?

Recommendation
-Seek risk tolerance guidelines from organizational leaders.

Decision-making tools like checklists can significantly reduce unconscious biases, provided that they are consistently and systematically applied.

-*What strategies have you found most helpful in reducing biases in decision-making?

Follow-share-like http://www.kathrynwelds.com and @kathrynwelds

Twitter:  @kathrynwelds
Google+ | LinkedIn Groups: Psychology in Human Resources (Organisational Psychology)
Facebook
Blog:  Kathryn Welds | Curated Research and Commentary

©Kathryn Welds