How to think straight: Cognitive debiasing, by Pat Croskerry
The number of preventable deaths of hospitalized patients in the US each year is estimated at 40,000-80,000. The figure for the ICU alone is estimated at 40,000, so the true total is likely at the higher end of that range. When settings outside the hospital (the ED, primary care) are taken into account, the overall number must be considerably higher.
While many factors contribute to diagnostic failure, a variety of sources suggest that physicians' thinking has a lot to do with it. Dual Process Theory describes how the brain makes decisions in one of two modes: through fast, unconscious, intuitive processes (System 1) or through slower, conscious, analytical processes (System 2). Mental shortcuts (heuristics) and biases are predominantly located in the intuitive mode, where we spend most of our conscious time, and this is where the majority of decision failures occur. Thinking straight essentially means achieving a good balance between System 1 and System 2 decision making, and much of our cognitive effort needs to go into monitoring what our unconscious brains are doing in System 1. This monitoring is referred to by a variety of terms: metacognition, reflection, mindfulness, and others. They all involve cognitive decoupling from System 1, and they characterize the process of cognitive debiasing. This is not easily accomplished in the ED or in any environment where decision density is high, throughput pressure exists, resources may be limited, and decision makers may be fatigued and/or sleep deprived.
While medicine has acquired a variety of strategies over the years for debiasing clinicians, added benefits can be obtained by developing specific mindware to tackle particular biases. Clinicians need to be aware of the operating characteristics of the dual process model of decision making, of the prevalence and nature of biases, and of how to apply and sustain debiasing mindware in their decision making.
1. How to think straight: Cognitive debiasing
Pat Croskerry MD, PhD
SMACC Chicago, June 23-26, 2015
5. Diagnostic errors in the ICU
Winters et al., BMJ Q&S 2012
• Review of 31 studies spanning 1966-2011
• 5863 autopsies
• 28% had at least one missed diagnosis
• 8% caused or significantly contributed to death
• 4 diagnoses accounted for 75% of all misses
(MI, PE, Pneumonia, Aspergillosis)
• 40,000 ICU patients in the US die annually from misdiagnosis
6. Mostly, it's not what we don't know, it's how we think.
We need to know more about how we think…
8. ‘Cognitive thought is the tip of an enormous iceberg. It is the rule of thumb among cognitive scientists that unconscious thought is 95% of all thought – this 95% below the surface of conscious awareness shapes and structures all conscious thought’
Lakoff and Johnson, 1999
9. Medical Intuitions
• Fast
• Compelling
• Frequent
• Minimal cognitive effort required
• Addictive
• Mostly serve us well
• Occasionally catastrophic
10. Analytic failures
• Sound reasoning and problem solving may be based on incorrect assumptions
• Overarching biases may influence the reasoning process
• Incomplete information
11. Major Organizations that Influence the Quality of Evidence Informing Healthcare
Seshia et al., JECP 2014
13. The ED is a difficult decision-making environment
• High decision density
• Decision fatigue
• Throughput pressures
• Wide range of illnesses
• Diagnostic uncertainty
• Narrow time windows
• Interruptions and distractions
• Shift work/sleep disruption
• Shift changes
14. Diagnostic error in the ED
• Radiology: 5%
• Missed injuries: 12%
• Cardiovascular: 19%
• Respiratory: 30%
• Overall: ~16%
16. Main Features of the Model
• We spend most of our time in System 1
• Most heuristics and biases are in System 1
• Most errors occur in System 1
• Repetitive operations of System 2 >>> System 1
• System 2 override of System 1
• System 1 override of System 2
• Toggle function
• Cognitive Miser function (being comfortably numb)
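The override and miser features above can be sketched as a toy model. This is an illustrative sketch only, not anything from the talk: the `Clinician` class, the `vigilance`/`fatigue` fields, and the threshold rule are all invented here to make the model's moving parts concrete. System 1 is the default; a System 2 override fires only when metacognitive vigilance outweighs both the case's ambiguity and the cognitive-miser pull of fatigue.

```python
# Toy sketch of the dual-process (DPT) decision model described above.
# All names and thresholds are illustrative assumptions, not values
# from Croskerry's talk.
from dataclasses import dataclass

@dataclass
class Clinician:
    vigilance: float  # 0..1, capacity for metacognitive monitoring
    fatigue: float    # 0..1, cognitive-miser pressure toward System 1

    def decide(self, case_ambiguity: float) -> str:
        """Return which system produces the final decision for a case.

        System 1 is the default mode. A System 2 override fires only
        when monitored vigilance exceeds both the case's ambiguity and
        the miser pressure from fatigue (an illustrative rule).
        """
        if self.vigilance > case_ambiguity and self.vigilance > self.fatigue:
            return "System 2 override"
        return "System 1 default"

rested = Clinician(vigilance=0.8, fatigue=0.2)
tired = Clinician(vigilance=0.8, fatigue=0.9)
print(rested.decide(case_ambiguity=0.5))  # vigilant and rested: analytic override
print(tired.decide(case_ambiguity=0.5))   # same vigilance, but the miser wins
```

The point the sketch makes is the one slide 17 makes: the same clinician, with the same knowledge and the same case, toggles back to System 1 when fatigue rises.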
17. When does the Cognitive Miser kick in?
• All the time
• Especially when the decision maker is fatigued
• Especially when sleep deprived
• Especially when feeling down/depressed
• Especially when resources are limited
• Especially during overcrowding
18. So how do we become better decision makers?
23. What decision making needs
• Increased awareness of the importance of decision making
• Know the operating characteristics of the DPT model
• Teach the main cognitive and affective biases
• Promote metacognition, mindfulness, reflection
• Promote critical thinking
• Raise awareness of conditions which may compromise decision making (fatigue, sleep deprivation, cognitive overload)
• Teach cognitive debiasing
28. We need to maintain a feral vigilance to detect biases
29. It ain’t easy
Even though a bias is detected:
• Very unlikely one strategy works for all; need for multiple approaches
• Very unlikely one shot will work; need for multiple inoculations
• Need for extra vigilance in critical conditions
• Need for lifelong maintenance
30. The emerging cognitive debiasing literature in medicine