To Be A Psychiatrist Is To Be Uncertain: Cognitive Error and Kindness

“Hello babies. Welcome to Earth. It’s hot in the summer and cold in the winter. It’s round and wet and crowded. On the outside, babies, you’ve got a hundred years here. There’s only one rule that I know of, babies: God damn it, you’ve got to be kind.” ― Kurt Vonnegut

Jerome Groopman wrote a piece in 2007 in The New Yorker called ‘What’s the Trouble?’, which is one of the more influential texts in my life, and which I’m often sorry I ever read.

If you want to remain free of anxiety about your potential failings as a doctor, or about the potential failings of your doctor, maybe don’t read ‘What’s the Trouble?’.

Groopman, who is a haematologist, Harvard professor, author of more than two hundred scientific papers, and a New Yorker staffer (he has more hours in his day than I have), introduced me in ‘What’s the Trouble?’ to the field of cognitive error in medicine.

This is the study of medical mistakes we make by the very act of thinking.

As Daniel Kahneman, cognitive psychology pioneer, put it, people think fast and slow.

Slow thinking in medical decision-making involves accumulation of all necessary evidence, painstaking appraisal of that evidence, and demonstrably disinterested decision-making. This is what college trains us to do. Adherence to this model demands essentially unlimited time and resources and a degree of objectivity that no-one possesses.

Fast thinking is making decisions based on pattern recognition and gut instinct; this is what, as we mature clinically, we mostly do. Fast thinking is good. Efficiency is important; speed is not laziness.

Whenever we have limited time (which is always) and insufficient or ambiguous information (close to always) we rely, said Kahneman, on cognitive shortcuts, rules of thumb, called heuristics.

We are right to rely on heuristics, said Groopman. “Heuristics are indispensable in medicine; physicians, particularly in emergency rooms, must often make quick judgments about how to treat a patient, on the basis of a few, potentially serious symptoms.” And they are fine, until they’re not.

When cognitive shortcuts let us down, cognitive error happens, and we risk mistakes. The list of cognitive errors, outlined by Dr Pat Croskerry in The Importance of Cognitive Errors in Diagnosis and Strategies to Minimize Them, is long. Most cognitive errors are intuitive, even if, by their very nature, irrational.

Anchoring is the tendency to rely too much on information obtained early in the decision-making process, even when it’s not the most pertinent piece of information, or when it’s later shown to be irrelevant. If in a phone call about a paranoid patient a colleague mentions a history of cocaine use, it’s going to be hard not to rank cocaine-induced psychosis high up the list of likely diagnoses, even if the patient insists he hasn’t had any cocaine in a year. (There’s a well-known example of anchoring involving Mahatma Gandhi, not cocaine: people first asked whether Gandhi died before or after the age of 140 go on to estimate a much older age at death than people first asked whether he died before or after the age of 9.)

Confirmation bias is the tendency to seek out information that confirms your initial provisional diagnosis and to dismiss information that challenges that diagnosis. You should seek out information that challenges your initial diagnosis; if you’re right, you won’t be able to prove yourself wrong. In the previous example, a biased clinician might doubt the history of abstinence rather than the diagnosis of drug-induced psychosis; a negative drug test would be read to mean the patient used cocaine long enough ago that it’s out of his system, not, as he insists, that he’s clean.

Commission bias is the decision to act (e.g. to treat with an antipsychotic) because you are a doctor and you are supposed to help, even if your actions are not all that helpful (e.g. you might not know exactly what you are treating, and antipsychotics are serious drugs). “Don’t just stand there—Do something!” applies here.

The sunk cost fallacy is the failure to back out of an initial diagnosis because of the feeling that you have committed to this diagnosis—regardless of whether it is right. This must be bipolar II disorder because I said it was in a letter to a GP a few weeks ago.

I have honest to God caught myself thinking this way.

It looks so stupid when you write it down like that.

And there are cognitive errors that doctors make because we are influenced by our emotions much more than we like to admit—influenced, often, by the way we feel about our patients. We make bad decisions because we care about our patients too much (the affective error) and because, less comfortably, we care too little.

Making the affective error, we may act as if a clinical situation were how we wish it to be rather than how it is. If a patient looks like he has a psychosis, but we’re not 100% sure, and if receiving that diagnosis may have adverse consequences for the patient, we may hold off on making the diagnosis for longer than we should, because we don’t want to upset our patient. (Magical thinking is common in cognitive error.)

Conversely, if a patient is difficult or self-destructive, or engaging in behaviour that is not helping us to help him, we are prone to the fundamental attribution error: “the tendency to be judgmental and blame patients for their illnesses rather than examine the circumstances that might have been responsible”. Then, we need to be careful that we don’t let ourselves off the hook of helping that patient—he’s not helping himself, why should I?

In ‘What’s the Trouble?’ and The Importance of Cognitive Errors in Diagnosis and Strategies to Minimize Them, Jerome Groopman and Pat Croskerry gave some great advice.

They said, primarily, to use the patient. Don’t just trust yourself. Invite hard questions from your patients and their families. Insist, politely, that they test you. Ask them to ask you why you’re doing what you’re doing – how do you know you’re making the right decision?

Put yourself through the effort of considering three or four diagnoses even when you are sure you know what the diagnosis is—you can be sure and still be wrong. Routinely ask the question: What else might this be?

On commission bias: Don’t just do something – stand there!

Croskerry also advised clinicians to develop insight into their own thinking style and be aware of the potential for error. I did this. I internalised Croskerry and wrote about cognitive error in psychiatry, and worried myself, and taught it to freaked-out trainees.

A link to the BJPsych piece, 15 years old but not yet out of date, is here.

I figured that, in psychiatry, the likelihood of error related to complexity and ambiguity was high; to be a psychiatrist is to be uncertain. Psychiatric diagnostic systems depend on subjective assessments of ambiguous information about inner experience. How much anhedonia is enough anhedonia to count towards a diagnosis of depression? How bad should concentration be to be “poor concentration”? How firmly held must a belief be to count as a delusion, and so to justify consideration of an antipsychotic? The potential for bias is huge, as is, accordingly, the potential to be wrong.

So I’ve always found cognitive error compelling and unnerving.

I’m a busy clinician. I have the guts of 300 patients on my books at any one time. I make a lot of decisions; some without much time to make them; some with potentially serious consequences. All you have in your clinical practice is your ability to think through a decision. If you can’t trust your thinking, where are you? Knowing how wrong you can be is unmooring. You can easily run into crises of confidence when you are hyper-aware of your fallibility.

So how do we know we are making the right decisions?

We have to trust ourselves, up to a point, and trust those around us. We have to ask our teams how we are doing, and I do this to a degree that I worry might annoy my team. (“Oh Jesus get on with it.”) The occupational therapists, social worker, psychologists, and nurses on my team are well able to tell me when they think I’m wrong. I rely hugely on that.

I ask my trainees to tell me what they think of the plan I just put together. Some look perplexed – why are you asking me? – and some get it. Consultants have the final call—we are the people on the team who need to be OK with every decision anyone makes—but we need help. In avoiding overconfidence you have to have the confidence to ask for honest feedback, and hope that it is forthcoming.

And you rely on your patients as much as you rely on your colleagues. You have to trust your patients and their families to tell you what they really think and to co-design their care with you. My favourite words to write in a chart (after “Mood subjectively grand”) are “We agreed the following”. There’s something so reassuring about that.

The more I go on the more I’ve realised that external assurances aren’t enough. You need a solid underlying principle if you want to be confident that you are doing the right thing by your patients. You need bedrock. This probably seems obvious. But I don’t remember being trained to have such a principle in college, or in basic or senior training.

And it’s back to Kurt Vonnegut, and I think it’s kindness. Kindness is not talked about too much in medicine or psychiatry right now. It’s a little paternalistic and probably embarrassing. But kindness, I think, anchors you. How far wrong can you really go? If you are a clinician focused on being kind you are not focused on your own ego. You won’t make the mistakes we make for not allowing ourselves to be wrong. You were wrong. So what? Get over yourself. As Neal Maskrey wrote in the BMJ in 2013 in a beautiful blog post, The Importance of Kindness, “it isn’t about you… it’s about them”.

Aspiring to kindness also sounds self-righteous. But you can aspire to be kind and know that you won’t, obviously, manage it. Maskrey quoted the late American novelist David Foster Wallace: “It’s hard, it takes will and mental effort, and if you’re like me some days you won’t be able to do it, or you just flat-out won’t want to.”

A practical approach could work as follows. If you’re practising medicine, or psychiatry, and you feel like being kind or compassionate, go for it. It’s easy when you’re feeling it. And if you’re practising medicine, or psychiatry, and you don’t feel compassionate – do it anyway. Behave as if you were feeling compassionate. Make the effort. You know what you’d do if you were feeling compassionate – so do it. Wallace knew that kindness doesn’t always come easy, but, still, he said, “Most days, if you give yourself a choice, you can choose to look differently”. Maskrey replied: “I’m trying, David, every day I’m trying”.

One response to “To Be A Psychiatrist Is To Be Uncertain: Cognitive Error and Kindness”

  Brendan Kelly

    Great post, Niall. ‘So how do we know we are making the right decisions?’ I suppose we never know if many decisions are right or wrong (as we do not know alternative outcomes), and so we focus on process: Was the process leading to the decision good? Did I neglect certain information or opinions? A functioning team is the best assurance of all: a variety of opinions expressed, discussed, and then a decision taken. And patients generally know best! With regard to the biases, the key issue is surely: Is awareness enough? If I am aware of the biases, does that, in itself, lead me to correct them? Or do I over-compensate? Over-think to the point of paralysis? Would educational interventions help? There is research work to be done on all of this, but a lucid understanding of specific biases (such as you provide) is certainly the bedrock. Thanks.
