ECT and “unwilling” in Ireland: the principle and practice

On 8th December, the word “unwilling” was deleted from section 59 of Ireland’s Mental Health Act.

Section 59 of the Act deals with the administration of electro-convulsive therapy (ECT) without informed consent to involuntary patients in approved centres (psychiatric units). The amendments mandating this change, and an analogous change to section 60, which governs the administration of medication without consent to involuntary patients, are here.

The Mental Health Act up to 8th December allowed for administration of ECT to people who were “unable” or “unwilling” to consent, or both. The decision as to whether a patient was “unable” or “unwilling” was formally the decision of the treating consultant psychiatrist along with one other independent consultant psychiatrist. (“Unwilling” is now gone, but the decision around inability to consent remains with two consultants.)

The removal of “unwilling” from the Act was long flagged. The College of Psychiatrists of Ireland has been advocating for the change since 2010. (The College has only existed since 2009.)

The change to section 59 is an important change, in principle. In practice, I would argue, it will make very little difference.

To be “unwilling” to consent, under the Act, implied that the person had capacity to give informed consent – which means that he or she was able to refuse informed consent. A person who received ECT or medication without consent but was documented to be “unwilling” to consent, rather than “unable”, was therefore considered capable of giving or refusing consent. It never made sense for this provision to be in the Act, and I find it hard to imagine a set of circumstances in which I would give a treatment to a patient who refused it, if they were able to weigh up the pros and cons of that treatment.

To be “unable” to consent to a treatment – ECT, medication, or any other – means to lack capacity, which is “the ability to use and understand information to make a decision, and communicate any decision made.”

Capacity is judged on a decision-by-decision basis, so that one might have capacity to consent to one treatment and not another. Ireland’s capacity legislation dates from 1871, but is finally about to be updated – a Capacity Bill from 2013 is nearing the end of its legislative journey.

A person who is “unable” to consent to ECT – “unable” has stayed in the Act, by necessity – is, generally, too sick to make treatment decisions. I’ve written about illness depriving people of capacity here.

The only way that the loss of the word “unwilling” in Section 59 would make a substantial difference in practice is if, in recent years, the number of apparently capacitous people being treated with ECT without consent was large. It was small. (It was not zero.)

The most recent figures on ECT from the Mental Health Commission, which has oversight of all activity in approved centres, are from 2013. They were published on November 12th 2015 and widely reported.

The MHC report records each course of ECT without consent and the assessment of each psychiatrist, documented in a tick box on the MHC’s Form 16, as to whether the patient was unable or unwilling to consent. (The forms are yet to be updated.)

In 2013, 46 people received ECT without consent. In 39 cases, both psychiatrists ticked “unable”. In six cases, one psychiatrist ticked “unable” and the other ticked “unwilling”. In one case, both psychiatrists ticked “unwilling”. The analogous numbers for 2012 were 26, 3, and 1; for 2011, 20, 4, and 3; for 2010, 29, 6, and 5; and for 2009, 38, 5, and 11. (I hadn’t gone back as far as the 2009 figure before now, and it’s higher than I would have thought it would be.)
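For anyone who wants to replay the arithmetic, the reported figures tabulate as follows. This is a minimal sketch: the year-by-year triples are the totals, split assessments, and “both unwilling” counts cited above from the MHC reports, and the “both unable” count is simply derived by subtraction.

```python
# MHC figures on ECT without consent, per Form 16 tick-boxes
# (two consultant psychiatrists each tick "unable" or "unwilling").
figures = {
    # year: (total courses, one "unable"/one "unwilling", both "unwilling")
    2009: (38, 5, 11),
    2010: (29, 6, 5),
    2011: (20, 4, 3),
    2012: (26, 3, 1),
    2013: (46, 6, 1),
}

for year, (total, split, both_unwilling) in sorted(figures.items()):
    # Both-"unable" cases are whatever remains after the other two categories.
    both_unable = total - split - both_unwilling
    print(f"{year}: {total} total; both unable: {both_unable}; "
          f"split: {split}; both unwilling: {both_unwilling}")
```

Running this makes the trend discussed below easy to see at a glance: the “both unwilling” column falls from 11 in 2009 to 1 in 2013.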

What these figures show is that the number of people receiving ECT without consent went down, then back up, over the five cited years, and that the number of people receiving ECT under “unwilling” – i.e. those who the Form 16s would indicate had preserved capacity, but had ECT without consent anyway – dramatically reduced, from 11 in a year to one in a year. We don’t have figures for 2014 or 2015, so we don’t yet have a baseline against which to judge any change in the figures that ensues from last night’s legislative change.

There is a question here: why did anyone receive this treatment without consent when they were deemed, one would infer, to have capacity to decline the treatment? I just don’t know. It’s a basic principle not just of psychiatry but of all medicine that if you have capacity, and you say no, your no is respected. It’s how we work day in day out. There’s no reason to insist on a treatment that a capacitous person is refusing – not just in principle, but in practice. Psychiatrists are used to thinking that a treatment might help, and to being told no.

I have my suspicion, which is that psychiatrists ticked “unwilling” to mean both unable and unwilling – as in, we wouldn’t even be looking at this form if the patient were “able” to consent. I doubt anyone who received ECT without consent in Ireland in the last several years had capacity to consent. But I don’t know for sure, and I’m not sure we’ll ever know. From now on, we’ll know.

People are going to need treatment with ECT without consent as long as severe depression exists. Those people will receive that treatment without consent because they are unable to consent – they may think all hope is lost when it’s not; they may think we are trying to kill them; they may think they are already dead. Small numbers of people will need this treatment. It is a very good thing that “unwilling” is gone. The number of people receiving treatment with ECT under “unwilling” is now, as it should be, zero. The number of people we have to treat who are too sick to consent will never be zero.


To Be A Psychiatrist Is To Be Uncertain: Cognitive Error and Kindness

“Hello babies. Welcome to Earth. It’s hot in the summer and cold in the winter. It’s round and wet and crowded. On the outside, babies, you’ve got a hundred years here. There’s only one rule that I know of, babies: God damn it, you’ve got to be kind.” ― Kurt Vonnegut

Jerome Groopman wrote a piece in 2007 in The New Yorker called “What’s The Trouble?”, which is one of the more influential texts in my life, and which I’m often sorry I ever read. If you want to remain free of anxiety about your potential failings as a doctor, or about the potential failings of your doctor, maybe don’t read “What’s The Trouble?”. Read this! Or this.

Groopman, who is a haematologist, Harvard professor, and author of more than two hundred scientific papers as well as a New Yorker staffer, because he has more hours in his day than I have, introduced me in “What’s The Trouble?” to the field of cognitive error in medicine.

This is the study of medical mistakes we make by the very act of thinking.

As Daniel Kahneman, cognitive psychology pioneer, put it, people think fast and slow.

Slow thinking in medical decision-making involves accumulation of all necessary evidence, painstaking appraisal of that evidence, and demonstrably disinterested decision-making. This is what college trains us to do. Adherence to this model demands essentially unlimited time and resources and a degree of objectivity that no-one possesses.

Fast thinking is making decisions based on pattern recognition and gut instinct; this is what, as we mature clinically, we mostly do. Fast thinking is good. Efficiency is important; speed is not laziness.

Whenever we have limited time (which is always) and insufficient or ambiguous information (close to always) we rely, said Kahneman, on cognitive shortcuts, rules of thumb, called heuristics.

We are right to rely on heuristics, said Groopman. “Heuristics are indispensable in medicine; physicians, particularly in emergency rooms, must often make quick judgments about how to treat a patient, on the basis of a few, potentially serious symptoms.” And they are fine, until they’re not.

When cognitive shortcuts let us down, cognitive error happens, and we risk mistakes. The list of cognitive errors, outlined by Dr Pat Croskerry in The Importance of Cognitive Errors in Diagnosis and Strategies to Minimize Them, is long. Most cognitive errors are intuitive, if, by their very nature, irrational.

Anchoring is the tendency to rely too much on information obtained early in the decision-making process, even when it’s not the most pertinent piece of information, or when it’s later shown to be irrelevant. If in a phone call about a paranoid patient a colleague mentions a history of cocaine use, it’s going to be hard not to rank cocaine-induced psychosis high up the list of likely diagnoses, even if the patient insists he hasn’t had any cocaine in a year.

(There’s a great example of anchoring involving Mahatma Gandhi, not involving cocaine, here.)

Confirmation bias is the tendency to seek out information that confirms your initial provisional diagnosis and to dismiss information that challenges that diagnosis. You should seek out information that challenges your initial diagnosis; if you’re right, you won’t be able to prove yourself wrong. In the previous example, one might, if biased, doubt the history of abstinence more than the diagnosis of drug-induced psychosis; a negative drug test would be read to mean the patient used cocaine long enough ago that it’s out of his system, not, as he insists, that he’s clean.

Commission bias is the decision to act (e.g. to treat with an antipsychotic) because you are a doctor and you are supposed to help, even if your actions are not all that helpful (e.g. you might not know exactly what you are treating, and antipsychotics are serious drugs). “Don’t just stand there – Do something!” applies here.

The sunk cost fallacy is the failure to back out of an initial diagnosis because of your feeling that you have committed to that diagnosis – regardless of whether it is right. This must be bipolar II disorder because I said it was in a letter to a GP a few weeks ago.

I have honest to God caught myself thinking this way.

It looks so stupid when you write it down like that.

And there are cognitive errors that doctors make because we are influenced by our emotions much more than we like to admit – influenced, often, by the way we feel about our patients. We make bad decisions because we care about our patients too much (the affective error) and because, less comfortably, we care too little.

Making the affective error, we may act as if a clinical situation were how we wish it to be rather than how it is. If a patient looks like he has a psychosis, but we’re not 100% sure, and if receiving that diagnosis may have adverse consequences for the patient, we may hold off on making the diagnosis for longer than we should, because we don’t want to upset our patient. (Magical thinking is common in cognitive error.)

Conversely, if a patient is difficult or self-destructive, or engaging in behaviour that is not helping us to help him, we are prone to the fundamental attribution error: “the tendency to be judgmental and blame patients for their illnesses rather than examine the circumstances that might have been responsible”. Then, we need to be careful that we don’t let ourselves off the hook of helping that patient – he’s not helping himself, why should I?

In “What’s The Trouble?” and The Importance of Cognitive Errors in Diagnosis and Strategies to Minimize Them, Jerome Groopman and Pat Croskerry gave some great advice.

They said, primarily, to use the patient. Don’t just trust yourself. Invite hard questions from your patients and their families. Insist, politely, that they test you. Ask that they ask you why you’re doing what you’re doing – how do you know you’re making the right decision?

Put yourself through the effort of considering three or four diagnoses even when you are sure you know what the diagnosis is – you can be sure and still be wrong. Routinely ask the question: What else might this be?

On commission bias: Don’t just do something – stand there!

Croskerry also advised clinicians to develop insight into their own thinking style and be aware of the potential for error. I did this – I internalised Croskerry and wrote about cognitive error in psychiatry, and worried myself, and taught it to freaked-out trainees.

I figured the likelihood in psychiatry of error related to complexity and ambiguity was high; to be a psychiatrist is to be uncertain. Psychiatric diagnostic systems depend on subjective assessments of ambiguous information about inner experience. How much anhedonia is enough anhedonia to count towards a diagnosis of depression? How bad should concentration be to be “poor concentration”? How firmly held must a belief be to count as a delusion, and so to justify consideration of an antipsychotic? The potential for bias is huge, as is, accordingly, the potential to be wrong.

So I’ve always found cognitive error compelling and deeply unnerving.

I’m a busy clinician. I have the guts of 300 patients on my books at any one time. I make a lot of decisions; some without much time to make them; some with potentially serious consequences. All you have in your clinical practice is your ability to think through a decision. If you can’t trust your thinking, where are you? Knowing how wrong you can be is unmooring. You can easily run into crises of confidence when you are hyper-aware of your fallibility.

So how do we know we are making the right decisions?

We have to trust ourselves, up to a point, and trust those around us. We have to ask our teams how we are doing, and I do this to a degree that I worry might annoy my team. (“Oh Jesus get on with it.”) The occupational therapists, social worker, psychologist, and nurses on my team are well able to tell me when they think I’m wrong. I rely hugely on that.

I ask my trainees to tell me what they think of the plan I just put together. Some look perplexed – why are you asking me? – and some get it. Consultants have the final call – we are the people on the team who need to be OK with every decision anyone makes – but we need help. In avoiding overconfidence you have to have the confidence to ask for honest feedback, and hope that it is forthcoming.

And you rely on your patients as much as you rely on your colleagues. You have to trust your patients and their families to tell you what they really think and to co-design their care with you. My favourite words to write in a chart (after “Mood subjectively ‘grand'”) are “We agreed the following”. There’s something so reassuring about that.

The more I go on, the more I’ve realised that external assurances aren’t enough. You need a solid underlying principle if you want to be confident that you are doing the right thing by your patients. You need bedrock. This probably seems obvious. But I don’t remember being trained to have such a principle in college, or in basic or senior training.

And it’s back to Kurt Vonnegut, and I think it’s kindness.

Kindness is not talked about too much in medicine or psychiatry right now. It’s a little paternalistic and probably embarrassing. But kindness, I think, anchors you. How far wrong can you really go? If you are a clinician focused on being kind you are not focused on your own ego. You won’t make the mistakes we make for not allowing ourselves to be wrong. You were wrong. So what? Get over yourself. As Neal Maskrey wrote in the BMJ in 2013 in a beautiful blog post, The Importance of Kindness, “it isn’t about you… it’s about them”.

Aspiring to kindness also sounds self-righteous. But you can aspire to be kind and know that you won’t, obviously, manage it. Maskrey quoted the late American novelist David Foster Wallace: “It’s hard, it takes will and mental effort, and if you’re like me some days you won’t be able to do it, or you just flat-out won’t want to.”

A practical approach could work as follows. If you’re practising medicine, or psychiatry, and you feel like being kind or compassionate, go for it. It’s easy when you’re feeling it. And if you’re practising medicine, or psychiatry, and you don’t feel compassionate – do it anyway. Behave as if you were feeling compassionate. Make the effort. You know what you’d do if you were feeling compassionate – so do it.

David Foster Wallace knew that kindness doesn’t always come easy, but, still, he said, “Most days, if you give yourself a choice, you can choose to look differently”. Maskrey replied: “I’m trying David, every day I’m trying”.