A recent paper in the Medical Journal of Australia (here) provides a nice overview of the biases that lead doctors to overtreat and overinvestigate, but also offers useful solutions that we need to act on.
A cognitive bias occurs when existing beliefs persist in the face of opposing evidence, leading to illogical conclusions and actions. In other words, what you already believe trumps any new knowledge that challenges that belief, leading you to accept confirmatory evidence and (unjustifiably) discard evidence that contradicts your beliefs.
Cognitive biases can act like heuristics (rules of thumb) in that they can save us time by short-cutting the difficult process of critically evaluating our decision making and any new evidence. Unfortunately, they result in wrong thinking and wrong actions.
The types of cognitive biases listed in this study (below) are interesting, and you can see how each of them works.
- Commission bias. The regret from harms resulting from providing an unnecessary service (commission) is less than the regret from failing to provide a treatment (omission) that might have worked, even if the chance was low.
- Attribution bias (illusion of control). Improvements seen after treatment (or even cases where patients do not return) are assumed to be benefits caused by the treatment provided, when the patient may have improved without treatment.
- Impact bias. Doctors’ overestimation of effectiveness and underestimation of harms.
- Availability bias. The strong influence of the memory of dramatic cases on decision making.
- Uncertainty bias. When in doubt, it often seems better to provide treatment than not to provide treatment.
- Representativeness bias. Benefits seen in one group of patients make doctors believe that the treatment is likely to be effective in other groups (indication creep).
- Sunk cost bias. The time, effort, resources and education put into a specific treatment lead doctors to keep using it, hoping that with further input it will be shown to be effective.
- Groupthink. The affirmation received by knowing that everyone else is doing the same thing.
How to mitigate these biases is another issue, but one addressed in the same study. The authors' suggestions are summarised as follows:
- Challenge doctors to ‘think about their thinking’. Education about biases and how to overcome them.
- Telling a story. Using case discussions at clinical meetings (examining the decision making, considering alternatives), or highlighting specific cases (in reports), makes treatment errors more memorable and more likely to influence practice (harnessing availability bias).
- Exposing clinicians to information about high- and low-value treatments (education).
- Shared decision making. Often, giving patients clear information about relative risks and benefits leads to different decisions than if the doctor is the sole decision maker.
- Decision support for clinicians. Published guidelines and recommendations can support clinicians who want to do the right thing but feel it is easier or expected of them to continue ineffective treatments.
The bottom line:
Overtreatment and overdiagnosis are problems in medicine, and a major obstacle to correcting them is the cognitive biases held by clinicians. Recognising these biases, and having tools to overcome them, is important in tackling these problems and making medicine more efficient and effective.
I am a surgeon with an interest in evidence-based medicine: the science behind medicine. I am interested in finding the true risks and benefits of interventions, and how these often differ from the perceived risks and benefits, as seen by the public, the media, and doctors themselves.