Guest Contributor – Carolyn Thomas
After my first non-fiction book was published, something unusual happened to me. Overnight, I became an “expert”. During book tours, I did newspaper, magazine, television and radio interviews. I was invited to speak at conferences, writers’ festivals and schools during National Book Week. Each presentation led to more invitations. I became the go-to writer for foreign Sunday paper editors who needed a West Coast perspective on my subject. I was even offered my own regular weekly broadcast gig on a popular TV newsmagazine show.
Why? Because I’d written a bestselling book. By the time my second book came out four years later, I was already a pretty well-known local “expert” on my subject matter. Even my publisher treated me with newfound respect (and a hefty advance).
Of course, I wasn’t really an expert at all. I had, as David H. Freedman describes it, “simply done a good job of gathering up a lot of information”. I never pretended to be an expert, but writing books and doing live presentations or media interviews sure seemed to make other people believe I was.
Freedman himself did a radio interview with Phil Dobbie of the Australian edition of BNet News about his own book, called “Wrong: Why Experts Keep Failing Us – and How to Know When Not to Trust Them.”
His book reveals some disturbing realities about those we really do consider to be true experts. For example:
- About two-thirds of the findings published in leading medical journals are refuted within a few years.
- As much as 90% of physicians’ medical knowledge has been found to be substantially or completely wrong.
- There is a one-in-12 chance that a doctor’s diagnosis will be so wrong that it causes the patient significant harm.
- Economists have found that all studies published in economics journals are likely to be wrong.
- Professionally prepared tax returns are more likely to contain errors than self-prepared returns.
- Half of all newspaper articles contain at least one factual error.
So why, then, Freedman wonders, do we blindly follow experts?
He suggests it’s because we live in a very complex world. We like simple answers. We don’t want to hear an expert go on and on about how something might be true for some people, under some conditions, but not for others.
We want to hear “This drug is better than that drug!” or “Eat this, not that!” or “Invest your money here!” These statements, he claims, typify the kinds of experts who get published. And the media likes to further sensationalize what the experts say.
Editors of prestigious medical journals, for example, look for submissions that make it appear these experts are advancing science and giving us important answers.
But doesn’t the peer review process help to protect us from the potential errors of medical research experts? Not necessarily, says Freedman.
“Peer review is not designed to turn up fraud, or to expose experts who are fudging data, or to reveal authors’ hidden biases. Peer review does not protect us.”
Freedman adds that unconscious personal bias is also an important problem in science.
“We have biases about what we hope is true. If we’re researching a pill to cure a certain medical condition, we have an unconscious bias towards reaching that simple probable answer that might solve our problems.”
This is also what’s known as confirmation bias: a type of selective thinking in which we tend to notice, and to look for, whatever confirms our beliefs. Not only that, but we then tend to ignore, undervalue, or simply not look for anything that contradicts those beliefs.
As a heart attack survivor who was misdiagnosed with acid reflux in the E.R. and sent home (despite presenting with textbook heart attack symptoms like crushing chest pain and pain radiating down my left arm), I’m no stranger to the damage caused by experts who are wrong.
Women under the age of 55 are in fact seven times more likely than men to be misdiagnosed in mid-heart attack and sent home. About 5% of autopsies find clinically significant conditions that were missed and could have affected the patient’s survival. And over 40% of medical malpractice suits are for “failure to diagnose.”
A 2003 article* in the journal Academic Medicine describes the mistakes that medical experts make as “thinking errors”. They include:
- Anchoring bias – locking on to a diagnosis too early and failing to adjust to new information.
- Availability bias – thinking that a similar recent presentation is happening in the present situation.
- Confirmation bias – looking for evidence to support a pre-conceived opinion, rather than looking for information to prove oneself wrong.
- Diagnosis momentum – accepting a previous diagnosis without sufficient skepticism.
- Overconfidence bias – over-reliance on one’s own ability, intuition, and judgment.
- Premature closure – similar to confirmation bias, but more like jumping to a conclusion.
- Search-satisfying bias – the “eureka” moment that stops all further thought.
Freedman adds that when experts’ errors are published in scientific journals, it’s often the original erroneous finding that gets quoted and built upon in later papers. If and when a serious error is exposed and the finding retracted, the correction is often ignored.
Scientists are still our best bets, flaws notwithstanding, but sometimes the unskilled gurus out there get it right and the scientists get it wrong, warns Freedman.
“I wish I had a simple 5-step recipe, but I don’t. Let’s take information from as many sources as possible. Educate yourself. Get a good cross-section of opinions from credible sources. Use common sense. Be as informed as possible.”
And to fend off critics who may accuse him of being one of these “experts” himself, he told the Australian edition of BNet News:
“I’ve included an entire chapter in my book about all the ways I’ve been wrong. Don’t take my word for anything. Even though I’ve gathered up a lot of information and done the best I can do, I don’t ever claim to be an expert.”
Reproduced with permission from The Ethical Nag: Marketing Ethics for the Easily Swayed by Carolyn Thomas – http://ethicalnag.org. Carolyn has over 37 years’ experience in journalism, marketing and public relations. She has a particular interest in medical research and Big Pharma marketing issues.
The original article appears here