
Overconfidence and incompetence

Something we’re seeing a lot of during the coronavirus crisis is the rise of the armchair epidemiologist: the men (it’s mainly men) presenting themselves as authoritative voices about things they have no expertise in.

Sarah Weinman, for InsideHook.com:

They are lawyers, former reporters and thriller writers, Silicon Valley technologists, newspaper columnists, economists and doctors who specialize in different parts of medicine. Their utter belief in their own cognitive abilities gives them the false sense that their speculation, and predictive powers, are more informed than the rest of ours.

They’ve been with us for a long time, of course – the blogging world is full of them – but coronavirus has given some of them a much bigger audience, and that has made them dangerous. The UK press and social media are full of grifters speaking with great certainty about things they know nothing about, and those things currently include how to deal with a lethal global pandemic.

There is a name for this: the Dunning-Kruger effect. The effect is often summarised as “stupid people are too stupid to know they are stupid”, but it’s more nuanced than that. It’s not that the people involved are stupid; many of those who clearly exhibit the effect are very clever. It’s that they are blinkered: they lack the knowledge to understand what knowledge they are lacking.

For example, let’s say you’re an economist. If you turn your attention to the likely outcome of the coronavirus, you may come up with different answers than the virologists and epidemiologists do. That doesn’t necessarily mean the virologists and epidemiologists are wrong; it’s much more likely that you’re making ignorant assumptions and rookie mistakes that people in the field don’t make. You don’t know that you’re making them, because this isn’t your area of expertise.

Where the Dunning-Kruger effect comes into play is when you decide that if the experts disagree with you, it must be the experts who are wrong.

Who better to speak to about the Dunning-Kruger effect than David Dunning, one of the two researchers who gave it its name? That’s who Sarah Weinman interviewed.

The problem is that some people can take things they know and misapply it to this new situation. A lot of people think, “Oh, this is a flu,” so they use what is common knowledge of the flu to guide them. But this virus is not the flu. Knowledge is a good thing, but they don’t realize it’s a misapplication.

I used the example of an economist because that’s a field Dunning specifically mentioned.

Confidence comes from knowing something, but not realizing you don’t know everything you need to know. If you’ve been rewarded as a successful economist, you deal with formal models in math, and you have confidence in what you do. This can be true of all of us in our area of expertise.

That confidence may be perfectly justified in economics, but that doesn’t necessarily mean that you have anything valuable to say in other fields.

Elon Musk is a great example of this. The Tesla and SpaceX boss builds electric cars and launches rockets into space. And when a group of boys got stuck in a cave in Thailand, Musk rode to the rescue with a special high-tech submarine to save them.

The submarine was useless, because it couldn’t navigate the caves. When criticised, Musk called an expert diver – one of the divers who actually helped rescue the trapped boys – a “pedo”.

Musk has since moved into providing ventilators for coronavirus patients. The machines he supplied are not ventilators. It’s surely just a matter of time before he calls the doctors “pedos” too.

Here’s one example of why these overconfident men are dangerous: Richard Epstein. Epstein has arguably contributed to the US death toll: his prediction that the coronavirus would kill only 500 Americans was widely shared in US conservative circles and helped inform the US government’s response to the pandemic.

As NY Mag reports:

A week later, Epstein conceded that he had committed a math error, and the real number would be 5,000 deaths, though “it, too, could prove somewhat optimistic.”

At the time of writing, the US toll is about to pass 50,000 deaths.

…Somehow this experience has not shaken Epstein’s confidence in his own ability to outthink the entire field of epidemiology.

There’s an astonishing interview with Epstein in The New Yorker where he throws a tantrum.

O.K. I’m going to tell you. I think the fact that I am not a great scholar on this and I’m able to find these flaws or these holes in what you wrote is a sign that maybe you should’ve thought harder before writing it.

What it shows is that you are a complete intellectual amateur. Period.

O.K. Can I ask you one more question?

You just don’t know anything about anything. You’re a journalist. Would you like to compare your résumé to mine?

Part of the reason grifters have achieved such prominence is that the people in authority often exhibit the Dunning-Kruger effect too.

The UK government is a stellar example, but you can also see it in authorities urging us not to wear masks on the grounds that they don’t really prevent you from getting the virus. Proper masks do, which is why health workers use them, and even basic ones have a proven effect in reducing the risk of you spreading the virus to others if you don’t realise you have it. When official sources are often wrong, it creates a vacuum that grifters are all too ready to fill with bullshit.

In the MetaFilter discussion of the article, one commenter posted:

Science and these various “experts in stuff” both operate in uncertain environments, but treat uncertainty in totally opposite ways.

…Experts in stuff… use uncertainty as a means to an end, so they generally try to increase it. Since science shows its cards with regards to uncertainty, they can always argue a reasonable level of skepticism of science. Then they can turn around and present some alternative facts and arguments about their own position on the matter. The idea isn’t about the next researcher, or a process to eliminate uncertainty, it’s simply to be convincing. They don’t care if they are right – only if they are perceived as right.

This is why these “experts” can be so troubling to deal with. They’ll stake a claim against anything, as long as it gets them to their goal. Sometimes it’s just to be respected, but sometimes it can be much darker.