Don't Know Much

Octavian Report: Can you describe the effect that bears your name, how it works, and what it means in terms of the decisions people make in everyday life?

David Dunning: It comes from work I did with Justin Kruger, who's now at New York University. It's a family of effects, but the one that everybody focuses on is the idea that people who are incompetent, who don't know very much, also don't recognize the gaps or the glitches in their expertise. And if you think about it logically, the reason for this becomes quite clear: to know the limits of your knowledge, you would have to have that knowledge in the first place. If you aren't very knowledgeable, if you don't have much expertise, you lack the very expertise you need to recognize that deficiency.

OR: What has the global response to COVID — and within that, the U.S. response — looked like to you, through the lens both of this effect and of the broader themes of your research?

Dunning: Well, the thing to recognize about COVID is that the world has experienced pandemics before, but not this specific pandemic. We're in, in many ways, a very new situation. It's exactly when you're in a very new situation that there are a lot of unknown unknowns.

In terms of our response, there are two different levels to think of. There's the professional, the expert, the governmental/institutional level, and people there have been caught a little unprepared by what's going on. But people in their everyday lives, in their household lives, have also been caught somewhat flat-footed, not only in knowing how to deal with this pandemic specifically, but in dealing with a situation in which we are all low performers. We are all more than slightly ignorant of what's going on, of what we should do, what we should ask, whom we should pay attention to. We're learning to listen now — exactly when we're taking the test. Some of us might be learning the lesson only after we take the test.

OR: Are there specific things that you've seen in the news about policy processes that you would tease out as being examples of this: learning the lesson while taking the test?

Dunning: I think one of the problems is that a lot of people are giving all their exam answers at the same time. We, as a public, have had to become quite skilled, and in a hurry, at deciding whose answers we listen to. I mean, when you talk about policy pronouncements, when you talk about predictions of the future — what you should be doing, is there a drug you should take, how long is it going to be before there's a vaccine — there are a lot of people giving a lot of different answers. Some of them are quite expert. Some of them are people we should be listening to. Other policy recommendations are coming not from policy makers but from, for lack of a better term, amateur epidemiologists, who are taking whatever corner of knowledge they have and applying it to what they think is going on with COVID and how they think people should respond.

There's a lot of information floating around. Some of it is deliberate misinformation; it's fraudulent. A lot of it is well-intentioned but misguided. But in there, there is some truth. There is some policy we should be listening to. I think a core issue is for each of us, individually, in our own homes, to try to identify which voices we should be listening to, which voices we should find interesting but plan to check on, and also to make sure that the voices that are expert somehow become louder than a lot of the other noise out there.

OR: We live in perhaps the most information-saturated era in human history. Is this a moment at which we might rethink our relationship to information?

Dunning: The answer, of course, will be yes and no. There will be some rethinking. How much it permeates the policy realm, how much it reaches the public, and how much of it becomes part of public education are extremely interesting questions.

The thing that makes the future uncertain is who exactly does that rethinking, and how long it lasts. Because part of the problem we suffer is that each generation often has to relearn lessons that previous generations already learned. If only we could teleport some of our great-grandparents from 1920 to the present and have them talk about the Spanish Flu, there would be precious lessons we could all learn there.

For my purposes, I think there is one thing that is certain for the future, and that is that we are awash in information and data in ways that we haven't been before. So the one thing that I always keep in the front of my brain is this idea that data is not information, information is not knowledge, knowledge is not understanding, and understanding is not wisdom. You can have a lot of information but you have to really think hard about it. You have to think it through so that you can achieve understanding and maybe a little bit of wisdom.

There's one thing that I would point out here. It's very easy, especially in these times, to latch onto whatever message you get and not think about the credibility of its source. There's a lot of work in psychology showing that we pay a lot of attention to the strength of a message but give short shrift to questions about the credibility of its source. I think we would do better if we thought more about that second question of credibility — and also if we were a little bit more expert in how to judge credibility.

One thing I would suggest is that you judge credibility by cross-referencing. This one person or this one source is saying this. What are other sources saying? Is there a consensus or not? Is this something I hear from independent sources that are pretty much saying the same thing, or is it just some random, brand-new, isolated piece of information? What we want to do there is check on it: check its credibility, its provenance, for example.