The COVID-19 pandemic has shown us, with alarming frequency, that some people simply refuse to listen to expert advice.
Whether it is large groups refusing to wear masks or the prospect of many people refusing a vaccine, there are real and often dangerous consequences to grapple with.
How do we combat this phenomenon?
University of Waterloo psychology researcher Ethan Meyers talks about one method that might work.
Why do people refuse to follow expert advice?
This may reflect people’s tendency to be overconfident. Because people tend to think they know more than they really do, they may fail to credit experts with possessing any specialized knowledge about a topic. For instance, if I (falsely) believe that I possess extensive knowledge about how COVID-19 spreads, I might not think that scientists or public health authorities know anything that I do not. If that is what I believe, it makes sense that I would not listen to the experts.
How do you get people to start listening? Is it possible?
It’s possible! Our study shows that when the illusion of knowledge is exposed (compared to when it is not), people revise their beliefs more toward the opinion of experts than toward the opinion of random members of the public.
Exposing the illusion of knowledge presumably made participants recognize the limits of their knowledge on their own. That is, we didn't explicitly point out that they lacked extensive knowledge; the participants realized it themselves after we asked them to generate an explanation for how something worked (trying to explain something makes us aware of the gaps in our own knowledge).
Does this method work for every topic?
We were a bit surprised by the generality of the effect. The belief revision effect did not depend on people failing to explain the particular issue at hand. Instead, people "listened" to the experts even after failing to explain completely unrelated topics.
For example, in one of our experiments, we found that people started privileging the opinion of economists (over the public) on economic issues after failing to explain how a helicopter takes flight, a topic that has nothing to do with economics.
This suggests that recognizing we do not know as much as we thought might induce a general sense of intellectual humility, a feeling of our own ignorance, that leaves us more receptive to information from more valid sources (i.e., experts) over less valid ones (i.e., random members of the public).