Jonathan Fugelsang


In an era when information spreads rapidly, distinguishing what’s meaningful and profound from misleading nonsense has never been more vital to a productive society.

Dr. Shane Littrell (PhD ’21) and Dr. Jonathan Fugelsang, a professor in the Department of Psychology, have embarked on a journey to unravel the complex intricacies of the human mind — especially when it comes to detecting and understanding misinformation, or as they aptly put it, B.S.

Their work establishes the concept of a “B.S. blind spot” and highlights the implications of overconfidence in misinformation detection.

The study asked participants from Canada and the U.S. to rate 20 statements from a B.S. receptivity scale as profound or not profound; each participant’s B.S. detection accuracy was then scored on how well they distinguished genuine statements from nonsense.

Participants were also asked to rate their confidence in their own B.S. detection ability and to estimate how their performance compared with that of their peers.
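
The comparisons behind these findings can be pictured with a short sketch. The Python snippet below is an illustration only, not the researchers’ analysis code: the data layout, the simple proportion-correct scoring rule, and the 0–100 self-rating scale are assumptions made for demonstration.

```python
# Illustrative sketch (assumed data layout, not the study's actual pipeline):
# compute a simple B.S. detection accuracy score and an overconfidence gap.

import pandas as pd

def detection_accuracy(ratings: pd.DataFrame, is_nonsense: pd.Series) -> pd.Series:
    """Proportion of statements each participant classified correctly.

    ratings: one row per participant, one column per statement,
             1 = rated 'profound', 0 = rated 'not profound'.
    is_nonsense: 1 if the statement is nonsense, 0 if genuinely meaningful.
    """
    # A response is correct when it disagrees with the nonsense flag:
    # calling a genuine statement profound, or a nonsense statement not profound.
    correct = ratings.ne(is_nonsense, axis="columns")
    return correct.mean(axis="columns")

def blind_spot_gap(accuracy: pd.Series, self_rating: pd.Series) -> pd.Series:
    """Self-assessed percentile (0-100) minus actual percentile.

    Positive values resemble the 'B.S. blind spot' (believing you are better
    than your measured performance suggests); negative values resemble the
    underconfident 'blindsight' pattern seen among the best detectors.
    """
    actual_percentile = accuracy.rank(pct=True) * 100
    return self_rating - actual_percentile

# Example usage (hypothetical DataFrames/Series):
# gap = blind_spot_gap(detection_accuracy(ratings_df, nonsense_flags), self_ratings)
```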

Based on the results, the researchers found that people who are the worst at detecting B.S. not only believe they are considerably better at it than they actually are, but also that they are better at it than the average person. This lack of metacognitive insight, dubbed the “B.S. blind spot,” leaves the individuals most prone to falling for nonsense especially vulnerable.

Conversely, those who were best at detecting B.S. were not only underconfident in their performance but also mistakenly believed they were worse at it than average, a pattern the researchers call “B.S. blindsight.”

The bias blind spot, a related concept, suggests that people generally perceive themselves as less biased than others. This parallels the “B.S. blind spot” in the sense that overconfidence in one’s ability to identify misinformation is itself a cognitive bias, a tendency of the human brain to selectively filter and process information according to personal experiences and preferences. The implications are significant, as these biases affect an individual’s susceptibility to misleading information.

Littrell and Fugelsang acknowledge that their research has focused on Western, English-speaking populations. They caution against overgeneralizing and recognize the need to account for cross-cultural differences in how people perceive and respond to B.S., which is crucial for building a comprehensive picture of how misinformation is received around the globe.

“Awareness is the first step. We need people to understand that there is misinformation out there, and we need to find ways to address it and reduce it,” Fugelsang says. “On a more long-term strategy, we need to teach misinformation awareness, healthy skepticism and encouraging reflective thinking early on in life — as early as elementary school.”

For adults, they suggest anyone encountering information related to health, finance or politics should slow down and critically reflect on what’s presented to them, especially online. This pause and reflection can help them to become discerning consumers of information, making them less likely to fall for misleading content.

“There are various ways we encounter this in our daily lives. Currently, our top priority lies in addressing the propagation of B.S. and misinformation, especially when it influences matters like your money, your health and your vote,” Littrell says. “Take a step back, and be skeptical, especially regarding those three areas.”

Littrell and Fugelsang underscore the difficulty of correcting misinformation once it has spread widely. The B.S. asymmetry principle suggests that the effort required to correct misinformation is far greater than the effort to create it.

“Our goal is to create more effective interventions by helping individuals recognize these challenges. However, it's essential for people to be aware of these issues before we can make progress,” Littrell explains.

The proliferation of misinformation in the digital age is a growing concern, and its dissemination through online platforms only magnifies the challenge.

Next up: applying their findings to both business and political contexts by identifying and mitigating the use of jargon and persuasive B.S., and ultimately helping people discern rhetoric from substance.

In an age where misinformation can spread like wildfire, this research is not just an academic pursuit to understand the human mind but a practical effort to protect and empower people. As they continue to unravel the intricacies of the human mind when it comes to detecting and understanding B.S., their work might help develop a roadmap to avoid stepping into it.