THE END OF THINKING
Most of the information we spread online is quantifiably “bullshit”
by Nathaniel Barr | Professor of Creativity and Creative Thinking
The internet encourages the spread of information that is emotionally resonant but factually untrue. (AP Photo/Mic Smith)
In his well-known essay On Bullshit, Harry Frankfurt defines bullshit as speech that is designed to impress but lacks a direct concern for the truth. Under such a definition, a large portion of what we read online today is likely to be bullshit.
Some types of bullshit are political in nature, such as the misleading claim that only 16 mass shootings took place under President Bush’s tenure, compared to a whopping 162 under President Obama. Such claims are valued for their persuasiveness in making a point, rather than for their connection to reality. Other bullshit, such as clickbait, is motivated by the commercial mandates of the digital age, in which companies endlessly chase more page views, likes, followers, subscribers and customers.
Still more bullshit springs from vanity and hunger for attention. Reddit rewards users with “karma” for popular comment and link contributions, a system that can compel people to share bullshit or create their own. Twitter, with a 140-character limit, seems exceptionally conducive to the spread of bullshit, as the brevity of the medium demands vagueness disguised as pith.
The internet has ushered in the Age of Bullshit.
Considering that the internet has greatly increased our access to unreliable information, and that bullshit still passes through more traditional channels such as newspapers, magazines, television, radio, and face-to-face conversations, it seems reasonable to suggest that people today are inundated with more bullshit than ever before. The internet has ushered in the Age of Bullshit.
Yet despite the prevalence of bullshit, it has only sparingly been discussed from an academic perspective. Frankfurt’s famous essay effectively explored the essence of the bullshitter. But it seems no research to date has explored the characteristics of the bullshittee. That is, what type of people are most likely to believe in bullshit? Given the ubiquity of bullshit online, this seems like an especially important issue to address sooner rather than later.
To begin studying the bullshittee from a psychological perspective, some of my colleagues and I, led by Gordon Pennycook, undertook a systematic empirical analysis of the reception and detection of a particular type of bullshit: the pseudo-profound. Pseudo-profound bullshit refers to statements that are meant to imply deep meaning but are actually vacuous. With opaque language, one can imply much while saying little.
What type of people are most likely to believe in bullshit?
To assess the reception of such bullshit, we presented approximately 800 participants across four studies with statements ranging from the mundane to the meaningful. We included some bullshit too. To produce the bullshit, we relied on websites that arrange buzzwords into arbitrary but syntactically valid sentences. For example, some buzzwords were drawn from sources such as New Age icon Deepak Chopra’s Twitter feed, and arranged into pronouncements like “Hidden meaning transforms unparalleled abstract beauty.” We asked participants to tell us how profound they found these statements, and correlated these ratings with other psychological variables.
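The buzzword generators we drew on work, in essence, by slotting random words into a fixed grammatical template. Here is a minimal sketch of that idea in Python; the word pools and the six-word template are purely illustrative assumptions, not the vocabulary or structure of the generators actually used in the study.

```python
import random

# Hypothetical buzzword pools for illustration only; the real generators
# use their own, much larger vocabularies.
ADJECTIVES = ["hidden", "unparalleled", "abstract", "infinite", "boundless"]
NOUNS = ["meaning", "beauty", "consciousness", "potential", "wholeness"]
VERBS = ["transforms", "transcends", "nurtures", "illuminates", "awakens"]

def pseudo_profound_sentence(rng=random):
    """Arrange random buzzwords into a syntactically valid but vacuous
    sentence of the form: Adjective Noun Verb Adjective Adjective Noun."""
    return (f"{rng.choice(ADJECTIVES).capitalize()} {rng.choice(NOUNS)} "
            f"{rng.choice(VERBS)} {rng.choice(ADJECTIVES)} "
            f"{rng.choice(ADJECTIVES)} {rng.choice(NOUNS)}.")

print(pseudo_profound_sentence())
```

Because the words are chosen independently of one another, every output is grammatical yet carries no intended meaning, which is exactly what makes such sentences useful as stimuli: any profundity a reader finds in them is supplied by the reader.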
A clear pattern emerged in the types of people who were more likely to find profundity in the meaningless. People who were more religious, more likely to believe in the paranormal, and more accepting of alternative medicine were more receptive to the bullshit. People who were less analytic and intelligent were also more likely to find the bullshit statements to be profound than their more reflective and intelligent counterparts. Our research also suggests that people who are generally biased toward finding things profound are more receptive to bullshit.
One’s tendency to believe bullshit might well be considered a mental shortcut.
In other words, the participants in our study didn’t seem to be thinking deeply about seemingly vacuous statements and constructing meaning from them. Instead, they were uncritically accepting the (rather blatant) bullshit as meaningful based on how it felt when they first encountered it. Therefore one’s tendency to believe bullshit might well be considered a mental shortcut. When intuitive thinkers are presented with seemingly impressive words, they may assume deep meaning without engaging in analytic thinking to reflect on whether there is something more to it.
The very nature of the internet may encourage a shallow kind of information processing that facilitates belief in bullshit. Nicholas Carr, for example, has argued that the internet is transforming us into skimmers. Rather than dive into the world of words, we prefer to superficially skip along the surface, darting between open tabs and blinking messenger windows.
Empirical research supports the idea that using the internet shapes the way we think. Betsy Sparrow, an assistant professor of psychology at Columbia University, and co-authors have found that people use the internet as an external memory storage system, making them less likely to remember the information they look up. And researchers at Yale University recently demonstrated that searching online for information leads us to over-estimate how much we actually know and understand. Our tendency to offload cognitive functions to our computers forecasts a future in which we think less about what we encounter online, which could have consequences for our receptivity to bullshit.
The correlation between lower levels of analytic thinking and receptivity to bullshit is particularly important when it comes to helping us understand why people can find meaning in meaningless statements online. Intuitive thinkers—those who are more likely to rely on their initial impressions when reasoning—rated meaningless statements as more profound. Reflective thinkers, who are more likely to reconsider their initial impressions after giving a subject analytic thought, saw through the bullshit.
Intuitive thinkers may not only be more prone to accepting bullshit, they also might be more likely to find it.
Interestingly, in other research exploring the relationship between people’s thinking styles and the way they use technology, we have shown that more intuitive thinkers are relatively more likely to rely on search engines for information. Reflective thinkers seem less prone to offload their thinking to devices. Thus, intuitive thinkers may not only be more prone to accepting bullshit, they also might be more likely to find it.
But while less analytic types are more prone to buy into bullshit, all of us frequently fail to engage in reflective thinking. Decades of psychological research clearly shows that people tend to be cognitive misers, only thinking hard about things when they must. Thus, we are all likely to fall prey to bullshit at some point.
Sometimes believing bullshit might be relatively inconsequential. Being overly impressed with an exaggerated story of how a Facebook friend shared a meal with Johnny Depp on a big night out is unlikely to ruin anyone’s life.
However, our susceptibility to some kinds of bullshit may be more costly.
People tend to be cognitive misers, only thinking hard about things when they must.
For example, presidential candidates in the upcoming US election are slinging bullshit at an alarming rate (perhaps some more than others). Their bullshit can easily spread online, where legions of supporters take to Twitter and Facebook to talk politics despite the fact that many of them are likely ill-informed.
Donald Trump, for instance, has repeatedly claimed that thousands of Muslims in New Jersey celebrated the tragedy of 9/11. Many of his supporters took to social media to back his view, despite abundant evidence that this was bullshit. Trump displays callous disregard for the truth in order to impress certain demographics of voters. The spread of this kind of bullshit serves to further ostracize a marginalized group. It is but one manifestation of the potentially severe consequences of bullshit in politics.
Also worrisome is the prospect of the spread of bullshit related to health and medicine. For example, research has shown that the internet has played a key role in the rise of the anti-vaccine movement. Anti-scientific sites that argue vaccines are ineffective and dangerous are common and influential. Researchers have found that convincing anti-vaxxers that this information is bullshit is extremely difficult. But people who refuse to vaccinate their children risk dire consequences for themselves and the rest of us.
Research also suggests that some bullshit information regarding alternative treatments found online can pose significant harm to cancer patients. Many have experienced complications or died because they relied on alternative health treatments of the sort readily found online. (High-profile examples of cancer patients who avoided mainstream medicine for alternative treatments include Penelope Dingle and Steve Jobs.) Even relying on such treatments as complements to traditional medicine is risky, since these approaches can interact with more effective treatments in dangerous ways.
It’s not only people on the fringes who may be tempted to accept advice from less than reputable sources online.
It’s not only people on the fringes who may be tempted to accept advice from less than reputable sources online. Deepak Chopra has amassed millions of Twitter followers, despite a multitude of scientists and skeptics calling his views on even the most basic aspects of biology and physics unscientific and inaccurate. These views include denying evolution and misrepresenting quantum theory.
In our research, we discovered that people who found Chopra’s tweets to be profound also tended to think that bullshit (here defined as random buzzwords arranged into syntactically valid sentences) was profound. When Chopra learned of our research and this unflattering result on Twitter, rather than counter with evidence that his arguments are based in truth, Chopra thanked us, as it is “getting [him] more speaking engagements and new book offers.” This response exemplifies the desire to persuade and profit, rather than to be precise, which characterizes the Age of Bullshit.
The task ahead is to help people understand how to separate the signal from the noise online. After all, although the internet has exposed us to more bullshit than ever before, it has also given us more access to accurate information. The public has the potential to participate in the political process at an unprecedented scale, and to make use of the incredible access to information in all areas of life. We have at our fingertips the collective knowledge of the world’s many experts, and the opportunity to share their insights instantly and freely.
The task ahead is to help people understand how to separate the signal from the noise online.
Thankfully, the most consequential types of bullshit seem to be the easiest to refute. Though it may be difficult to determine whether your Facebook friends are as socially in demand as they suggest, finding the answers to issues of real consequence has never been easier. Rather than rely on the word of a person we know or simply decide the truth for ourselves, we can draw information from a variety of sources online to best cut through the bullshit.
Based on the preliminary research we have conducted so far, two general remedies for being overly receptive to bullshit are to receive more education—especially about what constitutes a good argument and evidence—and to more frequently engage in reflective critical thinking.
Because people naturally wish to limit analytic thought, which is costly in both time and effort, these remedies are likely insufficient in the battle against bullshit unless they can be made palatable. An increasingly important challenge for those with access to the truth, whether through expertise in politics, science, or other subjects, is to find ways to convey the truth in impressive ways that both inform and entertain.
Accurate but overly technical or otherwise inaccessible copy is unlikely to compete with catchy clickbait and cyber snake-oil salesmen, no matter how much truth may be found therein. Translating complex ideas into digestible content will play an important role in encouraging people to read and share reliable information.
People naturally wish to limit analytic thought.
There are positive trends in this direction, whereby the same platforms that often spread bullshit are being used to disseminate the truth. Reddit’s askscience subreddit, for example, affords people an opportunity to ask experts about issues of interest or concern. Many popular scientists and academics are tackling bullshit via Twitter. The internet is a double-edged informational sword.
We all bear responsibility in the war against bullshit. We should discourage bullshitters by resisting the temptation to cave to the clickbait and contribute to page views. We should hesitate to spread articles that provoke immediately strong emotional responses but lack reasoned arguments. Bystanders with knowledge of a given area must continue to call bullshit on charlatans in a way that encourages reflective critical thinking. And simple awareness that intuitive assessments can lead us to fall prey to bullshit may help us to check our instinctual reactions to what we read online, encouraging us to think again, and more deeply.
So rather than lament that we live in the Age of Bullshit, we should strive to make best use of the internet, with the hope of eventually finding our way into a more sensible age. It is crucial that our positions regarding social, political and scientific issues be rooted in evidence, of which more is available now than ever before, rather than the emotional impressions that bullshit seeks to evoke. And we should all strive to avoid, wherever possible, spreading our own bullshit.