Fake news mind traps

Understanding cognitive bias is a key to beating the modern-day media scourge

Protesters displaying a sign referring to fake news during the Women's March in Washington on Jan 21, 2017. PHOTO: AFP

Debates surrounding how to counter fake news typically focus on what regulation, technology companies, media literacy and fact-checking can do.

The focus on different counter-measures, while important, tends to omit the one thing common to all types of misinformation - the human psychology behind information use. Understanding why people respond to fake news, or misinformation in general, is key, because the success of any counter-measure ultimately hinges on the individual.

A common assumption is that people are rational - they will do the right thing when confronted with information, which includes evaluating the source and the presence - or absence - of evidence. Such an assumption seems logical, especially when it is easy to verify articles with just a few keystrokes.

So why do people still fall prey to fake news? A key explanation is that human beings have inherent biases that affect their ability to discern the truthfulness of a piece of information. Understanding the cognitive biases - confirmation bias, motivated reasoning and optimistic bias - that shape responses to information is a necessary first step in deciding how to counter fake news.

In the late 1970s, researchers from Stanford University conducted an experiment involving students who held different positions on capital punishment. As expected, students who supported capital punishment found "data" showing that capital punishment had a deterrent effect to be highly credible. The same group rated data indicating that capital punishment had no deterrent effect as unconvincing. The reverse was true for students who were against capital punishment.

Hence, at "first touch", hardwired biases are already affecting the response to information. The Stanford example shows confirmation bias at work - assessing new information based on how compatible it is with pre-existing beliefs. The greater the compatibility, the more likely the new information will be accepted as true. The lower the compatibility, the more likely we will reject it.

The same bias applies to how people process information on homosexuality. Studies have found that individuals with greater prejudice against homosexuals find pseudoscience that confirms their stereotypes more convincing than individuals with less prejudice do.

Human beings are also creatures of comfort, even at the cognitive level. To reduce mental discomfort that arises from incompatibility between the information received and pre-existing beliefs, people engage in motivated reasoning. This happens when information is evaluated in a way that suits desired conclusions and outcomes.

In the case of fake news, it explains why people may spread a piece of news even if they are not sure of its veracity - they justify to themselves that it is better to be safe than sorry, that they are doing the "right" thing, such as warning family members and friends of potential danger.

WHY DO ECHO CHAMBERS PERSIST?

Much has been said about the dangers of echo chambers since the early days of media use; the phenomenon is not exclusive to social media. Yet the problem has persisted. Why is this so? Could it be that people believe they are not susceptible to the dangers of filter bubbles, unlike others around them?

Optimistic bias, the often faulty belief that one is less susceptible to risks than others, has been demonstrated among people of all ages and different backgrounds. It encourages complacency and prevents people from taking precautionary measures against problems such as echo chambers and fake news.

While the government, media, technology companies and educators look at what can be done to combat fake news, it is important to be aware of the powerful influence of cognitive biases.

There are two implications for media literacy. One is to include psychological perspectives in the media literacy curriculum. Curriculum designers could include real-life examples to make the concepts more accessible. They could also incorporate activities that encourage students to reflect on their prejudices and biases, and their susceptibility to being trapped in information filter bubbles.

Information users should also be urged to consider the consequences that the information they share could have for their loved ones. While they may be motivated by benign intentions, the outcome could be anxiety and fear among family members and friends. The message should be clear - check before you share.

These measures can help to reduce the negative effects of all three biases.

Finally, the frequent targets of fake news and misinformation - corporate entities and public institutions - should not underestimate the power of human bias. They should acknowledge the prejudices that stem from people's cognitive biases, and incorporate this evidence into their communication with the public.

It is increasingly insufficient to just have the skills to call out fake news and misinformation. An understanding of psychological motivations is necessary, too.


  • Elmie Nekmat is assistant professor at the Department of Communications and New Media at the National University of Singapore. Carol Soon is a senior research fellow from the society and culture research cluster of the Institute of Policy Studies.


A version of this article appeared in the print edition of The Straits Times on September 22, 2017, with the headline Fake news mind traps.