3 key strategies to fight fake news

Insight looks at three key proposals among the 22 put forward by the Select Committee on Deliberate Online Falsehoods: a public education framework, measures to counter state-sponsored disinformation campaigns and getting tech companies to step up.

1. Teaching people to tell fake news from real news

In this era of fake news, it has become more important than ever to ensure people are equipped with critical thinking skills, so that they can discern truth from falsehood, effectively interrogate information sources and understand how and why online falsehoods are spread in the digital age.

To that end, the Select Committee on Deliberate Online Falsehoods recommended that the Government set up a national framework to coordinate and guide public education initiatives.

Professor of media and communication Lim Sun Sun from the Singapore University of Technology and Design points out that educators need to go beyond just inculcating media literacy.

Much of the media literacy education today provides people with checklists of telltale signs to assess the credibility of information.

However, purveyors of online disinformation increasingly have access to technologies that fabricate information so convincingly that most people find it very difficult to identify fake news when they see it.

"We need to expand people's knowledge to aspects such as the business models behind online content, online advertising and social media platforms so that they understand how the ecosystem rewards news that is more sensational," Dr Lim says. "As well, we need to explain how individual psychological biases can be exploited to make people more susceptible to online disinformation."

And people are susceptible, even when they think they are not.

A report by market research company Ipsos released last week said that when it asked 750 Singaporeans how confident they were in their ability to distinguish legitimate and accurate news from fake news, false news and alternative facts, 79 per cent said they were at least "somewhat confident".

However, when shown a series of 10 headlines from digital channels and asked to indicate which ones were fake news, 91 per cent incorrectly identified one or more as being real.

The poll also found that 45 per cent of the respondents had been taken in by fake news in the past - they believed a news story was real until they later discovered it was fake.

TECH-SAVVY YOUNG TAKEN IN, TOO

And despite commonly held beliefs that young digital natives may be more discerning about fake news, the survey found that those aged between 15 and 24 are particularly susceptible, with 55 per cent admitting to having fallen for fake news before.

It is not just Singaporeans who lack media literacy skills. The 2018 Reuters Institute Digital News Report, which surveyed news consumers across 37 countries including Singapore, found that while 58 per cent of respondents said they were worried about fake news, only 26 per cent could identify a recent example of a fake news report that they had seen.

The Media Literacy Council, a group spearheading public education on media and digital literacy, plans to do more to make sure people are better equipped with the skills they need to navigate online news and information, says chairman Lock Wai Han.

For example, the council will be stepping up efforts to help people understand "that their cognitive bias can make them an easy target for fake news, as the way social media is designed makes it easier for online users to select only the information that supports and reinforces their ideas", he says.

The ultimate aim is to "make it second nature for Singaporeans to check what they read online and not to spread news that has not been verified".

Dr Adrian Kwek from the Singapore University of Social Sciences believes that schools can infuse lessons on critical thinking into various subjects at every level.

For example, in a Secondary 2 class on statistics, teachers could include content on how true statistics can still be used to mislead.

"Suppose I conduct a poll of 10 people about a controversial topic like abortion. I ask them to rate, on a scale of 0 to 10, whether they think abortion is permissible," he says, adding: "(And the results show) a mean rating of 5, a median rating of 2, and a mode rating of 10. Depending on what my agenda is, I can use the term 'average' to my advantage.

"If I want people to read the news of my poll and believe that the topic is not controversial, I'll say that the average rating is 5. If I want people to believe that the respondents are conservative, I'll say that the average rating is 2. If I want people to believe that the respondents are liberal, I'll say that the average rating is 10."

He goes a step further, saying that critical thinking alone is not enough. Citizens also need to care about truth and society.

While care for truth and society is taught in Character and Citizenship Education, and critical thinking is taught by infusing the skill into academic subjects, students need the two to be explicitly brought together in the context of fake news, he says.

"We need a particular form of critical thinking - that which is motivated by care for the truth and care for society. Without these two motivators, good critical thinkers can make excellent fake news producers."

2. A national plan to counter disinformation campaigns

The starkest example of the danger of fake news is also the best-known one - how Russian operators manipulated social media to disseminate disinformation during the 2016 presidential election in the United States.

In its report, the Select Committee highlighted how foreign operators even created fake social media accounts to organise protests in the US. Singapore is not immune to such threats. In fact, the committee said it received evidence that foreign operators have already carried out disinformation campaigns here and can be expected to continue to do so.

It recommended that the Singapore Government consider what additional measures are needed to safeguard the integrity of elections here. It also said Singapore should have a national strategy to counter state-sponsored disinformation campaigns from abroad.

Dr Shashi Jayakumar, head of the Centre of Excellence for National Security at the S. Rajaratnam School of International Studies, says this is much easier said than done. "The aggressive actor, the one that is trying to carry out the subversion, can take advantage of technology, which is advancing at a breathtaking rate," he says.

Mr Jakub Janda, director at the Prague-based European Values Think-Tank, agrees. "Modern technology, such as deep learning, deep 'fakes' or sophisticated bots, provides effective offensive tools of malign influence for state and non-state actors," he says.

"The offensively minded actors, such as Russia, China, extremist, criminal or terrorist groups, have the advantage because democracies are much slower in responding to this development."

Mr Janda, who submitted his views and research to the Select Committee, says the first step is to assess the threat level here.

"You analyse what is the current form and scope of the threat, project how it can evolve in the next five to 10 years, learn from the experience of other countries and tailor your own national policy framework," he says.

US President Donald Trump (far left) and Russian President Vladimir Putin at the Helsinki summit in July. Russian operators were accused of manipulating social media to disseminate disinformation during the 2016 US presidential election, which Mr Trump won. PHOTO: AGENCE FRANCE-PRESSE

Then, policies and specific action plans can be drawn up.

Sweden is a good example to learn from, he says. It began preparing two years ago for its elections, held earlier this month, by studying elections abroad, especially ones tampered with by Russia.

It established an election task force and trained thousands of public officials on how to safeguard the integrity of its elections, and published an extensive manual on countering influence operations.

In Britain, an interim report by its Digital, Culture, Media and Sport Committee recently made its own list of recommendations on protecting the integrity of its elections.

These included mandating digital imprint requirements for all electronic campaigning, increasing fines for electoral fraud, establishing an advertising code which would apply to social media during election periods and increasing transparency around digital political advertisements.

NOT JUST THE RUSSIANS

Dr Jayakumar notes that the reports from both the British and Singapore committees refer to Russia, but warns: "We shouldn't be overly fixated on Russia.

"The techniques and stratagems pioneered by the Russians for a long time - in Ukraine, the Baltics, now in the US and parts of Europe - are transferable and portable, and can be tried by others.

"(But) they should not give anyone the impression that Russian fake news or subversion is a big trend in Singapore - I suspect it is not."

Instead, he says, it might be "more useful to look at the parts in the Select Committee report about the activities of an unnamed Asian country, presumably one much closer to home".

The report makes mention of "an Asian country" that reportedly has an online army of content creators to promote its government's policies and attack criticisms of those policies, both within and outside that country.

In another part of the report, the committee notes that it received evidence about disinformation operations conducted in Asia allegedly by "an Asian country".

This is an important starting point for further study, Dr Jayakumar says: Who is trying to undermine Singapore's social stability and resilience?

Mr Janda notes: "The strategic threat is how the Chinese government tries to use malign influence tools in the wider Asian region.

"We are effectively in an (information) arms race, and we can see that China is learning from what Russia does in Europe in terms of hostile foreign influence operations, and it is being deployed in Asia - and vice versa."

3. Getting tech firms to shut down info feeds and funding

A major motive driving fake news dissemination is financial profit.

To counter this, the Select Committee recommended the Government implement new laws allowing it to cut off the flow of digital advertising funds to shady operators peddling disinformation.

This will require the cooperation of technology companies, which the committee believes could do more to fight fake news.

It suggested proactive actions they could take, from shutting down inauthentic accounts to increasing the transparency of digital advertising. It also suggested the Government bring in rules compelling tech companies to take steps against fake news.

Among the proposals, one for a "demonetisation" regime could be implemented easily, experts note. Professor Ang Peng Hwa of the Wee Kim Wee School of Communication and Information at Nanyang Technological University says its effectiveness has already been shown in the United States, where Facebook, Apple, Twitter, YouTube and Spotify removed podcasts, pages and channels of fake news peddler Infowars.

Prof Ang notes: "Within weeks of the platforms taking action, traffic to the site halved and it's still declining." In future cases, he says, the same could apply. "Any site that sends out fake news on a regular basis would have its feed cut off. Advertisers would be told not to advertise with them through a name-and-shame move."
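
As a rough sketch of what such a demonetisation regime could look like in practice, the toy Python example below cuts off ad serving for domains that rack up repeated fake news strikes. The domain names, strike counts, threshold and function are assumptions invented for illustration; they do not describe any real ad platform's systems.

    # Hypothetical sketch of demonetisation: stop serving ads on domains
    # flagged repeatedly for publishing deliberate falsehoods.
    STRIKE_THRESHOLD = 3
    strikes = {"fake-news-example.com": 5, "honest-news-example.com": 0}

    def should_serve_ads(domain: str) -> bool:
        """Cut off ad revenue once a domain passes the strike threshold."""
        return strikes.get(domain, 0) < STRIKE_THRESHOLD

    print(should_serve_ads("honest-news-example.com"))  # True - ads served
    print(should_serve_ads("fake-news-example.com"))    # False - funding cut off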

The Select Committee also recommended new laws to require those who financially benefit from fake news to repay the profits they made. "This should cover the 'hired guns' paid by others to create and spread online falsehoods," it said.

Prof Lim says this would mean, for example, a platform such as YouTube withholding payment to a vlogger found to have started and spread deliberate falsehoods.

It is an innovative proposal, she adds. "If you look at legislation around the world, it's had to do with other aspects of regulation, not through demonetisation."

The committee noted that other specific measures tech companies should undertake would vary depending on the platform. For example, on closed messaging platforms such as WhatsApp, Telegram and WeChat, minimising the amplification of an online falsehood may involve prohibiting it from being forwarded.
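
On a platform that can read message content, one conceivable mechanism is a server-side check that refuses to forward anything matching a flagged falsehood. The Python sketch below is purely hypothetical - the flagged-hash set and function are invented for illustration and do not describe how WhatsApp, Telegram or WeChat actually work. (End-to-end encrypted services cannot inspect message content at all, which is partly why WhatsApp has instead experimented with limits on how many chats a message can be forwarded to.)

    import hashlib

    # Hypothetical: hashes of messages that moderators have flagged as
    # deliberate falsehoods.
    FLAGGED_HASHES = {
        hashlib.sha256(b"Miracle cure: drink bleach to beat the flu").hexdigest(),
    }

    def can_forward(message: str) -> bool:
        """Block forwarding when a message matches a flagged falsehood."""
        return hashlib.sha256(message.encode()).hexdigest() not in FLAGGED_HASHES

    print(can_forward("Lunch at 1pm?"))                               # True
    print(can_forward("Miracle cure: drink bleach to beat the flu"))  # False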

However, Assistant Professor Liew Kai Khiun from the Wee Kim Wee School of Communication and Information notes that tech companies face a huge challenge in stamping out falsehoods. "The dissemination and circulation of information is so ubiquitous and voluminous that it would be administratively impossible for tech companies to police the Internet effectively against falsehoods," he says. "For example, how would social media platforms like WhatsApp and Telegram check against the daily circulation of alarmist fake news between users?"

Prof Liew believes any new regulations placed on tech firms should be carefully crafted so they do not cripple Singapore's position as a media and information hub.

During the Select Committee's hearings, representatives from Twitter, Facebook and Google expressed concerns about new rules.

Addressing this at a media briefing last week, committee member Dr Janil Puthucheary said the desired outcome of new laws - that people can trust the engagements that happen online and have some security and confidence in the information they receive - should be in line with what tech companies themselves want to achieve.

"If our objectives and our intent are all aligned to create a trusted space, trusted platforms, citizen engagement, I can't see why the tech companies would not want to find ways to enable that," he said.

"So we have to have a negotiation about what they think is possible and what we think is necessary and I think we can find a way forward."

The tech companies themselves have been stepping up their efforts. A Facebook spokeswoman says that over the last year, it has invested in technology and people to combat false news and disrupt attempts to manipulate civic discourse.

"This includes removing fake accounts, disrupting the financial incentives behind false news, reducing the posts people see that link to low-quality Web pages, partnering with threat intelligence experts, and promoting digital and news literacy," she says. Facebook has also introduced a policy that will remove false news with the potential to lead to offline violence, and introduced more ad transparency, she adds.

Google, meanwhile, notes it has improved its search algorithms and introduced new YouTube features to ensure credible information and authoritative news sources are weighted more heavily.
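
One toy way to picture "weighting authoritative sources more heavily" - emphatically not Google's actual algorithm, whose ranking signals are proprietary - is to scale each result's topical relevance score by a source-credibility weight before sorting. The URLs and scores below are invented for illustration.

    # Toy re-ranking: scale topical relevance by source credibility so that
    # an authoritative report can outrank a more sensational, dubious page.
    results = [
        {"url": "clickbait-example.com/shock-claim",  "relevance": 0.9, "credibility": 0.2},
        {"url": "establishednews-example.com/report", "relevance": 0.7, "credibility": 0.9},
    ]

    for r in sorted(results, key=lambda r: r["relevance"] * r["credibility"], reverse=True):
        print(r["url"])  # the credible source now ranks first (0.63 beats 0.18)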

A version of this article appeared in the print edition of The Sunday Times on September 30, 2018, with the headline "3 key strategies to fight fake news".