7 themes from 8 days of public hearings on deliberate online falsehoods

Charles Chong, chairman of the Select Committee on deliberate online falsehoods, summarised some of the themes raised by witnesses over the eight days of public hearings.

SINGAPORE - A high-level parliamentary committee looking into how Singapore can thwart deliberate online untruths wrapped up on Thursday (March 29), after a record eight days of public hearings.

The 10-member Select Committee on deliberate online falsehoods, made up of ministers and MPs, heard from around 70 speakers at the hearings, which started on March 14.

The speakers included Singapore students and academics, as well as overseas academics and representatives of media and technology companies such as Facebook and Google. Some travelled here from countries such as Indonesia, Germany and Ukraine.

The exchanges got somewhat testy in a few cases, notably when Home Affairs and Law Minister K. Shanmugam last week grilled Mr Simon Milner, Facebook's vice-president of public policy for Asia-Pacific, over data breaches affecting millions of its users. Tuesday's five-hour session involving civil society activists also grew tense at times.

Here are the main themes that have emerged:

1. Difficulty in defining online falsehoods

When disinformation contains a grain of truth, it becomes extremely tough to define what is hyperpartisan, what is unbalanced or incorrect, and what is deliberately false, a senior fellow at an American think-tank said on the second day of hearings.

In a debate with Mr Shanmugam, Dr Ben Nimmo said that if a legal approach is considered, "just the preamble to your legislation is going to be the size of the Oxford English Dictionary".

Last week, representatives of the Singapore Press Club and Singapore Corporate Counsel Association also highlighted the need for a clear understanding of the definitions of deliberate untruths if new laws are drafted.

"There will be a lot of grey areas," said Singapore Press Club president Patrick Daniel. "I would submit that in current legislation, and in current codes of practice it is not so clear."

Committee member Janil Puthucheary put to the groups that factors such as intent, whether the information can be proven false, and its impact on society could help in defining disinformation.

While the two groups agreed that there is a spectrum of falsehoods, some of which ought to be stopped, they added that there is a need to deal fairly with genuine errors and allow space for satire and parody as part of media and creative work.

But National University of Singapore law professor Thio Li-ann said a legal definition of deliberate online falsehoods is possible. She added that it is a question of procedure, although certain facts are easier to establish than others.

Meanwhile, Dr Norman Vasu of the S. Rajaratnam School of International Studies (RSIS) said the question of what should be defined as true or false should not be left in the hands of a select few.

2. Online falsehoods as a national security threat

At the first public hearing on Wednesday (March 14), experts warned that foreign actors seeking to attack Singapore could use disinformation campaigns instead of military means.

Conflicts can be won without a single bullet, said RSIS defence specialist Michael Raska.

Disinformation can be used instead to intrude on another nation's sovereignty, he added.

His RSIS colleague, Dr Gulizar Haciyakupoglu, in turn warned of signs in recent months that information warfare against Singapore is already under way, with an unnamed country trying to influence minds and legitimise its actions by putting out its narrative through news articles and social media.

She was one of two experts who testified in closed-door hearings.

The other, Dr Damien Cheong, also a national security expert with RSIS, said the goal of a state-sponsored disinformation campaign is to destabilise government and society.

He added that Singaporeans could share untruths without malicious intent and that neighbouring countries such as Malaysia have cyber armies that could be deployed against the Republic directly or as proxies for other nations.

Another recurring theme in the hearings was that Singapore is particularly vulnerable to deliberate falsehoods, which capitalise on existing fault lines in society.

Associate Professor Kevin Limonier of the French Institute of Geopolitics at the University of Paris 8 noted that there would not be any "vectors" passing on false messages if there were no local audience for them - a weakness that external actors exploit.

Dr Shashi Jayakumar, who is head of RSIS' Centre of Excellence for National Security, warned that Singapore could also be a "sandbox for subversion" due to its Smart Nation push and high social media penetration.

Dr Jayakumar also stressed that active human agency must accompany any online counter-measures.

But Singapore Management University (SMU) law don Eugene Tan sounded an optimistic note, saying that Singapore has not been badly affected by falsehoods spread locally, which may reflect some level of internal resilience. He proposed that more research be done to understand the ecosystem here.

3. Will countermeasures stifle free speech?

A common concern raised was whether measures against online falsehoods would impinge on free speech, causing a chilling effect on society.

Mr Jakub Janda, head of the Kremlin Watch Programme in Prague, cautioned that government action to define, investigate and take down information could be seen as curbs on free speech.

Similarly, Dr Nimmo added that he is "very wary of any legislative proposal... because the precedent for countries which are hostile to democracy would be very, very alarming".

Members of civil society here urged against measures that would further restrict the freedom of expression.

Freelance journalist Kirsten Han said there are already laws dealing with certain forms of speech here, including those that wound the religious feelings of others.

At Tuesday's session, Ms Han and former The Online Citizen editor Howard Lee also debated with Mr Shanmugam over what is true, and who gets to define it.

Human rights group Maruah warned that new laws to counter falsehoods could be used to stifle legitimate, dissenting views.

However, Prof Thio and Mr Gaurav Keerthi, founder of debating websites Dialectic.sg and Confirm.sg, cast doubt on whether a marketplace of ideas, in which the truth emerges from a fair contest where everyone speaks his mind without constraint, can truly exist on social media today.

Mr Keerthi said the social media space is a "poor proxy" for such a marketplace, as information is not spread transparently or according to how true it is, but is manipulated by algorithms.

Prof Thio noted that the spread of disinformation impedes public debate and destroys the very reason for free speech itself.

4. Are there gaps in the law?

Over the course of the hearings, various academics pointed out gaps in existing laws for dealing with online untruths.

SMU law dean Goh Yihan, whose views were cited regularly by committee members at the hearings, said existing laws such as the Sedition Act and the Telecommunications Act are limited in scope, speed and adaptability in dealing with deliberate online falsehoods.

For example, an offence under the Telecommunications Act does not apply where a person is unaware that a message is fabricated, and the penalty is targeted at the offender rather than the falsehood itself. This means the disinformation may remain online while prosecution is ongoing.

Institute of Policy Studies senior research fellow Carol Soon said in turn that "in view of the changing online environment, it is timely for the Government to review existing legal provisions and strengthen them".

Legislation must safeguard against harmful content, and current laws do not sufficiently address the virality of online falsehoods, she added.

While technology companies said new legislation was not needed here, Mr Jeff Paine, managing director of the Asia Internet Coalition, conceded under questioning that there could be gaps in Singapore's laws that prevent quick action from being taken against online falsehoods.

5. Beware the backfire effect

If new laws are needed to counter online falsehoods, these should acknowledge unintended consequences and avoid overreach into legitimate speech, said media studies professor Cherian George of Hong Kong Baptist University. He also called for the repeal of Singapore's insult law, Section 298 of the Penal Code.

"The law, no matter how well written, may have limited impact because of the threat of backfire," he added.

He cautioned that legislation which prohibits the wounding of feelings may instead be used as a weapon by the most intolerant groups against moderates.

Prof George also noted that the law is sometimes the appropriate first resort, as in clear cases of incitement.

If a politician or preacher stands up and says that Singapore has no room for a particular community, it is not the time to distribute media literacy leaflets, he said, adding that one should "throw the book at him".

Freelance journalist Kirsten Han echoed similar worries about a "backfire effect": people may see the removal of a post as oppression and censorship, causing a situation to become more inflamed.

6. What role should tech companies play?

Technology companies, such as social media platforms, should play a more proactive role in detecting and removing online falsehoods, said many over the course of hearings.

Some experts pointed to the growing importance of social media in military campaigns.

Noting that it is easy to create the impression online that an opinion is popular, Mr Morteza Shahrezaye of the Technical University of Munich called on social media firms to be more transparent about the goals of their algorithms, which determine which posts are spread more widely.

Governments should step in to hold platforms responsible for content that they allow to spread after it has been found to be false, he added.

But some, like Professor Hany Farid of Dartmouth College, said there has been a "pattern of denial and inaction" from social media firms in tackling inappropriate content on their platforms.


Appearing before the Select Committee, technology firms Facebook, Twitter and Google argued that they have adopted measures to tackle falsehoods.

They have invested heavily in technology and schemes, developing algorithms that can flag less trustworthy content and prioritise authoritative sources. They also have partnerships with non-profit organisations that help identify and take down offensive material.

Yet, pointing to the way Facebook handled a major data breach, Mr Shanmugam questioned whether the social network could be trusted to cooperate in the fight against online falsehoods.

He questioned Facebook's Mr Milner on how the platform had known since 2015 about a data breach affecting more than 50 million of its users, but did not admit to it until earlier this month.

Quizzed by Mr Shanmugam on the fourth day of public hearings, Mr Milner conceded that Facebook could have done better with its users' data.

In the breach in 2014, personal data of these Facebook users were inappropriately obtained and shared with British political consultancy Cambridge Analytica.

7. Are there non-legislative measures we can consider?

From arming the public with critical thinking and media literacy skills, to having fact-checking groups to help debunk falsehoods, a variety of non-legislative measures to counter fake news were suggested.

Associate Professor Alton Chua of Nanyang Technological University called for the National Education curriculum to be expanded, to cover the moral, legal and social implications of falsehoods. He also suggested supporting and growing fact-checking online communities.

Media company Singapore Press Holdings and broadcaster Channel NewsAsia proposed having an independent fact-checking body to identify deliberate online falsehoods and recommend appropriate remedial actions.

SPH English/Malay/Tamil Media Group editor Warren Fernandez added that any legislation introduced to counter online falsehoods should not be so broad and sweeping that it chills the legitimate sharing of information and endangers the work of journalists.

RSIS senior fellow Benjamin Ang in turn suggested setting up an independent body of non-governmental experts who could help assess whether identified falsehoods are part of a larger disinformation operation, which would call for a strategic rather than reactive response.

Meanwhile, civil society representatives suggested improving media literacy and political education, as well as having a Freedom of Information Act.
