Fighting online falsehoods deliberately, but judiciously

The word "truth" featured prominently in the recent Select Committee hearing on deliberate online falsehoods (DOF). I was uncomfortable that it was viewed as a binary imperative.

This came through especially in the session where committee member and Home Affairs and Law Minister K. Shanmugam and historian Thum Ping Tjin debated whether Operation Coldstore was carried out on legitimate, objective grounds or was politically motivated. First, truth is often contested: each asserted his truth as the truth, and those watching will need to come to their own conclusions.

Second, facts are a necessary but insufficient condition for establishing the truth. Context and interpretation of the facts are critical.

Let us take the 6.9 million population projection for 2030 in the Population White Paper of 2013. The strong public reaction was not to the 6.9 million figure per se. People were already feeling the effects of overcrowding in public spaces with a population of 5.1 million, and were upset about the large influx of foreigners. The same 6.9 million figure, presented today, might be more acceptable.

One's perception of a fact can be coloured by context and interpretation.

THE NEED FOR INTERVENTION

DOF is a pressing issue that needs to be addressed with urgency. It can rapidly erode public trust and result in excessive polarisation. If we do not manage DOF, the online space will be de-legitimised as a reliable marketplace of ideas.

The challenge is to strike the right balance between safety and security considerations on the one hand, and people's right to access diverse perspectives and their freedom of expression on the other.

I support the idea of fresh legislation to address the problem. It would fill the gaps that existing legislation cannot address.

That said, legislation should not inadvertently punish the vast majority of Internet users who play by the rules; built-in safeguards should prevent this. Nor should legislation signal that the Government is using it to muzzle critical opinion or to further a narrow political agenda.

A case in point is Malaysia's recently passed Anti-Fake News Bill. The controversial Bill was fast-tracked through Parliament despite fears that it might be used to silence criticism of the party in power in the lead-up to the general election.

THREE KEY CONSIDERATIONS FOR LEGISLATION TO WORK

For legislation to work, several considerations must be taken into account. First, there should be clarity and transparency. Intent and motive need to be established: it is not sufficient for legislation to show that the action was "deliberate"; the determining criterion should be "malicious" intent.

Equally important is the focus on content that is likely to have serious repercussions on national interests, such as security and social cohesion.

To help ensure that the assessment is accurate and consistent, I would recommend a three-step test covering content, intent and impact.

It is useful to take reference from the framework for "information disorder" proposed by the Shorenstein Center on Media, Politics and Public Policy at Harvard Kennedy School.

First, there is misinformation - where false information is shared but no harm is meant. Second, there is disinformation - where false information is knowingly shared to cause harm. Third, there is mal-information - where genuine information is shared to cause harm, often by moving information meant to remain private or confidential into the public sphere. A case in point is WikiLeaks.

Legislation should also demand greater accountability on the part of Internet service providers (ISPs) and especially large online platforms such as Google and Facebook. Going by the swift actions announced by Facebook in recent weeks, it is clear that there is a lot more these companies can do. For starters, they should have a "complaint management procedure".

Second, I support the idea of an independent "fact-checking body". The Government should seriously consider creating an Office of the Internet Ombudsman, or a council, tasked with evaluating cases reported by the public. This body should go beyond fact-checking to assess the veracity of the content, the intent behind it and its impact. It should have the powers of a tribunal to demand information, as well as limited powers to punish.

When a threat to national security or social cohesion is clear, the Government should act directly and decisively, using the new legislation along with existing legislation such as the Sedition Act and Maintenance of Religious Harmony Act.

As the Government can sometimes be an involved party in DOF cases, it is best that an independent third party acts as the first line of defence. This makes the process fairer and more acceptable to the public, and is likely to be less awkward for the Government.

It is important that the fact-checking body serves as a neutral arbiter in both form and substance; it must not only be neutral, but be seen to be neutral.

A challenge would be the possible delay in taking down content that is maliciously false or harmful. In Germany, the Network Enforcement Act draws the distinction between content that is "manifestly unlawful" (to be taken down by the platform within 24 hours) and "merely unlawful" (to be investigated over seven days). Platform owners are responsible for taking down the content.

Third, legislation must be part of a multi-pronged approach. It should not unduly hinder the natural development of a "marketplace of ideas".

It is timely to consider legislation that gives the public greater access to information. (Singapore does not have the equivalent of the United States' Freedom of Information Act.) Freer public access to legitimate information would serve as a bulwark against DOF.

Public education and self-regulation by Internet players should be actively facilitated. There could also be a forum where key stakeholders - the Government, ISPs, non-governmental organisations and prominent content providers - come together to discuss pertinent issues and become part of the solution. The Responsible Gambling Forum is an example.

PUBLIC TRUST

Maintaining public trust in the Government and key institutions is critical to winning the battle against DOF.

Given the ambiguous nature of the problem, due process and fairness must take precedence over speed. We should avoid a binary approach, which would be akin to hastily throwing water on an electrical fire. It is wise to start with a light-touch approach to enforcement, and to give the benefit of the doubt when the evidence is not clear enough.

Regardless of what we do to address the problem of DOF, it is important that the impact of our action on public trust remains a bedrock consideration.


• Viswa Sadasivan, a former Nominated MP, is chief executive of Strategic Moves, a corporate strategy and communication consultancy with a focus on public policy.


A version of this article appeared in the print edition of The Straits Times on April 14, 2018, with the headline Fighting online falsehoods deliberately, but judiciously.