Social media platforms can be ordered to remove content, say law students

Lawyer Sui Yi Siong and a group of Singapore Management University undergraduates recommend targeting intermediaries, or social media platforms, as a practical way of dealing with online falsehoods.
(From right) National University of Singapore law undergraduates Er Shengtian, Rachel and Joel Jaryn Yap Shen; Singapore Management University (SMU) students Simran Kaur Sandhu and Gloria Chan Hui En; lawyer Sui Yi Siong; and SMU undergraduate Chen Lixin, speaking on the last day of the Select Committee public hearing on March 29, 2018. PHOTO: GOV.SG

SINGAPORE - Requiring technology and social media platforms to monitor content - and holding them liable if they do not take down offending content in certain cases - is the most feasible way for Singapore to deal with the scourge of online untruths.

On Thursday (March 29), the last day of public hearings held by the Select Committee on deliberate online falsehoods, a lawyer and a group of Singapore Management University (SMU) undergraduates made this recommendation, noting that the "true mischief" of the problem lies in how these untruths can be disseminated to a virtually unlimited audience.

In their written submission, lawyer Sui Yi Siong and SMU undergraduates Mr Lyndon Choo, Ms Chen Lixin and Mr Aaron Yoong outlined three possible models for restricting access to such falsehoods on intermediary platforms.

The first gives intermediaries broad immunity from liability: they are neither held responsible for content spread on their platforms nor required to monitor it. But this is "wholly inappropriate" for Singapore, given the dire consequences of allowing falsehoods to spread, they argued.

The next approach treats such platforms as if they were the content creators, requiring them to remove content that may be unlawful or face penalties. But this is not practical, they argued, as platforms are ill-equipped to judge what content crosses into unlawful territory, and the threat of penalties may lead to self-censorship if platforms err on the side of caution.

This leaves Singapore with a "safe harbour" model that strikes a balance between the two approaches. In clear-cut instances, the platform should remove unlawful content as quickly as possible once users alert it to the material, without waiting for a judicial order.

In other less straightforward instances, the platform should wait for a judicial order on the appropriate course of action. Regulatory bodies like the Info-communications Media Development Authority can also take on the task of regulating unlawful content, if the courts have limited resources, they said.

Asked by Minister for Social and Family Development Desmond Lee about the extent to which national legislation should cover state actors, if they are found to be responsible for spreading falsehoods, Mr Sui said state actors cannot be directly prosecuted.

But he added that through the "safe harbour" approach they have recommended, Singapore can ensure content platforms take down offending posts so that its citizens are not exposed to them.

Another group from SMU which appeared on the same panel suggested that existing laws be amended to tackle the problem, noting a gap in existing legislation.

The definition of falsehoods under the Telecommunications Act can be consolidated, said Ms Simran Kaur Sandhu, Ms Gloria Chan, Mr Daryl Gan and Mr Cheah You Yuan.

Any publication that is "manifestly materially false" and appears to be an "unlawful publication" will be considered a falsehood. Such publications may contain content that incites violence, or promotes enmity between different groups on grounds such as religion or race.

A multi-pronged approach can be considered when it comes to blocking access to deliberate online falsehoods, they said.

Regulation can be introduced to make social media platforms responsible for monitoring and removing false content, or a court or a minister can order a post to be taken down, depending on the severity of the post in question. The minister can make such an order, for example, if a post gravely threatens Singapore's security.

Laws requiring the disclosure of sponsors for content can be considered too, they added.

Often, those behind online falsehoods have vested interests in influencing the public to vote in a certain way, or in discrediting other individuals. Such a law can reveal these motivations and enhance transparency, they said.
