SINGAPORE - Three months after Parliament voted unanimously to form a Select Committee to study the issue of deliberate online falsehoods, the panel completed the public hearings phase of its work last Thursday (March 29).
Over eight days, the committee heard from 65 speakers in public hearings that lasted 50 hours. It also considered 170 written submissions.
It will reconvene next month to deliberate on a report of its findings to Parliament.
A number of clues on what to expect from the committee's report were revealed during the public hearing phase, with committee members showing their thinking through the questions they asked and the themes they focused on.
This week's Insight examines five potential recommendations that could emerge from the Select Committee's report.
1. Defining fake news
A key question asked of the Select Committee at the hearings had to do with how deliberate online falsehoods would be defined.
During the exchanges, committee members hinted at the factors that could be considered in defining it: the deliberate intent of the person or organisation spreading the information; that it appears on an online platform; that it is demonstrably false; and that it has a significant impact, such as affecting national security or racial and religious harmony.
Home Affairs and Law Minister K. Shanmugam stressed several times that the committee was not dealing with opinions, no matter how irresponsible, as long as they are based on a substratum of facts.
The issue of definition was a point of concern for many speakers.
A few said that any new laws to counter deliberate online falsehoods must be clear in their definitions to avoid catching people who make honest mistakes.
Other witnesses suggested that the Government should not be the sole definer of truth, with some saying that independent bodies should define it instead.
Lawyers and law academics at the hearings generally agreed that it was possible to define a deliberate online falsehood.
On this, Mr Shanmugam said falsehoods are already defined in various places in Singapore's current laws, and the courts have "no difficulty identifying what is false and what is true".
National University of Singapore law professor Thio Li-ann agreed, saying: "It is a question of evidence and procedure and certain facts are easier to establish than others, but we have to do it."
2. Enacting new legislation
The issue of legislation was raised every day of the hearings, and it is most likely that a new law will be among the recommendations to counter deliberate online falsehoods.
Several committee members, including Mr Shanmugam, MP Edwin Tong and Nominated MP Chia Yong Yong, said repeatedly that they personally felt new legislation was necessary.
This was also the sentiment of Singapore Management University School of Law dean Goh Yihan, who said that the current legislative tools "run up against limitations - of scope, speed and adaptability".
Many recommendations made during the hearings involved some form of mechanism to quickly remove or block access to online falsehoods, whether by compelling people or technology platforms such as Facebook and Twitter to take down the illegal content, or by blocking access to it.
Some who supported the idea of a new law also said it should be carefully calibrated, so that it is not too sweeping and broad, and does not punish those who inadvertently share falsehoods or crimp the ability of journalists to do their job.
Several activists and content producers running alternative news websites strongly objected to new legislation, suggesting that it will be used to curtail free speech and stifle legitimate dissenting views.
But some speakers made a distinction between deliberate online falsehoods and free speech.
Prof Thio, an expert in constitutional law, said untruths - designed to mislead people, manipulate election results and turn groups against one another - harm society and undermine democracy, and belong to a category of speech that does not warrant protection.
3. Regulating tech companies
At several points over the eight-day hearings, the committee displayed a grotesque cartoon depicting male ethnic-minority migrants abusing a semi-nude white woman and killing her baby, accompanied by the Twitter hashtag "#DeportallMuslims".
It was an image seemingly created to inflame anti-immigrant sentiments around the world. But during a British parliamentary hearing, a Twitter executive said the tech giant would not remove the image as "it was not in breach of its hateful conduct policy".
In Singapore, representatives of tech giants Google, Facebook and Twitter could not answer when Mr Shanmugam used the same example to show that self-policing by the tech firms was insufficient.
With self-regulation appearing to fall short, several speakers tell Insight that the committee is likely to deliberate on ways to regulate social media platforms, which in recent years have become dominant forums for citizen discourse.
These rules could force tech firms to take more active steps, such as flagging disputed content, strengthening their detection of falsehoods, deprioritising unreliable online news sources and removing fake accounts.
Prof Goh tells Insight: "If the platforms provide the service, and they know the impact of their service, then they have a duty to ensure their platform is not misused."
4. Setting up fact-checking mechanisms
Having a fact-checking mechanism was a popular suggestion at the Select Committee hearings, and could be among the measures to be recommended.
While the end goal is to help people verify information, witnesses proposed different ways to do this, including having a fact-checking service, setting up independent fact-checking organisations or supporting fact-checking networks online.
Singapore Press Holdings and broadcaster Channel NewsAsia (CNA) said they were willing to form a fact-checking alliance, which could include not only media companies and industry practitioners, but also other interested parties.
For such an alliance to be seen as a trusted arbiter of truth, it should sit independently from government bodies and commercial entities, they said. It should also have the mandate to identify a deliberate online falsehood.
Defence and strategic studies specialist Michael Raska of the S. Rajaratnam School of International Studies suggested looking at a measure already being implemented in the Czech Republic: instead of monitoring just news content, the independent body would also monitor news sites and track their funding and ties to disinformation networks, so as to expose them to readers.
A concern raised by the committee was whether fact-checking services would be able to address serious falsehoods that have national security or public health implications.
Senior Minister of State for Communications and Information Janil Puthucheary, a committee member, asked if an independent fact-checking body could respond fast enough, suggesting that a take-down may be a necessary first step in such instances.
5. Enhancing media literacy
Including media literacy as part of a multi-pronged approach to counter the spread of online falsehoods was a recurring call by academics, students and civil society alike.
Building such literacy, and urging people not to share information without checking it first, could form part of the long-term measures to counter disinformation, said Mr Benjamin Ang, a senior fellow at the S. Rajaratnam School of International Studies, who was a witness at the hearings.
Singapore Management University law don Eugene Tan noted in his written submission that "there will always be falsehoods deliberately sown; no legislation can, and will, put an end to such activity".
This is why "our first line of defence has to be a population of media-savvy individuals", he said, adding that it would be "too late" to rely on the authorities to step in each time.
Institute of Policy Studies senior research fellow Carol Soon added that there may be a need to enhance what she calls "critical literacy", which encourages people to question content, its sources and the motivations of its creators - an exercise, she said, that helps to boost "immunity" to future disinformation.
Beyond media literacy education, media studies professor Cherian George also pointed to the need for political literacy, which includes core political values such as equal rights, non-discrimination and religious freedom.
Some countries have underestimated the need for ongoing civic education, he said, with people becoming "more susceptible to intolerant populist disinformation".
"If you are talking about long-term intervention, these are the principles that will make Singapore much more resilient against hate propaganda," said Dr George.