Earlier this month, Facebook gave reporters a sneak preview of its War Room - a command centre that will house a crack team focused on rooting out disinformation, keeping watch on fake news and deleting fake accounts that seek to sway upcoming elections in America and Brazil.
The term "War Room" is apt.
Several legislative hearings on the subject of disinformation around the world this year have shown that the tech giant and its operations have - inadvertently and otherwise - facilitated the spread of fake news, with lethal consequences in some cases.
But more significantly, meddling in foreign elections through the spread of disinformation - in part through Facebook - has become a weapon for countries that seek to target neighbours large and small.
And governments around the world are working to raise their defences against fake news.
The matter deserves to be taken as seriously as preparing one's physical armies for battle.
The report of Singapore's parliamentary Select Committee on deliberate online falsehoods earlier this month noted how various national security experts who gave evidence said that disinformation operations are "persistent and permanent".
In other words, expect such falsehoods to be spread even in the absence of an open conflict.
In fact, they can work on "slow burn" issues that prove as pernicious, if not more so, because these operations are hard to detect.
An isolated encounter between individuals from different ethnic groups, or between a local resident and a new immigrant, can easily be magnified and blown out of proportion on social media, to generate the impression of deep-seated hostility.
Such sentiments have fanned riots elsewhere in Asia and Europe.
Other tools used by foreign states or players include mobilising different sections of the population, infiltrating local non-governmental organisations, bribing or paying off politicians, and staging cyber attacks.
Singapore, too, has been the subject of foreign, state-sponsored disinformation operations, a security agency told the Select Committee at a confidential briefing. Indicators include a state using news articles and social media to influence the minds of segments of the population in Singapore, and to legitimise that state's actions globally.
Such weaponisation of information - which has occurred across Europe and Asia - seeks to widen fault lines in a society and destabilise it. It also seeks to chip away at trust in public institutions, from established mainstream political parties and civil society groups to mainstream media.
And people are especially vulnerable: research shows that deliberate online falsehoods have a clear edge over facts in influencing people across all educational backgrounds.
This is why the Select Committee, and others, found the phenomenon of fake news a real and present danger to society.
ON HUMAN NATURE
Human nature appears to be key in understanding the effectiveness of disinformation. A paper published in the journal Science by three Massachusetts Institute of Technology (MIT) researchers in March suggests that falsehoods tend to spread much faster and more widely on Twitter than the truth does.
Researchers Soroush Vosoughi, Deb Roy and Sinan Aral looked at around 126,000 news stories that were tweeted by some three million people more than 4.5 million times, and which had been investigated and verified as true or false by fact-checking organisations.
False news also had wider reach: The top 1 per cent of false news posts spread to between 1,000 and 100,000 people, whereas truthful posts rarely diffused to more than 1,000 people.
"Falsehood diffused significantly farther, faster, deeper and more broadly than the truth in all categories of information, and the effects were more pronounced for false political news than for false news about terrorism, natural disasters, science, urban legends, or financial information," the authors wrote.
"We found that false news was more novel than true news, which suggests that people were more likely to share novel information. Whereas false stories inspired fear, disgust and surprise in replies, true stories inspired anticipation, sadness, joy, and trust."
"Contrary to conventional wisdom, robots accelerated the spread of true and false news at the same rate, implying that false news spreads more than the truth because humans, not robots, are more likely to spread it."
This finding helps explain why people tend to be susceptible to foreign disinformation attempts: it plays on the emotions and is often sensational, whereas the truth may be staid and factual.
But such disinformation would arguably not be as effective in the absence of underlying grievances or tensions that, while minor, could be tapped and exploited.
In an article in the latest issue of MIT Technology Review, techno-sociologist Zeynep Tufekci, who studies the social implications of emerging technologies on politics, was more direct about the role of digital technology in politics in an age of disinformation.
"The Russian government may have used online platforms to remotely meddle in US elections, but Russia did not create the conditions of social distrust, weak institutions and detached elites that made the US vulnerable to that kind of meddling," she writes of the ongoing investigation into Russian influence in the 2016 United States election, when Mr Donald Trump was elected president.
"Even the free-for-all environment in which these digital platforms have operated for so long can be seen as a symptom of the broader problem, a world in which the powerful have few restraints on their actions while everyone else gets squeezed. Real wages in the US and Europe are stuck and have been for decades while corporate profits have stayed high and taxes on the rich have fallen. Young people juggle multiple, often mediocre, jobs, yet find it increasingly hard to take the traditional wealth-building step of buying their own home - unless they already come from privilege and inherit large sums.
"If digital connectivity provided the spark, it ignited because the kindling was already everywhere."
How, then, can societies combat this scourge of fake news?
In Singapore's case, the Select Committee has recommended a suite of measures - from public education and reinforcing social cohesion and trust, to promoting fact-checking and measures to decisively disrupt such falsehoods, as well as to deal with threats to national security arising from disinformation.
But given that tech platforms have their limitations - even as they seek to get their act together - deeper solutions to fight falsehoods must be found.
Where outside forces are likely to use racial issues or cite the class divide to sow discord, then the challenge must lie in strengthening social mobility and a sense of equity across society.
And where individuals may be predisposed to believe only comments or research that square with their existing views - the recent debate over whether to repeal or retain Section 377A of the Penal Code, which criminalises gay sex, is a case in point - then perhaps greater exposure to a range of views is needed.
This is where mainstream media, schools and others can play a role in ensuring there is robust but also respectful debate on contentious issues.
It is a sad development that technology has been manipulated, muddying politics worldwide, sowing confusion and discord, swaying voters and dividing societies.
But we must not lose faith in society's ability to also turn technology to its advantage and ensure that the first line of defence against fake news is strengthened.