No evidence to show that online falsehoods can change people's political opinions: Expert

Mr Morteza Shahrezaye of the Bavarian School of Public Policy, Technical University of Munich, said there is no proof that online falsehoods can change people's political views, though these can be used to manipulate political agendas.
PHOTO: GOV.SG

SINGAPORE - There is no concrete evidence to prove that online falsehoods can change people's political views, though these can be used to manipulate political agendas by creating false impressions, said a political data science researcher based in Germany.

Despite such efforts to change people's political opinions, there is no proof that they work, as far as he knows, Mr Morteza Shahrezaye of the Bavarian School of Public Policy, Technical University of Munich, told the parliamentary Select Committee on fake news on Friday (March 16).

"And there's even evidence against it," he said.

He noted that social media platforms like Facebook and Twitter have the means to detect and put a stop to fake news spread by automated bot networks.

In a joint submission by Mr Shahrezaye and his university colleague, Associate Professor Simon Hegelich, they argued that fears of orchestrated attempts to transform political opinion on social networks are "exaggerated".

"It is very unlikely that anyone is changing his or her mind on important political issues just because of some suspicious accounts in social media," they wrote.

Mr Shahrezaye disagreed with Select Committee member Edwin Tong, who cited an earlier speaker's view that it was difficult for a person to tell a real social media account from a fake automated one.

"I can kind of claim that a simple user with bare eyes can detect if a user is a bot or not," said the researcher, adding that an account that issues 10,000 Tweets in a day, and the same Tweet at that, will raise suspicions.

But Mr Shahrezaye acknowledged there have been cases where the political agenda has been influenced by online fabrications.

It is easy to create the impression that an opinion is very popular online, he noted. "If someone, with the help of automation or the help of others, is systematically engaging in a political discussion with the goal to raise the amount of posts, hashtags, likes (and so on), the debate becomes relatively popular."

He added: "Journalists, politicians or normal citizens might fall for these wrong trends and comment on them, thereby making them even more popular."

The two Technical University of Munich researchers cited how right-wing groups in Germany had tried to create a negative trend on social media during the refugee crisis in the country that started in 2015.

They said in their written submissions that it remains inconclusive, however, "whether any politician fell for it" and tried to pander to this negative mood online.

Asked about this by Mr Tong, who is also MP for Marine Parade GRC, Mr Shahrezaye said that while neutral parties may be influenced by the anti-refugee sentiment online, he would not consider this to be a manipulation of public opinion.

But he agreed with Mr Tong's suggestion that the "mischief" behind this trend lies in how it could lead people like journalists and politicians to make "bad decisions" based on wrong impressions of what they come across online. "There was influence (in the sense) that the political agenda was really influenced by social media activity," said Mr Shahrezaye.

He also cited a study by Harvard University's Berkman Klein Center for Internet and Society on the impact of social media on the 2016 United States presidential election. It found that just two of the top 100 most shared election stories on Twitter and Facebook were fake news stories, and that these did not have a significant impact in swaying opinions, he noted.

Mr Shahrezaye suggested that social media platforms and companies can be more responsible and transparent by explaining to both governments and the public the goal of their algorithms, which determine which posts become more popular or visible.

To this, Law and Home Affairs Minister K. Shanmugam, also a committee member, smiled and said the goal of such companies is to maximise commercial profitability, calling it a classic case where commercial interests may conflict with the public interest.

He added:"And we will then have to decide what is in the public interest and see how their commercial interests and the public interest can be coincided, right? That's the task of every government."

Mr Shahrezaye concurred.
