Real threat is not online falsehood but selective news consumption, says European lawyer

European lawyer and IT expert Dan Shefet speaking at the Select Committee on deliberate online falsehoods, on March 28, 2018. PHOTO: GOV.SG
The real threat to democracy is selective news consumption, not online falsehoods, said Mr Dan Shefet, a lawyer and IT expert.

SINGAPORE - The real threat to democracy is not deliberate online falsehoods, but selective news consumption, a European lawyer and IT expert said on Wednesday (March 28).

"The overarching problem is that the news I get is not the news you get and so on," Mr Dan Shefet noted in his written submission to the Select Committee on deliberate online falsehoods.

Algorithms used by technology companies such as Google and Facebook track users' online activities and preferences to sort them into groups for targeted advertising, he noted. These algorithms also determine what articles and sources show up on a person's news feed, which in turn creates bubbles and echo chambers.
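The bubble effect Mr Shefet describes can be illustrated with a toy feed-ranking sketch. This is purely hypothetical and not the actual (proprietary) algorithms used by Google or Facebook; the function names and scoring rule are invented for illustration, assuming only that past clicks are used to rank what a user sees next.

```python
# Illustrative sketch of preference-based feed ranking (hypothetical;
# not the real Google/Facebook algorithms, which are proprietary).
from collections import Counter

def build_profile(click_history):
    """Count topic clicks to infer a user's inferred preferences."""
    return Counter(click_history)

def rank_feed(articles, profile):
    """Score each article by how often the user clicked its topic,
    so past preferences dominate what is shown next."""
    return sorted(articles, key=lambda a: profile[a["topic"]], reverse=True)

articles = [
    {"title": "Budget debate", "topic": "politics"},
    {"title": "Match report", "topic": "sport"},
    {"title": "New phone review", "topic": "tech"},
]

# A sport-heavy click history pushes sport stories to the top of the
# feed, so the user sees ever more of what they already read - the
# "bubble" the committee was warned about.
profile = build_profile(["sport", "sport", "tech"])
feed = rank_feed(articles, profile)
```

Even this crude loop shows the feedback effect: two users with different click histories would receive the same three articles in different orders, so "the news I get is not the news you get".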

"That is very dangerous in a democracy. Whether it's fake or not is not the point. We don't have the same information," he warned the committee. "This amounts to psychological manipulation."

Mr Shefet, who is also a consultant to Unesco, said this problem should be addressed in the run-up to major political events such as an election.

"It's a question of suspending the echo chambers for five weeks before an election so everyone has the same news," he said.

Mr Shefet noted in his submission that some may say this is like making it compulsory for all newsstands "to carry the same newspapers all over the country prior to elections", but he did not give more details on how this could be implemented.


He also stressed that it was important for governments to regulate tech companies and the online space, perhaps by appointing an Internet ombudsman who could offer guidance to Internet platform providers and search engine operators on what is unacceptable content.

Tech companies, when asked to better police the content on their platforms, often argue that they are infrastructure providers and not content producers, and are not in the position to judge what should be taken down, Mr Shefet noted.

"So what we need here is an institution that will give guidance to the social media companies and search engines and so on. And that can be done very quickly if the guidance is not a legally binding decision. That judicial oversight could be almost real time."

Mr Shefet also raised another concern: behavioural psychology research which, in the wrong hands, could be used to manipulate people on a large scale.

He cited ongoing research being conducted by Stanford University on the 60 "persuasion points" that each person is vulnerable to.

"If I had the data, I could persuade you to do anything I want you to do. It's dangerous to have this kind of science, this kind of research, if it falls in the wrong hands, as we saw with Cambridge Analytica," he said.

Mr Shefet was referring to how the British data-mining firm was reported to have used Facebook to gather the private information of tens of millions of Americans, to help United States President Donald Trump during the 2016 US presidential election.
