Sceptical about new media? Good for you

If your social media news feed was clogged with outraged posts after United States President-elect Donald Trump won the election, remind yourself that it is a filtered feed.

Call it a "spoon-feed".

We delete what we don't want to see, we "like" what we like. The power of personalisation is great, but as we scrub our online worlds clean of annoying posts and people, we pay a price: less diversity.

This is especially so since social media and browsers press their ears to the door of our data, and use it to tailor-make news that we are likely to like, like, like…

Social media feeds - which may contain news items from media companies - are not offering us news the way legacy media does. But they have come to rival legacy media as a source of news.

Even if some say otherwise.


Facebook chief executive Mark Zuckerberg said it is "odd" to call his company a media company, after criticism that fake and inaccurate news on the social media site may have helped decide the shock outcome of the US election.

He wrote: "Remember that Facebook is mostly about helping people stay connected with friends and family. News and media are not the primary things people do on Facebook, so I find it odd when people insist we call ourselves a news or media company in order to acknowledge its importance."

So let's call social media companies whatever will remind us of the filtering they do: The Echo Chronicle or The Ownself Post.

Last week, Facebook and Google said they would no longer allow fake news sites to use the tech giants' ad-selling services. But, even if fake news fades into the background, are we still in echo chambers, with uncomfortable truths silenced to the detriment of healthy debate? Is just being aware that we are floating in filter bubbles enough for us to see the world more clearly? Will it help stop us from becoming pawns of populists during elections?

German Chancellor Angela Merkel, who may run for re-election next year, said recently that the way search engines and social networks like Google and Facebook choose what people see online should be made public. She believes that the secrecy around the algorithms used by online platforms threatens open debate.


Algorithms are the sets of rules a search engine uses to process a request for information. They differ from one search engine to the next, and they determine how web pages are ranked.
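The filtering at issue can be sketched with a toy example. The code below is a hypothetical illustration only, not any platform's actual ranking algorithm: it simply scores each post by how much it overlaps with topics a user has already "liked", so agreeable posts float to the top.

```python
# Toy sketch (hypothetical, not any platform's real code) of an
# engagement-based feed ranker: posts matching the user's past
# "likes" are scored higher and surface first.

def rank_feed(posts, liked_topics):
    """Sort posts so those overlapping the user's liked topics come first."""
    def score(post):
        # Count how many of the post's topics the user has liked before.
        return sum(1 for topic in post["topics"] if topic in liked_topics)
    return sorted(posts, key=score, reverse=True)

posts = [
    {"headline": "Candidate A scandal deepens", "topics": {"candidate_a", "scandal"}},
    {"headline": "Candidate B scandal deepens", "topics": {"candidate_b", "scandal"}},
    {"headline": "Weather: sunny weekend ahead", "topics": {"weather"}},
]

# A reader who only ever "likes" stories about Candidate A's flaws:
feed = rank_feed(posts, liked_topics={"candidate_a"})
print([p["headline"] for p in feed])
# → ['Candidate A scandal deepens', 'Candidate B scandal deepens',
#    'Weather: sunny weekend ahead']
```

Even this crude rule already narrows the view: the rival's scandal and the neutral story sink down the feed, and whatever the reader engages with next only reinforces the scores.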

The Guardian said that Germany's established parties share the concern about seeing a repeat in their country of the narrow debate which affected the US presidential campaign. A cross-party working group is compiling recommendations urging more openness by Internet platforms, including giving details of how algorithms collect, choose, evaluate and present information to users. Its recommendations will be sent to Brussels, for the European Union digital commissioner to work them into guidelines by next year.

Filter bubbles seem harmless when we think we can see through them and pop them by listening to the other side. But, if we use online tools for many bits of our lives, we might already be in too deep.

A recent article by The Economist's 1843 magazine said: "The news we see, the friends we hear from, the jobs we hear about, the restaurants we consider, even our potential romantic partners - all of them are, increasingly, filtered through a few widespread apps, each of which comes with a menu of options. That gives the menu designer enormous power. As any restaurateur, croupier or marketer can tell you, options can be tilted to influence choices."

Having information tailored for us makes us feel in control, but it seems that we are, at the same time, giving up control over what we eat, and whom we might love and fight for. And does it nudge us towards being unable to tolerate a contrary view?

Over a meal of prawn noodles one day, I brought up the topic of the US election, only to find that the other person knew all about then-candidate Hillary Clinton's dark side, but had almost no idea about Mr Trump's problems - rants about Muslims, bankruptcies, and so on. When I listed a few of them, the person looked at me as if my head had turned into a prawn. If Mr Trump were sitting nearby, he would have leaned towards a metaphorical microphone and said: "Liar."

Social media news feeds can give each person such different streams of information. Call it an "I'm right, you're wrong" feed.

American Internet activist Eli Pariser said of the US media scene: "In 1915, it's not like newspapers were sweating a lot about their civic responsibilities. Then, people noticed that they were doing something really important. That, in fact, you couldn't have a functioning democracy if citizens didn't get a good flow of information, that the newspapers were critical because they were acting as the filter, and then journalistic ethics developed… Now, we're kind of back in 1915 on the Web. And we need the new gatekeepers to encode that kind of responsibility into the code that they're writing."

The fight against fake news is a positive step, and we would like to find out more about those algorithms, please. While we wait for the kind of responsibility that Mr Pariser speaks of to develop, we could do something for ourselves first. If you are already sceptical about whether legacy media here is telling you the truth, good for you. Take that healthy scepticism and apply it to new media too.

Get good things from dear old Google and enjoy the news feed on friendly Facebook - just remember they're giving you what you want.

Call it a "self feed".


A version of this article appeared in the print edition of The Sunday Times on November 20, 2016, with the headline 'Sceptical about new media? Good for you'.