Facebook not sole culprit of filter bubbles - users at fault too

Following the shock results of Brexit and the Trump victory, a lot of attention has focused on the role that Facebook might have played in creating online political ghettos in which false news can easily spread.

Facebook now has serious political influence, thanks to its development from a social networking tool into a primary source of news and opinions. And for many, the way it manages this influence is in need of greater scrutiny. But to put the blame solely on the company is to overlook how people use the site and how they themselves create a filter-bubble effect through their actions.

Much of this debate has focused on the design of Facebook itself. The site's personalisation algorithm, which is programmed to create a positive user experience, feeds people what they want. This creates what the chief executive of viral content site Upworthy, Mr Eli Pariser, calls "filter bubbles", which supposedly shield users from views they disagree with.
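To make the mechanism concrete, here is a deliberately simplified, hypothetical sketch - not Facebook's actual system - of how an engagement-driven ranker can narrow a feed over time. It assumes the feed scores each post by how often the user has previously engaged with its topic; the names (rank_feed, click_history) and sample data are invented for illustration.

    # Hypothetical sketch of an engagement-driven feed ranker.
    # All names and data are illustrative assumptions, not Facebook's code.
    from collections import Counter

    def rank_feed(posts, click_history):
        """Score posts by how often the user engaged with each topic before."""
        topic_counts = Counter(click_history)      # past engagement per topic
        return sorted(posts,
                      key=lambda p: topic_counts[p["topic"]],
                      reverse=True)                # most-engaged topics first

    posts = [
        {"id": 1, "topic": "politics_left"},
        {"id": 2, "topic": "politics_right"},
        {"id": 3, "topic": "cats"},
    ]

    # A user who has mostly clicked one kind of content...
    history = ["politics_left", "politics_left", "cats"]

    for post in rank_feed(posts, history):
        print(post)
    # politics_left rises to the top and politics_right sinks to the bottom;
    # each further click on the top items skews the next ranking even more.

The feedback loop in the final comment is the "filter bubble" in miniature: the more the ranker rewards past engagement, the less often disagreeable content surfaces at all.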

People are increasingly turning to Facebook for their news - 44 per cent of American adults now report getting news from the site - and fake news is not editorially weeded out. This means that misinformation can spread easily and quickly, hampering people's chances of making informed decisions.

Over the past few weeks, there have been frequent calls for Facebook to address this issue. United States President Barack Obama himself has weighed in, warning of the perils that rampant misinformation poses for the democratic process.

But much of the debate around this has had an element of technological determinism to it, suggesting that users of Facebook are at the mercy of the algorithm. In fact, our research shows that the actions of users themselves are still a very important element in the way that Facebook gets used.

Our research has been looking specifically at how people's actions create the context of the space in which they communicate. Just as important as the algorithm is how people use the site and shape it around their own communication. We have found that most users overwhelmingly view Facebook as unsuited to political debate, and feel that posts and interactions should be kept trivial and light-hearted.

This is not to say that people do not express political opinions on Facebook. But many people are reluctant to engage in discussion, sensing that anything contentious is better handled by face-to-face conversation. People report fearing that the online context will lead to misunderstandings, because written communication lacks some of the non-linguistic cues of spoken communication, such as tone of voice and facial expressions.

There is strong evidence in our research that people are actually exposed to a great deal of diversity through Facebook. This is because their network includes people from all parts of their life, a finding that echoes other research. In this respect, the algorithm does not have a marked influence on the creation of filter bubbles. But because people often want to avoid conflict, they report ignoring or blocking posts, or even "unfriending" people, when confronted with views they strongly disagree with.

They also report taking care over what they say so as not to antagonise people - such as family members or work colleagues - whose views differ from theirs, but whose friendship they wish to maintain. And finally, they talk of making a particular effort to put forward a positive persona on social media, which again stops them from engaging in debate that might lead to argument.

NOT SO EASY TO FIX

The idea that algorithms are responsible for filter bubbles suggests it should be easy to fix (by getting rid of the algorithms), which makes it an appealing explanation. But this perspective ignores the part played by users themselves, who effectively create their own filter bubbles by withdrawing from political discussions and hiding opinions they disagree with.
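To see how far users' own choices can go, consider a second hypothetical sketch. It applies no ranking algorithm at all - only the "hide" and "unfriend" decisions our interviewees describe. The names and sample data are again invented for illustration.

    # Hypothetical sketch of the user-driven side of the bubble: a plain
    # chronological feed filtered only by the user's own choices.
    # All names and data are illustrative assumptions.

    def visible_feed(posts, hidden_topics, unfriended):
        """Apply only the user's own choices - no ranking algorithm at all."""
        return [p for p in posts
                if p["topic"] not in hidden_topics   # topics the user hid
                and p["author"] not in unfriended]   # people the user removed

    posts = [
        {"author": "alice", "topic": "politics_right"},
        {"author": "bob",   "topic": "politics_left"},
        {"author": "carol", "topic": "holiday_photos"},
    ]

    # Choices made to avoid conflict, of the kind interviewees report:
    hidden_topics = {"politics_right"}
    unfriended = {"bob"}

    print(visible_feed(posts, hidden_topics, unfriended))
    # Only carol's post remains: political diversity has disappeared
    # without any algorithm being involved.

The point of the sketch is that even a perfectly neutral feed ends up homogeneous once enough disagreeable sources have been hidden or removed by hand.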

This is not done with the intention of sifting out diversity but is instead because of a complex mix of factors. These include the perceived purpose of Facebook, how users want to present themselves in an effectively public forum, and how responsible they feel for the diverse ties that make up their online network.

The fact that manipulation by the algorithm is not the only issue here means that other solutions - for example, raising people's awareness of the possible consequences that their online actions have - can help encourage debate. We have to recognise that the impact of technology comes not just from the innovations themselves but also from how we use them, and that solutions have to come from us as well.

• Philip Seargeant is senior lecturer in applied linguistics and Caroline Tagg is lecturer in applied linguistics and English language at The Open University. This article first appeared in theconversation.com, a website of analysis from academics and researchers.


A version of this article appeared in the print edition of The Sunday Times on December 11, 2016, with the headline "Facebook not sole culprit of filter bubbles - users at fault too".