
Fake news, filter bubbles and Facebook

Updated Wednesday, 4 January 2017
Who's to blame for the rapid spread of fake news on the social media site Facebook?


Following the shock results of the Brexit referendum and Donald Trump's victory in the 2016 US presidential election, a lot of attention has focused on the role that the social media site Facebook might have played in creating online political ghettos in which fake news can easily spread. Facebook now has serious political influence, given its development from a place used mainly for making social connections into one where news and opinions are consumed and shared. And for many people, the way it manages this influence is in need of greater scrutiny. But to put the blame solely on the company is to overlook how people use the site, how they themselves create a filter bubble effect through their actions, and how education, not regulation, can help combat the issue.

What are ‘filter bubbles’ and who’s to blame?

Much of the debate around Facebook’s influence on modern politics has focused on the design of the site itself. Facebook’s personalisation algorithm, which is programmed to create a positive user experience, feeds people what they want, creating what Eli Pariser calls ‘filter bubbles’, which supposedly shield users from views they disagree with. With people increasingly turning to Facebook for their news – 44% of US adults now report getting news from the site – and with no editorial process to weed out fake news, misinformation can spread easily and quickly, hampering people’s chances of making informed decisions.
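To make this mechanism concrete, below is a minimal, hypothetical sketch (in Python) of how an engagement-driven ranking loop can narrow a feed over time. Every name and number here is an illustrative assumption, not a description of Facebook’s actual system:

    # Illustrative only: a toy model of engagement-based feed ranking.
    # Each click on a topic raises that topic's rank in later sessions,
    # so early preferences become self-reinforcing.
    from collections import defaultdict

    def rank_feed(posts, affinity):
        """Order posts by the user's learned affinity for each topic."""
        return sorted(posts, key=lambda p: affinity[p["topic"]], reverse=True)

    def simulate(posts, clicks_per_session=2, sessions=5):
        affinity = defaultdict(lambda: 1.0)  # no initial preference
        for _ in range(sessions):
            feed = rank_feed(posts, affinity)
            for post in feed[:clicks_per_session]:  # user engages with the top of the feed
                affinity[post["topic"]] += 1.0     # each click boosts that topic next time
        return dict(affinity)

    posts = [{"topic": "politics-left"}, {"topic": "politics-right"},
             {"topic": "sport"}, {"topic": "music"}]
    print(simulate(posts))
    # Topics clicked in early sessions come to dominate the ranking,
    # while the rest sink: the feed narrows without anyone intending it to.

The point of the sketch is the feedback loop – the ranking shapes what gets clicked, and the clicks reshape the ranking – which is the dynamic the term ‘filter bubble’ describes.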

Following the Trump victory there have been frequent calls for the company to address this issue. But much of the debate has had an element of technological determinism to it, suggesting that users are at the mercy of the algorithm. In fact, research shows that the actions of users themselves are still a very important element in the way that Facebook gets used.


Research conducted at the Open University has been looking specifically at how people’s actions create the context of the space in which they communicate. Of equal importance to the influence of the algorithm is what people do with the site, and how they themselves fashion their experience of it. The overwhelming attitude among people surveyed in the research was that Facebook is not ideally suited to political debate, and that things should be kept trivial and light-hearted. This isn’t to say that the expression of political opinion doesn’t happen, but for many people there’s a reluctance to engage in discussion, and a sense that anything that might be contentious is better handled face-to-face. People report that they worry the online context will lead to misunderstandings because written communication lacks some of the non-linguistic cues of spoken communication, such as tone of voice and facial expressions.

There’s strong evidence, both from this research and elsewhere, that people are exposed to a great deal of diversity because their network includes people from all parts of their life. In this respect, the algorithm doesn’t have a marked influence on the creation of filter bubbles. However, because they often want to avoid conflict, people report ignoring or blocking posts, or even defriending people, when confronted with views with which they strongly disagree. They also report taking care over what they say themselves so as not to antagonise people such as family members or work colleagues whose views differ from theirs, but whose friendship they wish to maintain. And finally they talk of making a particular effort to put forward a positive persona on social media, which again stops them from engaging in debate which might lead to argument.

The idea that algorithms are responsible for filter bubbles suggests it should be easy to fix (by getting rid of the algorithms), which makes it an appealing explanation. But what this perspective ignores is the part played by users themselves, who effectively create something akin to their own filter bubbles.

This isn’t done with the intention of sifting out diversity but is instead due to a complex mix of factors. These include the perceived purpose of Facebook, how users want to present themselves in an effectively public forum, and how responsible they feel for the diverse ties that make up their online network.

The fact that manipulation by the algorithm isn’t the only issue here means that other solutions, for example raising people’s awareness of the possible consequences that their online actions have, can help encourage debate. We have to recognise that the impact of technology comes not just from the innovations themselves but also from how we use them, and that solutions have to come from us as well.

‘Fake news’ and what can be done about it

A few days after Trump’s victory, the BBC ran a quiz to see if readers could distinguish fake news stories from real ones. They asked, for example, which of these widely-reported stories was an actual event:

  1. Putin issues international arrest warrant for George Soros
  2. Black Lives Matter thug protests President Trump with selfie... accidentally shoots himself in the face
  3. Passenger allowed onto flight after security confiscate his bomb

If you’re having trouble picking the correct answer, this is because, based only on the headlines, it’s virtually impossible to differentiate the true from the false. They all look equally implausible – but then headlines are meant to grab attention by highlighting the striking or unusual. (Answer 3 is, apparently, the real story.) We evaluate stories not simply on their plausibility but on a complex mixture of past experience, knowledge of context, authority of the source, and our own beliefs and values. As Evan Selinger and Brett Frischmann write, ‘Knowing real news from fake news, discerning fact from opinion and opinion from propaganda — these are learned skills’.

Along with the influence of filter bubbles, fake news – and the role this might have played in paving the way for Trump’s victory – has been another major talking point. Everyone from the pope to Hillary Clinton has weighed in on the issue.

While certain sections of the news media have always peddled fabricated or exaggerated news, what’s different in the contemporary media landscape is the ease with which people can scan a headline and share it across their network. Again, the debate has centred almost exclusively on how the technology itself needs to be refined to combat the problem. Although there are certainly issues that tech companies should be monitoring – such as the way that Google’s search engine algorithm can be gamed by extremists – pinning the problem solely on the technology implies that the population at large is rather credulous, that they have little knowledge of how the media works and are liable to believe anything they’re told. But the ability to make informed decisions isn’t dependent solely on the information one is fed. It depends on being able to evaluate that information, understand its provenance, and appreciate that the way it’s mediated has an influence on the nature of the information itself.

There are people who seem to uncritically believe whatever they read, of course. The ‘Pizzagate’ incident, when a man with an assault rifle fired shots in a Washington DC pizzeria, apparently influenced by a fake news story about Hillary Clinton and other Democrats running a child sex ring at the address, unsurprisingly attracted a lot of press coverage. However, there is also evidence that many people take a more sceptical approach to what gets passed around on the internet, but enjoy consuming it anyway. For Michael Lynch (University of Connecticut), this points to a crisis of belief in the media. He identifies a common attitude that if ‘[t]here’s no way for me to know what is objectively true… we’ll stick to our guns and our own evidence. We’ll ignore the facts because nobody knows what’s really true anyway’.

Here again, it’s important to consider how people actually interact on Facebook, and how this is shaped by their attempts to manage this challenging social situation. For example, they need to find ways of presenting themselves which will not offend their friends while simultaneously maintaining various other relationships (with work colleagues, relatives and so on). Importantly, one implication of this is that people will avoid challenging or debating opinions with which they disagree, so as to avoid creating personally awkward public conflicts. Altering Facebook’s algorithm, or policing people’s use of the site, will not address the fact that sharing information on Facebook is fundamentally a social activity, and the impact this has on the way people consume and evaluate information.

Instead, what’s important is digital critical literacy: an understanding not just of how technology works, but of how it works socially. These are learnt skills, not something you simply intuit. Just as programmes around internet safety awareness at primary and secondary school are important for understanding the dangers that online communication can involve, so too can education around how communications media operate – and the implications this has for how people share, process and consume information – be of great benefit at higher education level, preparing students to become critically-engaged citizens. Many such skills are learnt within degree programmes as ‘study skills’ – to sift, evaluate, and authenticate information; to shape communication to a particular audience. But they can also be applied to broader contexts, and used to raise awareness of how the flow of information in society as a whole is managed – and the implications this can have for the maintenance of an effective society.

A different version of this article by the same authors was published on The Conversation.

 
