Technology

Bursting the bubble: in reality, most of us are exposed to a range of views online

Although we may follow like-minded people on Facebook and Twitter, on the whole, most people are exposed to a wide range of views on social media

April 20, 2018
A shared interest in sports is just one way we encounter people with different political views. Photos: PA

It is easy to believe that people who don’t agree with you are living in a bubble. When you understand the history and background to a problem—be it Brexit, gun control or the (lack of) connection between autism and vaccines—and the other side’s arguments are built on ignorance and hyperbole, it can appear that those who don’t agree with you are deluded. Your opponents seem trapped in an echo chamber of their own failed ideas, which are reconfirmed online by others with the same misplaced sentiment and so generate their own self-affirming support.

This ‘echo chamber’ hypothesis has been around for a long time and has recently been supplemented by the ‘filter bubble’: the idea that the algorithms used by Facebook, Google and Twitter to provide personalised news end up trapping us inside our own narrow worldview.

The advantage of being an academic is that when I hear a theory or idea, I have the time and resources to test it carefully myself. So, when I heard about the ‘filter bubble’ hypothesis, I decided to do exactly that. Well… that’s not entirely true. What I actually did was get a master’s student to help me test it.

Joakim Johansson downloaded data about the people who followed the UK’s leading newspapers on Twitter (this is entirely legal and, indeed, encouraged by the social media site). He then looked at how these people were connected, starting with an analysis of me.
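As a rough illustration of that first step (a sketch under stated assumptions, not the study’s actual code), follower lists can be downloaded with Python’s Tweepy library. The credentials and newspaper handles below are placeholders, and Twitter’s data-access terms have tightened considerably since 2017.

```python
# A hedged sketch of the follower-download step, assuming Tweepy 4.x and
# valid API credentials; the keys and handles below are placeholders.
import tweepy

auth = tweepy.OAuth1UserHandler(
    "CONSUMER_KEY", "CONSUMER_SECRET",
    "ACCESS_TOKEN", "ACCESS_TOKEN_SECRET",
)
api = tweepy.API(auth, wait_on_rate_limit=True)

# An illustrative sample of UK newspaper accounts.
newspapers = ["guardian", "MailOnline", "thetimes", "FT"]

followers = {}
for paper in newspapers:
    # Follower IDs arrive in pages of up to 5,000 per request.
    ids = []
    for page in tweepy.Cursor(api.get_follower_ids, screen_name=paper).pages():
        ids.extend(page)
    followers[paper] = set(ids)

# Accounts following more than one paper hint at overlapping audiences.
overlap = followers["guardian"] & followers["MailOnline"]
print(f"{len(overlap)} accounts follow both the Guardian and the Mail")
```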

There is a very distinctive structure to my Twitter network. I am part of a cluster of scientists who follow each other. This group is the best type of echo chamber: we reaffirm each other, moan about the scientific funding situation and share the latest gossip. Joakim found that we are also similar in the political information we access, following newspapers—like the Guardian and the Financial Times—that were pro-Remain in the UK’s referendum on membership of the EU. We are part of a pro-Remain academic bubble.

I am not trapped, though. Other links in my network are more spread out, reaching people who don’t know each other, but do know me. It is here that my less academic pastime comes into play: football. I also use Twitter to talk to fellow football nerds about the game, and the maths and stats used to analyse it. My choice of who to follow about football is more random than in science. Sometimes I’ll share a few tweets back and forth about a match or a player, enjoy the conversation and decide to follow the user I’ve been talking to. As a result, I end up interacting with a more varied group of people, who follow newspapers that were both pro-Remain and pro-Leave—like the Times and the Daily Mail.

I can hardly be considered representative, so Joakim created networks for hundreds of different users, each of whom followed at least one UK newspaper. Their social networks typically consisted of one or two tight clusters of friends who mainly followed each other. Sometimes the users in these clusters would all be close to pro-Remain newspapers, other times they would be pro-Leave. But in addition to these more partisan groupings, there were always isolated branches out to users with very different political views.
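The cluster-finding step can be sketched with standard network tools. The snippet below is a minimal illustration, not the study’s actual code: it builds a small made-up follow network with Python’s networkx library and uses modularity-based community detection to pick out the tight friend clusters, with one bridging edge standing in for the isolated branches to users with different views.

```python
# A hedged sketch of cluster detection on a follow network; the accounts
# and edges are invented for illustration, not data from the study.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Each pair (a, b) means account a follows account b; treated as
# undirected for the purposes of community detection.
edges = [
    ("alice", "bob"), ("bob", "carol"), ("carol", "alice"),   # tight cluster 1
    ("dave", "erin"), ("erin", "frank"), ("frank", "dave"),   # tight cluster 2
    ("alice", "dave"),                                        # a bridging branch
]

G = nx.Graph(edges)

# Modularity maximisation groups densely connected accounts together.
for i, community in enumerate(greedy_modularity_communities(G), start=1):
    print(f"Cluster {i}: {sorted(community)}")
```

On real data, the nodes would be the follower accounts downloaded earlier; the detected communities would correspond to the partisan clusters, while the edges running between communities mark the cross-cutting contacts.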

Our small-scale study of Twitter in 2017 cannot be considered comprehensive. But evidence from Facebook supports the same conclusion. In a study of ten million users in 2015, Facebook researchers showed that people on their social network are regularly exposed to views they don’t agree with. Conservatives are exposed to articles from liberal newspapers and, to a slightly lesser degree, liberals see conservative news.

Our friendships on Facebook, which range from childhood friends to current work colleagues, are not formed along partisan lines and, as a result, expose us to diverse ideas.

There are exceptions to the rule. Echo chambers do exist on Facebook and YouTube around pseudo-scientific issues, like the supposed link between vaccines and autism. Another example is the chemtrails conspiracy theory, which holds that the government is poisoning us using the vapour trails left by aeroplanes. Small communities build up around sharing and discussing these untruths.

On the whole, however, most people are exposed to a wide range of views via social media. And if conspiracy bubbles become too large, people on the outside challenge them. (Just read the comments section of a chemtrail video and see.)

Echo chambers and filter bubbles are—like fake news, automated troll bots and Cambridge Analytica’s personalised political advertising—concepts that shape how we think about social media. They are certainly useful, but they are far from universally applicable. I have spent the last year looking into the algorithms that control our lives: conducting experiments of my own, reading the scientific literature and talking to the algorithms’ creators. I found that much of the criticism is wrong. It is unlikely, for example, that either fake news or Cambridge Analytica won Trump the US election. Other problems, such as algorithmic discrimination and bias, are very real and important.

Before we draw conclusions, we need to properly understand how the algorithms that underlie social media work and the effects they do (and don’t) have.

Outnumbered: From Facebook and Google to fake news and filter-bubbles – the algorithms that control our lives by David Sumpter is published by Bloomsbury, £16.99