The way information spreads from person to person – whether via Chinese whispers or clickbait headlines – means the truth doesn’t always survive the journey. And in the wake of the polarising democratic decisions of 2016, that is itself an inconvenient truth.

The internet has enabled the truth to be twisted beyond recognition. Echo chambers (or ‘filter bubbles’) have been created by the algorithms intended to fill your social feeds with content that’s more relevant to you, blocking out any opinions that differ from your own. And in response, brands have been quick to cover their backs, and demonstrate efforts to fight fake news. Facebook has announced the Facebook Journalism Project, through which it will collaborate with media outlets to ensure the integrity of the news shared on the platform, while Channel 4 has arranged ‘Fake News Week’ – a week-long season of programming this month that'll explore where fake news comes from and how it spreads.

The most important factor in this equation, however, is neither social platforms nor media outlets – it’s the people that populate and consume them respectively. These feeds are only filled with more of what people have already seen (and probably liked); these articles wouldn’t spread if people weren’t clicking the share button. “People share articles to make sure they’re seen as part of the in-group,” explains internet psychologist Graham Jones in an interview with Canvas8, “but it’s also to confirm their sense of self. People like to share things and do things that confirm they are who they think they are. If we share something and other people like what we’ve shared, our sense of self carries on being established.”

But while it might figure that someone hinging their personal brand on an article they’re sharing would want to make sure it’s on the money first, in reality that’s not the case. As many as 59% of people share articles they haven’t read. “Our brains are constantly trying to do things with the least amount of energy so that we can use that energy if we encounter any kind of threat or risk,” says Jones. “In the context of political news, people will see an article, read the headline, and maybe see a picture. That picture will confirm the bias that they want to see – if they’re anti-Trump, for example, and the picture is a particularly damning image of Trump, then they’ll immediately click on it because they agree with it. They want to get that done quickly and move on to the next thing because their brain wants them to do this with the least amount of effort. Prior to the internet, we couldn’t do it with that little effort. But now, with the plethora of sharing buttons and so on, people can share things more quickly.”

And unfortunately, research reveals that the rumours that spread the furthest and fastest are often the false ones, with inaccurate articles as much as 90% more likely to be shared. Much of this comes down to confirmation bias – the human tendency to seek out information that confirms what we already know – and it’s exacerbated in the context of a polarised political debate. “People tend to align themselves with one side of the argument or the other,” says Jones. “Then, when we add confirmation bias into the mix, people start looking for the other side’s argument so they can prove to themselves that their side of the argument must be right. Because of this, we end up with polarised views.”

Lore Oxford is Canvas8's deputy editor. She previously ran her own science and technology publication and was a columnist for Dazed and Confused. When she’s not busy analysing human behaviour, she can be found defending anything from selfie culture to the Kardashians from contemporary culture snobs.

08 Feb 17