Why a solution to “fake news” depends entirely on what you think the problem is

Alice Thwaite on 2016-12-06

A pretty cool echo-chamber

There has been a lot of talk in the media recently about echo-chambers. This concern has mainly been directed at ‘fake’ or ‘fraudulent’ news stories that may have increased support for Donald Trump. Critics believe a lack of accountability on social media led to widespread fear and misinformation, which resulted in Trump’s election.

“Fake news” is part of a wider problem concerning how the media should be organised in a digital world. I spend time looking at echo-chambers — so I’d like to broaden the discussion here to news which you are inclined to believe, because it affirms the world-view that you already have.

Here I’d just like to go through what you might believe the problems with echo-chambers are, to show how complicated the problem is…

Self-affirmation leads to echo-chambers leads to “fake news”

An echo-chamber is a digital phenomenon. Whilst in an echo-chamber, you only hear opinions and analysis that agree with your own point of view. This occurs because of personalised algorithms: search engines and social media platforms recognise the themes of the articles you engage with, and in return feed back more information that you are likely to engage with again.

You shout out an opinion, and the chamber repeats it back to you. Like an echo. The name speaks for itself.
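The feedback loop above can be sketched in a few lines of code. This is a deliberately toy illustration, not any platform’s actual ranking system: the article themes, the engagement history and the `rank_feed` function are all made up for the example.

```python
from collections import Counter

def rank_feed(articles, engagement_history):
    """Rank articles by how often the reader has engaged with each theme.

    `articles` is a list of (title, theme) pairs; `engagement_history`
    is a list of themes the reader has previously clicked on. Both are
    illustrative structures, not a real platform's data model.
    """
    theme_counts = Counter(engagement_history)
    # Articles matching frequently-engaged themes float to the top;
    # unfamiliar themes score zero and sink, so the feed narrows over time.
    return sorted(articles, key=lambda a: theme_counts[a[1]], reverse=True)

history = ["politics", "politics", "sport"]
feed = [("Budget analysis", "politics"),
        ("New physics result", "science"),
        ("Match report", "sport")]

for title, theme in rank_feed(feed, history):
    print(theme, "-", title)
```

Run repeatedly, a ranker like this shows the reader more of what they already click on, and the science article never surfaces: an echo in three lines of logic.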

Echo-chambers wouldn’t be a problem if it weren’t for human behaviour: people typically spend the majority of their time in places where echo-chambers are prevalent. That is, on social networks.

Echo-chambers are also referred to as ‘filter bubbles’. This terminology acknowledges that articles, opinions and analysis are fed to you via a filtering mechanism. Given you cannot read all the information published on the web, you have to have some system for filtering which information you read and which you ignore.

Previously, a person was more in control of their filters. They could look at all the papers available in a newsagent and choose which publication they would take the time to read. However, given the ‘fire-hose’ of content now produced online, even scanning the titles of everything published is overwhelming. As a result, algorithmically induced filters were born, and these algorithms remove an individual’s control.

Why is this a problem?

Echo-chambers are a problem for democracy, community and discovery. Those most at risk are the people who are unaware of their existence, and so unaware that their own world-view is extremely narrow. If you live unknowingly in an echo-chamber, you may find alternative points of view shocking and difficult to accept. You will find it very difficult to build bridges with other groups in your wider community, and society becomes polarised.

So let’s take a look at some reasons why you’ll find echo-chambers and therefore “fake news” concerning.

1. Freedom of Choice

2. Freedom of Thought

3. Lack of innovation

4. Lack of diversity

5. Low quality information

6. Information Blocking

7. Lack of public debate

There may be other categories — and I would be interested to hear about them — however, my primary intention is to show the diversity of objections to echo-chambers.

For if there is diversity in the objections to fake news, then there will be different solutions to the problems caused by echo-chambers. For example, a solution that tries to ban fake news stories from advertising on Facebook differs entirely from one that lobbies for greater transparency from Facebook.

[On a side note, I believe algorithms or automated technologies that try to address the problems of echo-chambers will inevitably address only one or two of the above problems. It’s difficult to give someone freedom of choice whilst also deciding for them what a diverse set of information is. It may not be impossible, but I think product managers should work directly with ethicists and epistemologists when creating their solutions.]

The benefits of echo-chambers

Although we are worried about the effects of echo-chambers, it’s worth noting that they simply amplify common human behaviour.

Throughout history, people have tended to congregate with, and be attracted to, those who have similar values and beliefs. Communities share religious notions, common ethics, laws and celebrations. This type of homogeneity is crucial for the cooperation of the species.

Cooperation within large societies is one thing — but cooperation in smaller groups is also extremely important. In order for ideas to fully develop and progress, humans have to work together with people who want to solve the same problems.

If a physicist wanted to present a new paper, she is unlikely to ask for feedback from a biologist or a chemist. She is even less likely to solicit feedback from Joe Bloggs on the street. Instead, she will go to specialist communities, which speak in a language most find difficult to understand, to develop notions that are very much on the fringe.

I would argue this is similar to an echo-chamber; however, it is an agreement entered into transparently. Being around people with similar interests is positive, as small groups help niche subjects develop and progress. It is fringe work that creates innovation.

There are a couple of dissimilarities. The physicist is well aware of when she is in her private space, discussing her ideas with other physicists. She knows that other people and departments exist outside of her own group, who are working on different challenging problems. She is also loosely aware of the problems those people are working on, and may be able to have an intelligent conversation about them in the pub. She may not like to interact with those people, but she is still aware that she is sitting inside a very niche bubble.

Compare this to a digital echo-chamber, where the notions of private and public spaces are not well defined and many aren’t aware that they are in a personalised bubble. They believe they are able to access many different ideas from their browser. Yet they don’t.

In summary, algorithms were created with good intent. Personalisation allowed people to pursue niche interests, and to easily find information that they may find engaging.

However, algorithms have amplified these social constructions to the extent that it is difficult to know when we are in a private space with our friends and when we are in a public space with access to a multitude of ideas. What’s more, many don’t know how to access wider information. As we spend more time online, we spend more time in private spaces while believing we are in public spaces.

Finally — is it right to ban fake news?

Thomas Baekdal wrote on Twitter about fake news vs spam…

So where do you draw the line here?

Alice Thwaite is the founder and editor-in-chief of The Echo Chamber Club — a weekly newsletter that challenges educated metropolitans to read views that differ from their own. Click here to subscribe.
