Fake news will play a part in the US election again – it’s time to battle the spread of misinformation

Every day new conspiracy theories emerge; misinformation is a widespread problem that poses a great threat to our democracy

Raegan MacDonald
Sunday 01 November 2020 04:41 EST


Imagine it’s 1938. You’re sitting at home listening to a radio show, when it gets abruptly interrupted by a newsflash: unusual phenomena have been registered on Mars and objects are moving towards the earth. A reporter is live on-site at an observatory and talks to an astronomer about the disturbing sightings. 

The show resumes before being interrupted again by more unsettling news: aliens have landed on earth. Outside your window, you see people running around, panicking, some calling the police. The following day a press conference reveals it was all a grand hoax. The radio broadcast, created by Orson Welles and based on the 1898 novel The War of the Worlds, is now cited as the first example of widespread misinformation and remains a compelling example of our collective vulnerability to it as well as the manipulative power of media in all its forms.

Unfortunately, since the advent of the internet, this problem has only grown in intensity. Every day new conspiracy theories emerge, as questionable stories are shared on blogs and social media, sometimes by public figures. Where Welles’ fake broadcast was the exception, misinformation has become a pressing and widespread problem, contributing to a range of individual and collective harms. It’s time we got a handle on the issue.

A good place to start is to understand the difference between “misinformation” and “disinformation”. While both words describe information that is factually inaccurate, the European Commission defines “misinformation” as false information that the person sharing it believes to be true, whereas “disinformation” is “verifiably false or misleading information created, presented and disseminated for economic gain or to intentionally deceive the public”.

It is on the back of the creation, dissemination and amplification of disinformation that the problem of misinformation arises, whereby individuals unwittingly consume and share content that has been designed to mislead and misinform. 

While we don’t have complete insight into the scale of the problem, studies have shown that Europeans interact with misinformation over 29 billion times a year. According to Statista, 75 per cent of Europeans come across misinformation at least once a week, while 37 per cent encounter it daily.

The result isn’t just that people are “badly informed”. Misinformation costs the global economy around $78 billion every year, according to a study by Israel-based cybersecurity firm CHEQ and the University of Baltimore; it threatens democracy and efficient governance; and it is difficult to stop. In fact, the journal Science has published empirical evidence that misinformation spreads significantly “faster, deeper, and more broadly” than factual news content. And now, amidst the greater confusion and uncertainty brought by the pandemic, misinformation has been able to thrive more than ever.

Social media networks, because they rely on algorithms to micro-target content designed to engage users, are important vectors for disinformation and misinformation. 

Social media services are designed so that many of their features surface whatever content the service’s algorithmic recommender system judges most engaging for a given user. The problem is that the content most likely to engage a user is often the kind that shocks, misleads, angers, and frustrates. By privileging the spread of content on the basis of its “engagement” factor, social media services can end up amplifying disinformation.

Disinformation is further enabled by the harmful data collection practices these platforms employ. Because they hold so much information about you (information that goes far beyond your activity on the social network itself), it becomes easy for any entity to put promoted content in front of you with the goal of influencing you. That includes brand advertising and political advertising, but also disinformation that can be racist, homophobic, extremist or otherwise problematic.

In the short term, there are steps we can take on an individual level to stop misinformation spreading. Readers can verify online information by confirming the credibility of the source, checking for similar coverage in other publications, and running a “reverse image search” to find other contexts in which an image has appeared.

But the burden of protecting against disinformation and misinformation shouldn’t fall solely on the shoulders of the individual. That’s why, through our products, Mozilla has taken steps to counter disinformation and the structural factors that give it reach online – limiting cross-site tracking, and with it the amount of data that can be used to micro-target disinformation as users surf the web, through measures such as the enhanced tracking protection built into our browser and the Facebook Container.

We’ve also been pushing for systemic action to address the problem. For instance, we’ve recently called on Facebook to make political advertising on their platform more transparent and asked Twitter to pause their “trending topics” feature until after the 2020 US presidential election to prevent misinformation going viral in this highly sensitive context. We’ve also pushed for more effective public policy in this space, and were a founding signatory of the EU’s Code of Practice on Disinformation in 2018. The EU has a crucial opportunity to play an effective standard-setting role in addressing disinformation as it manifests in the online ecosystem, and we’re engaging heavily in the discussions around the proposed Digital Services Act and Democracy Action Plan to ensure that opportunity is not missed. 

It’s comforting to see the work being done, but we still have a long way to go to stop the spread of misinformation and rebuild consumer trust in online content. They say it takes a village to raise a child – likewise, it takes a critical mass of users to improve the internet. Let’s make it better and safer together.

Raegan MacDonald is the head of public policy at Mozilla 
