
AI-generated image of Donald J. Trump, the former President of the United States, 'crowned as the king of the world'.

Picture by: DeepAI Image Generator


‘Misinformation is now produced on an industrial scale’ – Concerns over propaganda on social media

17-year-old Cressida Anness Lorenz interviews history student Gabrielle Segal on misinformation and propaganda

Throughout history, propaganda has been used as a weapon of misinformation to promote biased political causes or perspectives.

Its use dates back to classical antiquity, with iconography such as the Roman statue Augustus of Prima Porta being used to promote the reign of Emperor Augustus in the first century BC.

In the modern world, its manifestations have adapted to the new age of the internet and can be recognised in concepts such as disinformation and fake news, terms that are becoming increasingly common across the globe.

The development of artificial intelligence (AI) technologies has provided another way to spread misinformation and fake news on an even wider scale.

The rise in AI-generated content makes it easier to create and manipulate information online, including for political purposes.

A 2023 report by Freedom House, a human rights advocacy group, found that AI was used in 16 countries to ‘sow doubt, smear opponents, or influence public debate’.

Many well-known figures have also been involved in fake news campaigns and disinformation. A Russian ‘propaganda push’ on Cameo tricked US actors into creating videos spreading falsehoods about Volodymyr Zelensky, the Ukrainian President.

Such examples of fake news and misinformation in the political sphere are commonly referred to as computational propaganda: political manipulation conducted over the internet. Deepfake algorithms, software capable of creating highly realistic fake content, have been used to produce misleading videos. One example is a widely shared video of US President Biden supposedly making transphobic comments, which was later revealed to be false.

Other targets have included Stephen Fry, who spoke out in 2023 against the use of a deep learning algorithm to replicate his voice for a historical documentary, all without his knowledge or consent.

Youth and children more susceptible to fake news

Due to the importance of social media and communication to young people’s lives, concerns have been raised about the vulnerability of young people to misinformation and propaganda content.

A 2023 study by Ofcom (the UK communications regulator) revealed that a third of children believed all or most of what they saw on social media to be accurate and true. It also found that three in ten children aged 12–15 used TikTok as a news source in 2022.


History student Gabrielle Segal, 18: ‘It is difficult to know whether a news source is fake or biased; the development of the internet has made the line more blurry.’

Picture courtesy: Gabrielle Segal’s personal archive

Gabrielle Segal, aged 18, from the UK, shared her experience of misinformation on social media with Harbingers’ Magazine. Gabrielle says “it is difficult to know” whether a news source is fake or biased.

As a history student, she notes that past examples of biased information are much easier to spot, but the development of the internet has blurred the line.

Despite age restrictions on certain social media sites, it is well known that many young people below the age limit use these sites and therefore may come across misinformation.

Tech giant Meta, the owner of Instagram, Facebook, Threads and WhatsApp, has been accused of allowing millions of underage users on its sites; this has been described as an ‘open secret’ within the company.

Global events and misinformation

Having used online spaces for eight years, Gabrielle has noticed subtle shifts online towards more political topics. She feels the reasons for this could be the global COVID-19 pandemic as well as the more recent conflicts in Ukraine and in Israel and Gaza.

She is not alone in this line of thinking: experts working in the field of computational propaganda report that political misinformation has only increased in recent years.

Read more:

Social media manipulation by political actors an industrial scale problem - Oxford report

Professor Philip Howard, director of the UK’s Oxford Internet Institute (OII), said in a report on social media manipulation and politics that “misinformation has become more professionalised and is now produced on an industrial scale”.

The research team warns that social media manipulation has rapidly risen due to political parties and governments commissioning so-called ‘cyber troops’ to spread their ideologies.

The use of bots, fake accounts and sophisticated software also allows groups and individuals to spread propaganda while remaining anonymous.

Samuel Woolley, program director for computational propaganda research at the Center for Media Engagement at the University of Texas, explained that the use of increasingly sophisticated bots can “amplify and suppress particular content online,” creating disinformation. Woolley said bots are used to generate attention for those they support – and to “mobilise smear campaigns against those they oppose.”

An investigation by The Guardian revealed how a group known as ‘Team Jorge’ used fake social media profiles to manipulate more than 30 elections around the world, including in the UK, US and Germany, by ‘hacking, [sabotaging] and [automating] disinformation on social media’.

A significant rise in disinformation has also been seen during the ongoing Israel-Gaza war. During the early days of the conflict, a barrage of images, testimonials and information was released, which made it ‘difficult to assess what is real’, according to The Washington Post.


A similar trend was evident in the online discourse surrounding the initial spread of the COVID-19 virus.

Dr Nahema Marchal, a former research associate at the OII, gave a seminar analysing popular search terms during the pandemic. The phrase “coronavirus conspiracy” was among the top ten search terms, according to data collected on 10 March 2020.

Misinformation can also have negative consequences by causing fear among the general public. A 2021 study on the impact of fake news on health concluded that excessive unreliable information about the virus had the potential to “cause psychological disorders and panic, fear, depression and fatigue”. The same study highlighted that platforms such as Twitter, Facebook, Instagram and Wikipedia were increasingly used to “obtain information about the course of the disease”.

Other experts looking at the impact of misinformation during the pandemic explained how ‘the social media panic travelled faster than the COVID-19 outbreak’.

Free speech or censorship?

When it comes to regulating online spaces, Eva Carrillo Roas, a student at King’s College London, argued that the right to speak freely does not mean people should be able to share lies. But she also stressed that censorship could polarise people even further.

Gabrielle added that “it’s pretty much impossible” for social media platforms to control misinformation. Discussing the importance of freedom of speech, she said that if a social media platform decides what you can say, it can get “very messy”, and it is even more difficult to draw the line between correct speech and what breaks the rules.

In a research project for Stanford Law School, US attorney Marie-Andrée Weiss says it is a “difficult exercise” to strike a balance between free speech and censorship, and deciding what speech is worthy of protection is a “slippery slope.”

Public opinion also remains split on the regulation of online spaces.

A study by the US-based Pew Research Center, published in 2022, found that 38% of adults thought social media companies using algorithms to find false information was a good idea for society, while 31% labelled it a bad one.

Professor Howard urged social media companies to address people’s concerns. Outlining a potential way to balance regulation and freedom of speech, he said that “social media companies need to raise their game by increasing their efforts to flag misinformation and close fake accounts without the need for government intervention.”

Ways to combat misinformation

There have been discussions about how new technologies can aid in combating fake news. An analysis by Fatima Carrilho Santos in 2023 explored the different ways in which AI can be a powerful tool to analyse fake news, including “fact-checking, linguistic analysis, sentiment analysis, and the utilisation of human-in-the-loop systems.” However, the study also acknowledges the importance of human interaction in fighting misinformation.

It can be difficult to discern false information from fact, especially if the social media platform is being used for recreational purposes. But, Gabrielle warns social media users to be cautious of the information that they consume.

She told me that “not all of social media is inaccurate. Most of it is an opinion, but I wouldn’t want to use it as a primary source of information,” adding that those particularly concerned about a specific topic or fact should try to do more research or use fact-checking websites to verify any claims.

Organisations across the world, including news media, have been susceptible to misinformation and have set up fact-checking teams to help combat this issue.

Examples include BBC Verify, Reuters Fact Check and FactCheck.org. Here is a simple video from the BBC explaining how to spot fake news:


Written by:


Cressida Anness Lorenz

International Affairs editor

London, United Kingdom

Hailing from Islington, London, Cressida was born in 2006 and has been interested in creative writing and journalism from a young age. She joined Harbingers’ Magazine as one of the winners of the Harbinger Prize 2023, and in 2024 became the International Affairs editor for the magazine.

An abstract thinker, her main areas of focus are varied and philosophical in nature. In her spare time she enjoys involving herself in the art world, attending numerous practical art groups. This involvement in art has led to a curiosity in perspective and how it can be used as a lens to see the world in many different ways.

She enjoys both reading and writing, which are her main pastimes, and aims to study law.

Edited by:


Christian Yeung

Society editor

Hong Kong | United States

