Verica Rupar and Tom De Smedt
20 Jan 2021
“The lie outlasts the liar,” writes historian Timothy Snyder, referring to outgoing president Donald Trump and his contribution to the “post-truth” era in the US.
Indeed, the mass rejection of reason that erupted in a political mob storming the US Capitol mere weeks before the inauguration of Joe Biden tests our ability to comprehend contemporary American politics and its emerging forms of extremism.
Much has been written about Trump’s role in spreading misinformation and the media failures that enabled him. His fuelling of extremism, flirtation with the political fringe, endorsement of conspiracy theories and, above all, his Twitter demagogy created an environment in which he came to be seen as an “accelerant” in his own right.
While the scale of the international damage is yet to be calculated, there is something we can measure right now.
In September last year, the London-based Media Diversity Institute (MDI) asked us to design a research project that would systematically track the extent to which QAnon, the conspiracy theory movement that originated in the US, had spread to Europe.
Titled QAnon 2: spreading conspiracy theories on Twitter, the research is part of the international Get the Trolls Out! (GTTO) project, focusing on religious discrimination and intolerance.
GTTO media monitors had earlier noted the rise of QAnon support among Twitter users in Europe and were expecting a further surge of derogatory talk ahead of the 2020 US presidential election.
We examined the role religion played in spreading conspiracy theories, the most common topics of tweets, and which social groups were most active in spreading QAnon ideas.
We focused on Twitter because its increasing use — some sources estimate 330 million people used Twitter monthly in 2020 — has made it a powerful political communication tool. It has given politicians such as Trump the opportunity to promote, facilitate and mobilise social groups on an unprecedented scale.
Using AI tools developed by data company Textgain, we analysed about half-a-million Twitter messages related to QAnon to identify major trends.
By observing how hashtags were combined in messages, we examined the network structure of QAnon users posting in English, German, French, Dutch, Italian and Spanish. We identified about 3,000 different hashtags related to QAnon, used by 1,250 Twitter profiles.
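As a rough illustration of this kind of analysis, the sketch below builds a hashtag co-occurrence network in Python with the networkx library. The sample tweets, profile names and field layout are invented placeholders for the example; the actual Textgain pipeline is not public.

```python
from collections import Counter
from itertools import combinations
import re

import networkx as nx  # graph library; pip install networkx

# Invented sample messages; the real study analysed ~500,000 tweets.
tweets = [
    {"profile": "user_a", "text": "The storm is coming #QAnon #WWG1WGA"},
    {"profile": "user_b", "text": "#QAnon #DeepState wake up"},
    {"profile": "user_a", "text": "#WWG1WGA #DeepState"},
]

HASHTAG = re.compile(r"#\w+")

graph = nx.Graph()
profiles = set()
hashtags = Counter()

for tweet in tweets:
    tags = sorted({t.lower() for t in HASHTAG.findall(tweet["text"])})
    profiles.add(tweet["profile"])
    hashtags.update(tags)
    # Hashtags appearing in the same message become connected nodes;
    # repeated co-occurrence increases the edge weight.
    for a, b in combinations(tags, 2):
        if graph.has_edge(a, b):
            graph[a][b]["weight"] += 1
        else:
            graph.add_edge(a, b, weight=1)

print(len(hashtags), "hashtags used by", len(profiles), "profiles")
for a, b, data in graph.edges(data=True):
    print(a, "--", b, "weight:", data["weight"])
```

Clusters of heavily co-used hashtags are what reveal the structure of the community: which themes travel together, and in which languages.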
Every fourth QAnon profile was based in the US (300). Far behind were profiles from other countries: Canada (30), Germany (25), Australia (20), the United Kingdom (20), the Netherlands (15), France (15), Italy (10), Spain (10) and others.
We also examined QAnon profiles that shared each other’s content, Trump tweets and YouTube videos, and found that over 90% of these profiles shared the content of at least one other identified profile.
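In miniature, that measurement can be approximated as follows, assuming sharing behaviour is recorded as a list of (sharer, source) pairs; the profiles and edges here are invented for the example.

```python
# Each pair (sharer, source) records one profile sharing another's content.
# These edges and profile names are invented for illustration.
shares = [
    ("user_a", "user_b"),
    ("user_b", "user_c"),
    ("user_c", "user_a"),
    ("user_d", "user_a"),
]

# user_e stands for an identified profile that shared nothing.
profiles = {p for edge in shares for p in edge} | {"user_e"}
sharers = {sharer for sharer, _ in shares}

fraction = len(sharers) / len(profiles)
print(f"{fraction:.0%} of profiles shared at least one other profile's content")
```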
Seven main topics were identified: support for Trump, support for EU-based nationalism, support for QAnon, deep state conspiracies, coronavirus conspiracies, religious conspiracies and political extremism.
Hashtags rooted in US evangelicalism sometimes portrayed Trump as Jesus, as a superhero, or clad in medieval armour, with underlying Biblical references to a coming apocalypse in which he will defeat the forces of evil.
Overall, the coronavirus pandemic appears to function as an important conduit for all such messaging, with QAnon acting as a rallying flag for discontent among far-right European movements.
We used Textgain’s hate-speech detection tools to assess toxicity. Tweets written in English showed high levels of antisemitism. In particular, they targeted public figures such as Jewish-American billionaire investor and philanthropist George Soros, or revived old conspiracy theories about secret Jewish plots for world domination. Soros was also a popular target in other languages.
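Textgain’s detectors themselves are proprietary, but the general idea of scoring text for toxic content can be sketched with a simple keyword lexicon, as below; the terms and weights are invented placeholders, and real systems rely on trained classifiers rather than word lists.

```python
# A deliberately simple toxicity scorer. The lexicon and its weights are
# invented placeholders; production detectors such as Textgain's are
# trained machine-learning classifiers, not keyword lists.
TOXIC_LEXICON = {
    "globalist": 0.6,  # frequent antisemitic dog whistle
    "puppet": 0.4,
    "traitor": 0.7,
}

def toxicity(text: str) -> float:
    """Return the highest lexicon weight found in the text (0.0 = no match)."""
    words = text.lower().split()
    return max((TOXIC_LEXICON.get(w, 0.0) for w in words), default=0.0)

for tweet in ["the globalist puppet masters", "get out and vote today"]:
    print(f"{toxicity(tweet):.1f}  {tweet}")
```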
We also found a highly polarised debate around the coronavirus public health measures employed in Germany, often using Third Reich rhetoric.
New language to express negative sentiments was coined and then adopted by others — in particular, pejorative terms for face masks and slurs directed at political leaders and others who wore masks.
Accompanying memes ridiculed political leaders, depicting them as alien reptilian overlords or antagonists from popular movies, such as Star Wars Sith Lords and the cyborg from The Terminator.
Most of the QAnon profiles tap into the same sources of information: Trump tweets, YouTube disinformation videos and each other’s tweets. The result is mutually reinforcing confirmation bias: the tendency to search for, interpret, favour and recall information that confirms prior beliefs or values.
Harvesting discontent has always been a powerful political tool. In a digital world this is more true than ever.
By mid-2020, Donald Trump had six times more followers on Twitter than when he was elected. Until he was suspended from the platform, his daily barrage of tweets found a ready audience in ultra-right groups in the US who helped his misinformation and inflammatory rhetoric jump the Atlantic to Europe.
Social media platforms have since attempted to reduce the spread of QAnon. In July 2020, Twitter suspended 7,000 QAnon-related accounts. In August, Facebook deleted over 790 groups and restricted the accounts of hundreds of others, along with thousands of Instagram accounts.
In January this year, all Trump’s social media accounts were either banned or restricted. Twitter also suspended 70,000 accounts that shared QAnon content at scale.
But further Textgain analysis of 50,000 QAnon tweets posted in December and January showed toxicity had almost doubled, including 750 tweets inciting political violence and 500 inciting violence against Jewish people.
Those tweets were being systematically removed by Twitter, but calls for violence ahead of the January 20 inauguration continued to proliferate, with Trump’s QAnon supporters appearing as committed and vocal as ever.
The challenge for both the Biden administration and the social media platforms themselves is clear. But our analysis suggests any solution will require a coordinated international effort.
This article was written by Verica Rupar, Professor, Auckland University of Technology and Tom De Smedt, Postdoctoral research associate, University of Antwerp.
This article was first published by The Conversation.