How Trump turned conspiracy-theory research upside down
The iconic images of a man in a horned headdress roaming the U.S. Capitol during the January 6 riot came as a shock to people around the world. To Kate Starbird, the images were terrifyingly familiar. The ‘QAnon Shaman’ – the online persona of Jacob Anthony Chansley, also known as Jake Angeli – is a well-known superspreader of conspiracy theories whom her research group has been monitoring for years.
The storming of the Capitol was “the physical manifestation of all these digital characters that we studied,” says Starbird, a social scientist at the University of Washington in Seattle who studies the spread of disinformation on social media. “Seeing all of this alive in real time was terrible, but not surprising.”
Starbird is part of a group of researchers in the US and abroad studying how disinformation and conspiracy theories take root and spread through social and mass media. As a U.S. president and prolific tweeter, Republican Donald Trump turned this research on its head when he helped bring once-fringe theories into the mainstream – most recently by downplaying the coronavirus pandemic and promoting the unsubstantiated claim that the U.S. presidential election had been stolen from him.
With Trump out of office, these researchers are now working to make sense of the flood of data they have collected from platforms like Twitter and Facebook. It was a lesson in modern populism: a world leader amplified once-obscure conspiracy theories, with every tweet and retweet entrenching the ideas and emboldening his followers. Now researchers are retooling to understand and prepare for what’s next.
During his presidency, Trump frequently retweeted accounts linked to the infamous QAnon conspiracy theory. This narrative, which originated in 2017, claims that a powerful cabal of Democrats and elites is trafficking and abusing children – and that Trump is fighting them. Although Trump never endorsed QAnon, he repeatedly refused to condemn the conspiracy theory in interviews and once praised its adherents for supporting him.
One debate in the conspiracy-theory research community is whether Trump pushed more people into QAnon or merely emboldened those who already believed. Polls suggest that QAnon supporters remain a small, albeit increasingly vocal, minority, says Joseph Uscinski, a political scientist at the University of Miami in Florida who has tracked public support for the theory for several years. Others argue that polls do not necessarily capture radicalization at the extremes.
QAnon grew significantly under Trump, says Joan Donovan, a disinformation researcher at Harvard University in Cambridge, Massachusetts. The activity that she and her team monitor online, along with real-world protests and political rallies, points to “a growing interest in or commitment to these ideas,” she argues.
Researchers like Donovan knew that QAnon adherents were primed to accept the theory that the 2020 US presidential election had been rigged. They had previously watched QAnon merge with the anti-vaccine movement to promote theories that the coronavirus was engineered to make money for vaccine manufacturers.
Trump began promoting the idea that the election would be illegitimate months in advance, when he suggested that mail-in ballots could be forged. It came to a head at a rally on January 6, when Trump told participants: “If you don’t fight like hell, you’re not going to have a country anymore.” He then urged them to march on the US Capitol, just as Congress was preparing to certify Democrat Joe Biden as the next US president.
The false narrative about the election has provided a landmark – albeit uncomfortable – opportunity for researchers to examine how disinformation spreads across the Internet. Starting in July, Starbird worked with Renee DiResta, research manager at the Stanford Internet Observatory in California, and other members of the Election Integrity Partnership to track and correct disinformation on social-media platforms such as Twitter, Facebook, and TikTok. The team is still sifting through its data, but Starbird says the work sheds light on how social media enables populist leaders like Trump to build constituencies and wield power.
In one case study, researchers tracked false claims that Sharpie pens given to voters in Illinois and Arizona damaged ballots so that they could not be read by vote-counting machines. These claims, primed by Trump’s narrative of electoral fraud, originated with his supporters on Twitter and were later amplified by members of his own family and right-wing influencers, which helped spread the message further and push it into the mainstream. Efforts to correct the record, including Twitter placing warning labels on prominent tweets, faltered as the grass-roots narrative spread among smaller, unverified accounts, the researchers found.
“We see this interplay between the elites and their audiences, who are actually working together to create false narratives,” says Starbird. Social media serves as a testing ground for ideas, which then gain momentum and are often picked up by conservative media outlets like Fox News, she adds. “We’re learning that mass media and social media are actually very integrated.”
Trump used this echo chamber to fuel conspiratorial thinking about the US election at all levels of the Republican Party: 147 Republicans in Congress voted against certifying Biden’s election in the early hours of January 7, shortly after the insurrection. In a national poll conducted days later, nearly half of Republicans questioned the election results and opposed Biden’s inauguration.
“Conspiracy theories are basically a form of political propaganda,” says Quassim Cassam, a philosopher at the University of Warwick in Coventry, UK. Although Trump failed to overturn the election, Cassam says, the former president was very successful in mobilizing his political base – and radicalizing the Republican Party.
A new world
Following the Capitol insurrection, Twitter banned Trump, cutting him off from nearly 89 million followers, and deleted more than 70,000 accounts linked to disinformation about election fraud and conspiracy theories. Facebook and Google’s YouTube also blocked Trump’s accounts.
These actions quickly quieted the online conversation: Starbird’s team analyzed its network of influential Twitter users and found that an entire section linked to QAnon had disappeared overnight (see “Disinformation Raid”). But Starbird says the extremists they have followed will always find new platforms on which to spread their dangerous ideas. Law-enforcement agencies remain on high alert: on January 27, the Department of Homeland Security published a terrorism bulletin warning that ideologically motivated violent extremists aggrieved by the presidential transition could continue to “mobilize to incite or commit violence”.
Although they are still analyzing mountains of data, many disinformation researchers say it is already clear that new rules will be needed to govern the internet, the tech giants, and the content their users post online. Donovan says the Biden administration should conduct a full review of social media, including the algorithms that power search and recommendation engines, as well as how tech companies have profited from the spread of disinformation and conspiracy theories.
“The gatekeeping power of the mass media has now shifted to these platform companies,” says Donovan. “We need them to be much more transparent about what they are doing, and we need rules so that they know what the guardrails are.”