DHS: Extremists used TikTok to promote Jan. 6 violence

The DHS alert reflects concern that TikTok – already under scrutiny over allegations, which the company denies, that user data could be sent to China – has become a hotbed of extremist activity, and that law enforcement agencies need to pay more attention to a platform more often associated with viral dance videos than with right-wing extremism.

The transparency watchdog Property of the People received the document via an open records request and passed it on to POLITICO. The group conducted an extensive Freedom of Information Act investigation into the January 6 attack on the Capitol.

In a response to POLITICO, TikTok said it was working to fight extremism. “TikTok has absolutely no place for violent extremism or hate speech, and we are working aggressively to remove such content and ban anyone who violates our community guidelines,” spokesperson Jamie Favazza said in an email.

So far, security agencies have paid more attention to Facebook and YouTube for their role as potential breeding grounds for hate speech and real-world violence. But as young people drift away from their parents’ social networks toward newer upstarts, platforms like TikTok are becoming a growing source of concern as possible avenues of radicalization.

DHS said it issued the warning specifically about TikTok, in part because “some homeland security actors have limited knowledge of its functionality”.

While the document shows that DHS is trying to keep up with extremists’ preferred practices and platforms, it also suggests the department can still lag behind the curve.

Seamus Hughes, deputy director of the Program on Extremism at George Washington University, said US national security agencies are struggling to keep up with changes in social media.

“Extremism research itself is pretty slow on TikTok,” said Hughes, whose organization receives a DHS grant. “There’s something to be said about the demographics of researchers – they tend to skew older. Very few can hear the first five seconds of a TikTok video and know which song it refers to.”

Hughes said TikTok is very efficient at delivering extreme content to its users and that it is “flooded” with videos promoting the QAnon conspiracy theory. (TikTok announced a ban on QAnon content last year.)

“The TikTok algorithm is so good that before you know it you are in a spiral of domestic violent extremism,” he said.

The DHS report made a similar point.

“TikTok’s application layout and algorithms can inadvertently aid individual efforts to promote violent extremist content,” the report said.

“A user’s account may have few followers, but some videos can still draw significant viewership, which could help violent extremist TikTok users evade TikTok’s content moderation efforts,” it said.

The DHS document identifies several instances of extremist posts promoting violence during 2020 and in the run-up to the Capitol riot. Prior to the January 6 riots, a TikTok user posted a video calling on protesters to bring firearms. Other users shared videos in early to mid-2020 with instructions on how to sabotage railroad tracks, access the White House through tunnels, and interfere with the US National Guard during riots, the warning said, citing DHS and law enforcement reports.

Groups involved in the January 6 violence used a number of digital platforms to share debunked allegations of electoral fraud and to organize. The House select committee investigating the attack has asked tech companies including Facebook, Twitter, Parler and TikTok to hand over troves of internal documents so lawmakers can understand the platforms’ role in the violence.

But the DHS warning, issued months after those events, shows the department’s concern about extremism on TikTok is growing.

The DHS warning said a US intelligence center had also found evidence that foreign extremists were using TikTok, including a pro-ISIS group that posted an English-language video in August 2020 with instructions on “making explosive links.” And it said that in October 2019, “ISIS militants overseas posted videos from 24 TikTok accounts showing ISIS militants with bodies, weapons and other people declaring their support for sectarian violence and ISIS,” citing reports from information-sharing centers used by law enforcement.

The DHS document added that TikTok deleted those accounts after a newspaper flagged them.

The department also cited the conviction of a Pakistani imam in Paris for promoting terrorism on the platform by inciting violence against non-Muslims and praising the terrorists who attacked journalists at Charlie Hebdo, a French magazine, according to local media reports.

Both domestic and foreign groups “are taking advantage of standard features of the platform to bypass the platform’s detection and removal efforts,” the document concluded. OODA Loop, a website run by a global strategy consultancy, first reported on the document and published portions of it.

Taken together, the DHS document offers law enforcement around the country an overview of the extremist threat on the platform. Domestic and foreign groups have been active on TikTok since at least 2019 and “use TikTok to recruit followers, promote violence and disseminate tactical instructions for various terrorist or criminal activities.”

However, the five-page analysis is limited in scope, mainly providing an overview of how the app works and of the ways extremists can use the platform to post violent or hateful material.

In recent years, national security agencies have increased their focus on combating the rise of white supremacist and far-right groups based in the United States. But the US counterterrorism apparatus is still much more focused on fighting foreign groups. The DHS document highlights how federal agencies are waking up to the potential threat posed by domestic and international groups using the platform to radicalize people.

White supremacists, neo-Nazis, and Islamic extremists have flooded TikTok in recent years, often using some of its signature features – like the ability to string multiple videos together in the same post – to create viral content promoting anti-Semitic and anti-LGBTQ+ messages. In June, the Institute for Strategic Dialogue, a think tank tracking online extremism, found more than 1,000 such videos in a single month, including one showing a replica of the Auschwitz concentration camp built in the Minecraft video game and others extolling fascist leaders of the 1930s.

TikTok then removed all of these videos.

TikTok said that in the first three months of 2021 it removed more than 90 percent of posts that violated its content guidelines within 24 hours of their being posted. The company removed more than 61 million videos for violating its policies and guidelines during that period, it said earlier this year, adding that this represented “less than 1% of all videos uploaded to TikTok.”

Still, the Chinese-owned app’s efforts to suppress content promoting white supremacy and domestic terrorism contrast with its robust approach to other controversial content.

Compared with other social media platforms, TikTok has a track record of aggressively using content filters and automated algorithms to delete material classified as problematic before it can gain a large online following. And its content moderation practices have sparked a number of controversies.

A top TikTok executive told British lawmakers last year that the app had previously censored content related to the “Uyghur situation” in order to keep conflict off the platform. Beijing’s repression of Uyghurs and other Muslim minorities living in China has been condemned by human rights groups. The executive later walked back her testimony and said she had misspoken.

The Intercept reported last year that TikTok executives urged moderators to suppress posts made by users deemed too ugly, poor, or disabled for the platform. A TikTok spokesperson told The Intercept that those rules were “an early blunt attempt to prevent bullying” and are no longer in force. TikTok has also apologized for suppressing LGBTQ+ content, Reuters reported.

And MIT Technology Review recently detailed an episode in which a TikTok product allowed content promoting Nazism and anti-Semitism, but automatically removed posts labeled “Black Lives Matter” and “Support Black Success.”
