Why TikTok Is Being Accused Of Promoting Far-Right Extremist Accounts

Extremist content exists on every social app, but according to a recent study, TikTok users are especially likely to be recommended malicious and radical videos.

TikTok is widely considered one of the hottest social media platforms of 2021, but all that attention comes with a fair share of criticism. The latest is a new report criticizing TikTok for promoting far-right accounts to its users. All social networks have had numerous problems dealing with extremist content in recent months and years. Facebook, Twitter, TikTok, and other sites receive an enormous number of uploaded posts and videos on any given day, and while moderation guidelines are in place on all of them, malicious content can slip through the cracks.

When that happens, it’s up to these social networks to ensure that as few eyes as possible see the content. Recommendation systems make that harder. For example, if someone watches a video about a conspiracy theory on YouTube, chances are they’ll be recommended other videos about that conspiracy to watch next. These recommendations are meant to be harmless, but when extremist content enters the mix, they can play a huge role in further radicalizing the person on the other end. TikTok’s For You page is known for being remarkably good at pinpointing a user’s interests and feeding them videos based on those preferences. This latest report suggests it also has the potential to lead people down a rabbit hole by feeding them dangerous videos.

Related: TikTok: Why YouTube Shorts Is The Platform To Dethrone Short Video Royalty

In a study conducted by MediaMatters, the publication followed accounts found on the For You page and looked at which other accounts TikTok’s Suggested Accounts feature then recommended, as sketched below. In one example, MediaMatters followed a QAnon account from the For You page and was soon recommended another QAnon account, followed by a Three Percenter account (an extreme right-wing, anti-government militia group). Five other scenarios played out very similarly: MediaMatters followed an extremist account and was then fed piles of related accounts to keep the momentum going.
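
To make that methodology concrete, here’s a minimal Python sketch of the "follow one account, accept the next suggestion" process. The suggestion graph and account names below are invented for illustration; this is not real TikTok data, nor an actual TikTok API.

```python
# Toy model of the study's approach: follow one account, then keep following
# whatever the "Suggested Accounts" feature surfaces next. The graph and
# account names here are invented; this is not real TikTok data or its API.
SUGGESTED = {
    "qanon_account_1": ["qanon_account_2"],
    "qanon_account_2": ["three_percenter_account"],
    "three_percenter_account": ["militia_account"],
    "militia_account": [],
}

def trace_suggestion_chain(start: str, max_hops: int = 10) -> list[str]:
    """Follow the top suggestion at each step and record the resulting chain."""
    chain = [start]
    current = start
    for _ in range(max_hops):
        suggestions = SUGGESTED.get(current, [])
        if not suggestions:
            break
        current = suggestions[0]
        chain.append(current)
    return chain

print(" -> ".join(trace_suggestion_chain("qanon_account_1")))
# qanon_account_1 -> qanon_account_2 -> three_percenter_account -> militia_account
```

Even this toy version shows why a single follow matters: each suggestion becomes the seed for the next one.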

How TikTok’s Recommendations Work


Similar to many other apps, an algorithm sits at the center of TikTok’s recommendations. It analyzes the types of videos a user watches and interacts with, and based on that data, it makes personalized recommendations for that user. This is fine when someone is watching videos about cute puppies, comedy bits, or anything else lighthearted, but it’s easy to see how it can go from harmless to dangerous depending on the content in question.
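
To illustrate that feedback loop in the simplest possible terms, here’s a hypothetical, deliberately oversimplified sketch in Python. The videos, tags, and scoring are made up; TikTok’s actual algorithm is proprietary and vastly more sophisticated.

```python
from collections import Counter

# Hypothetical, oversimplified recommender for illustration only.
# TikTok's real algorithm is proprietary and far more sophisticated.
VIDEOS = {
    "puppy_video": {"puppies", "cute"},
    "comedy_video": {"comedy", "skits"},
    "conspiracy_video_1": {"conspiracy", "qanon"},
    "conspiracy_video_2": {"conspiracy", "militia"},
    "conspiracy_video_3": {"conspiracy", "three_percenters"},
}

def update_profile(profile: Counter, video_id: str, watched_fully: bool) -> None:
    """Weight each tag on a watched video by how strongly the user engaged."""
    weight = 2 if watched_fully else 1
    for tag in VIDEOS[video_id]:
        profile[tag] += weight

def recommend_next(profile: Counter, seen: set) -> str:
    """Pick the unseen video whose tags best match the user's profile."""
    unseen = [v for v in VIDEOS if v not in seen]
    return max(unseen, key=lambda v: sum(profile[t] for t in VIDEOS[v]))

# Simulate a user who lands on a single conspiracy video. Each watch
# reinforces the profile, which in turn steers the next recommendation.
profile: Counter = Counter()
seen: set = set()
video = "conspiracy_video_1"
for _ in range(2):
    seen.add(video)
    update_profile(profile, video, watched_fully=True)
    video = recommend_next(profile, seen)
    print(f"Next recommendation: {video}")
```

The takeaway is the loop itself: every watch skews the profile further toward the same tags, so each new recommendation is even more likely to come from the same cluster of content.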

While this isn’t an issue unique to TikTok (echo chambers can form on any social platform), it can be more damaging here than on some of its competitors. Beyond the accuracy of TikTok’s For You feature, its user base also skews young. According to a Statista report from June 2020, 32.5 percent of TikTok’s users are between 10 and 19 years old. Young users and readily available extremist videos are not a good combination.

It’s one thing if someone actively seeks out extremist content, but it’s quite another when these videos show up unprompted on a user’s For You page and the app keeps urging them to dig deeper and deeper. Those are the most worrying cases, and at the moment, it’s difficult to know what the solution is.

Next: Instagram Reels Vs. YouTube Shorts: Best TikTok Alternative

Source: MediaMatters, Statista

About the author

Joseph Maring

Joe has been actively writing and speaking about consumer tech since 2012. His biggest passion is smartphones, but he’s happy to tell you about pretty much anything with a CPU. He lives in Kalamazoo, MI, with his wife, two cats, and a pit bull/boxer mix.
