WSJ’s deep dive into TikTok’s eating disorder rabbit holes explains a sudden policy change

A troubling report in the Wall Street Journal looks at the personal experiences of young girls who were sent down TikTok rabbit holes filled with extreme weight loss challenges, detox techniques, and dangerous diets – content that contributed to the development of eating disorders or exacerbated existing ones. The WSJ also ran its own experiment to see how TikTok’s algorithm can promote this type of harmful content, and its results could explain TikTok’s sudden decision to change the way its video recommendation system works.

As noted in the report, the WSJ created over 100 accounts “that crawled the app with little human intervention,” 12 of which were bots registered to 13-year-olds that spent time watching videos about weight loss, alcohol, and gambling. A graphic included in the report shows that as soon as one of the bots abruptly stopped watching gambling-related videos and started spending time on weight loss videos, TikTok’s algorithm adjusted accordingly: it quickly increased the number of weight loss videos the bot was shown, mirroring the change in viewing behavior.


At the end of its experiment, the WSJ found that of the 255,000 videos the bots had watched in total, 32,700 contained a description or metadata that matched a list of hundreds of weight loss keywords. Another 11,615 videos contained text descriptions that matched keywords relevant to eating disorders, while 4,402 contained a combination of keywords suggestive of the normalization of eating disorders.
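The WSJ hasn’t published the code behind its analysis, but the approach it describes – checking each video’s description and metadata against curated keyword lists – can be sketched roughly as follows. The keywords, field names, and sample data below are illustrative assumptions, not the WSJ’s actual lists:

```python
# Illustrative sketch of the keyword-matching approach the WSJ describes.
# The keyword sets and video fields below are made up for illustration;
# the WSJ's real lists reportedly contained hundreds of terms.

WEIGHT_LOSS_KEYWORDS = {"weight loss", "calorie deficit", "detox"}
ED_KEYWORDS = {"thinspo", "fasting challenge"}

def matches_keywords(video: dict, keywords: set) -> bool:
    """Return True if any keyword appears in the video's description or hashtags."""
    text = " ".join([
        video.get("description", ""),
        " ".join(video.get("hashtags", [])),
    ]).lower()
    return any(keyword in text for keyword in keywords)

videos = [
    {"description": "My 30-day weight loss journey", "hashtags": ["fitness"]},
    {"description": "Cooking pasta tonight!", "hashtags": ["food"]},
]

weight_loss_hits = [v for v in videos if matches_keywords(v, WEIGHT_LOSS_KEYWORDS)]
print(len(weight_loss_hits))  # prints 1
```

Note that under this kind of scheme a single video can match more than one category, which is consistent with the overlapping counts the WSJ reports.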

A number of these videos reportedly used alternate spellings for keywords related to eating disorders to avoid being flagged by TikTok. After the WSJ alerted the platform to a sample of 2,960 videos related to eating disorders, 1,778 were removed – the WSJ says it is unclear whether they were taken down by TikTok or by the creators themselves.
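Neither the WSJ nor TikTok explains how such deliberate misspellings get caught, but a common moderation countermeasure – shown here purely as a hypothetical illustration, not TikTok’s actual method – is to normalize lookalike characters and strip filler punctuation before matching:

```python
# Hypothetical illustration: normalize common character substitutions and
# separators so that obfuscated spellings match plain keyword lists.
# The substitution table and examples are assumptions, not TikTok's system.

LOOKALIKES = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a", "@": "a", "$": "s"})

def normalize(text: str) -> str:
    """Lowercase, map lookalike characters, and drop non-alphanumeric separators."""
    text = text.lower().translate(LOOKALIKES)
    return "".join(ch for ch in text if ch.isalnum() or ch.isspace())

print(normalize("th1nsp0"))        # -> thinspo
print(normalize("f.a.s.t.i.n.g"))  # -> fasting
```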

Just a day before the WSJ’s report was released, TikTok announced that it was working on new ways to stop these dangerous rabbit holes from forming. That announcement also came just days after the WSJ contacted TikTok for comment on its upcoming story.

In its post, TikTok acknowledges that watching certain types of content over and over again, including videos about extreme dieting and fitness, is not always healthy. It is now working to determine if its recommendation system is inadvertently serving up videos that may not violate TikTok’s guidelines, but could be harmful if consumed in excess. The platform also says it is testing a tool that will allow users to prevent videos with certain words or hashtags from appearing on their For You page.

“While this experiment doesn’t reflect most people’s experience of TikTok, even one person having that experience is one too many,” TikTok spokesperson Jamie Favazza said in a statement to The Verge. “We allow educational or recovery-oriented content because we understand that it can help people see hope, but content that promotes, normalizes, or glorifies eating disorders is prohibited.” The spokesperson also pointed out that TikTok provides access to the National Eating Disorders Association hotline within the app.

If this situation sounds familiar, it’s because Instagram has already been through (and is still dealing with) it. After whistleblower Frances Haugen leaked the Facebook Papers, a collection of revealing internal Facebook documents, Instagram quickly worked to plug the holes in its sinking ship.

The papers show that Facebook conducted its own research into Instagram’s effects on teenagers and found that the app could wreak havoc on their mental health and worsen body image issues in young girls. About a month later, Instagram announced plans to roll out a feature that would steer teenagers away from potentially harmful content. It also introduced a “take a break” feature, which prompts users to close the app once they’ve spent a set amount of time on the platform – 10, 20, or 30 minutes.
