TikTok videos promoting anorexia use misspelled versions of popular hashtags to beat the 'pro-ana' ban

The TikTok logo displayed on a phone in China on March 3, 2020. Sheldon Cooper / SOPA Images / LightRocket via Getty Images

  • TikTok said it banned six accounts reported to it for posting content promoting harmful eating habits, part of its recent efforts to combat harmful content.

  • The app is rife with dangerous material, including "pro-ana" (pro-anorexia) and "pro-mia" (pro-bulimia) content of the kind that has plagued other social networks like Tumblr in the past.

  • Its algorithm means that when users engage with harmful content, TikTok keeps showing it to them. Even if they don't engage with it, it may show up on their For You page regardless.

  • The app has 800 million active users, 41% of whom are between 16 and 24 years old, according to DataReportal and GlobalWebIndex.

  • You can find more stories on the Insider homepage.

TikTok said it banned six accounts for posting content promoting eating habits that are likely to cause health problems, part of its recent efforts to combat harmful content.

The ban came in early December after a Guardian investigation found pro-anorexia material on the platform was still searchable.

The app has struggled with dangerous material, including "pro-ana" (pro-anorexia) and "pro-mia" (pro-bulimia) content of the kind that has plagued other social networks like Tumblr in the past.

Over the summer, the platform banned the hashtags #proana and #anorexia, which had 2.1 million and 446,000 views, respectively, according to Mashable UK. Users who searched for them were redirected to a support page titled "Need Help?"

However, if the same terms are entered as words rather than hashtags, the content remains accessible, and users have also been able to circumvent the ban by deliberately misspelling popular search terms.

Tom Quinn, director of external affairs at an eating disorders charity, told Insider: "It is important that TikTok continues to update its security practices to reduce its users' ability to post or find harmful content, including identifying misspellings of popular hashtags that are used to bypass the rules."
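The kind of misspelling detection Quinn describes can be sketched with simple fuzzy string matching. The banned-term list, character-substitution map, and similarity threshold below are all illustrative assumptions, not TikTok's actual moderation logic:

```python
from difflib import SequenceMatcher
import re

# Hypothetical banned-term list for illustration only.
BANNED_TERMS = {"proana", "promia", "anorexia", "bulimia"}

# Undo common digit-for-letter swaps: 0->o, 1->i, 3->e, 4->a, 5->s, 7->t.
LEET_MAP = str.maketrans("013457", "oieast")

def normalize(tag: str) -> str:
    """Lowercase a hashtag, reverse digit substitutions, drop punctuation."""
    tag = tag.lstrip("#").lower().translate(LEET_MAP)
    return re.sub(r"[^a-z]", "", tag)

def is_evasive_variant(tag: str, threshold: float = 0.8) -> bool:
    """Flag hashtags whose normalized form closely matches a banned term."""
    cleaned = normalize(tag)
    return any(
        SequenceMatcher(None, cleaned, term).ratio() >= threshold
        for term in BANNED_TERMS
    )

print(is_evasive_variant("#pr0anaa"))  # True: normalizes close to "proana"
print(is_evasive_variant("#baking"))   # False: no banned term is similar
```

A real moderation system would also have to weigh false positives — a threshold this loose would flag innocent tags that merely resemble banned terms — which is one reason misspelling-based evasion is hard to stamp out.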


Danae Mercer, a UAE-based journalist and body image influencer, began speaking about body confidence for a variety of reasons, including her own struggle with an eating disorder at the age of 19.

She told Insider: "Social media, like the pro-ana communities on Tumblr and websites discussing celebrity weight loss, really escalated my illness. That's why I'm trying to make social media the safer place I needed when I was sick.

"These communities massively drive eating disorders, fueling a cycle in which the illness is encouraged, tips are exchanged, and sickness is celebrated."

How TikTok works

TikTok has 800 million active users worldwide, 41% of whom are between 16 and 24 years old, according to DataReportal and GlobalWebIndex.

TikTok differs from other networks through its For You Page (FYP) algorithm, the inner workings of which the Chinese-owned app recently revealed in a push for greater transparency.

When a video is uploaded, it is shown to a small group of users, whether or not they follow the account it came from.

If they engage with the video by liking it, sharing it, or even just watching it, it is presented to a larger group.

If that larger group responds well, it is shown to an even larger group, and the process continues until the clip eventually goes viral.
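The staged rollout described above can be sketched as a toy simulation. All numbers here (batch size, growth factor, engagement threshold, round limit) are invented for illustration and are not TikTok's actual parameters:

```python
import random

def simulate_rollout(engage_prob: float,
                     batch_size: int = 100,
                     growth: int = 10,
                     threshold: float = 0.1,
                     max_rounds: int = 5,
                     seed: int = 42) -> int:
    """Toy model of staged distribution: each round, show the clip to a
    batch of users; if the engagement rate clears the threshold, multiply
    the audience and repeat.  Returns the total number of viewers reached."""
    rng = random.Random(seed)
    total_viewers = 0
    for _ in range(max_rounds):
        # Each viewer independently engages with probability engage_prob.
        engagements = sum(rng.random() < engage_prob for _ in range(batch_size))
        total_viewers += batch_size
        if engagements / batch_size < threshold:
            break  # weak response: the rollout stalls
        batch_size *= growth  # strong response: larger audience next round
    return total_viewers

print(simulate_rollout(0.30))  # high engagement: reach grows round after round
print(simulate_rollout(0.01))  # low engagement: rollout stalls early
```

The geometric expansion is what makes the feedback loop the article describes so powerful: a few rounds of above-threshold engagement multiply a clip's audience by orders of magnitude, regardless of whether that engagement signal is healthy or harmful.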

When users engage with harmful content, TikTok continues to display it, trapping them in a vicious circle. Even if they don't engage with it, it may show up on their FYP regardless.

In a statement, TikTok said that users searching for content related to eating disorders will be directed to the National Eating Disorders Association (NEDA) hotline, and that ads for fasting apps and weight-loss gimmicks will be banned for users under the age of 18.

Last year, Pinterest tackled a similar problem by training its algorithm to recognize content that encourages self-harm, and the company says such Pins are down 88%, according to Wired.

When users specifically search for the Pins that remain, Pinterest recommends a series of wellbeing exercises instead.

Read the original article on Insider
