Anti-Vaxx TikTok Videos Are Being Seen by Children as Young as 9

  • A study shows that children are exposed to COVID misinformation on TikTok within minutes of signing up.
  • Although TikTok prohibits users under the age of 13, younger kids can easily lie about their age in order to sign up.
  • Social media companies continue to face public criticism for their impact on young users.


At this point, it’s no secret that social media algorithms inadvertently help spread COVID misinformation to millions of users. The more pressing question is who that content reaches.

The popular social media app TikTok serves young children misinformation, sometimes within minutes of signing up. False information reached children as young as nine years old, even when the young users neither followed nor searched for that content.

According to a report by the media ratings firm NewsGuard, COVID-related misinformation reached eight of the study’s nine children within their first 35 minutes on the platform, and two-thirds of participants saw incorrect information about the COVID vaccines. This included unsubstantiated claims about COVID, the vaccines, and homeopathic remedies for the disease.

“TikTok’s failure to stop the spread of dangerous health misinformation in its app is untenable and verges on dangerous,” Alex Cadier, NewsGuard’s UK editor-in-chief, who co-authored the report, told the Guardian. “Despite claims to address misinformation, the app still allows anti-vaccine content and health hoaxes to spread relatively freely.”

NewsGuard conducted the study in August and September, asking children ages nine to 17 from a range of cultural backgrounds to create accounts on TikTok. Although the platform restricts full access to the app for users under the age of 13, the three youngest participants were able to create accounts unaided. According to Statista, a quarter of the 130 million monthly active TikTok users in the US were between the ages of 10 and 19 as of March 2021.

“TikTok is very bad at removing misinformation videos, and those vaccine misinformation videos stay on the platform for months,” Katrine Wallace, an epidemiologist at the University of Illinois School of Public Health who battles misinformation on TikTok, told Insider. “The more viral these videos get, the more eyes will see them, and unfortunately, due to the nature of the algorithms, some will be kids.”

TikTok’s Community Guidelines prohibit “incorrect or misleading” content about COVID-19 and its vaccines, and the company employs teams that work to identify and remove misinformation and to evaluate all COVID-related content on a case-by-case basis.

The app also says it is working toward an “age-appropriate experience,” deterring and removing accounts created by underage users and restricting LIVE and direct-messaging features for younger teens. Douyin, the Chinese version of TikTok, announced in September that users under the age of 14 would be limited to 40 minutes of use per day.

TikTok did not respond to a request for comment on the NewsGuard report.

Beyond TikTok, platforms such as Facebook, Instagram, and Twitter have also come under fire in recent months, as greater transparency into these companies has revealed more about social media’s effects on society, especially on the younger generation. This week, a Facebook whistleblower helped shed light on how the company’s platforms psychologically harm young users. Meanwhile, high-profile social media influencers continue to spread COVID misinformation, adding to the harmful content reaching younger viewers.
