Russia on verge of throwing out Chinese-owned TikTok, launches homemade TikTok rival

Russia is on the verge of ditching TikTok, the short video creation app owned by China-based ByteDance Ltd. The Russian government has launched a homemade TikTok called Yappy to capitalize on TikTok’s popularity. There are currently 70 million monthly TikTok users in Russia. The Russian government, which has amicable ties with China, appears to be moving slowly on a possible ban of TikTok over its foreign connections. Russia fears that foreign tech companies could corrupt the mindset of its people.

Russia’s leading media group recently launched a domestic rival to the hugely popular TikTok video sharing app, a move Russian media have described as part of Russia’s campaign to reduce the influence of foreign websites and technology companies. Gazprom Media, a subsidiary of the state gas giant Gazprom, has launched the service, called “Yappy”, which can be downloaded from the Apple App Store and Google Play. The Yappy app was developed with support from the Innopraktika Foundation, an organization run by Katerina Tikhonova, one of the alleged daughters of President Vladimir Putin. The service has a number of features similar to TikTok’s and is based on sharing short vertical video clips of up to 60 seconds in length.

What prompted Russia to offer a homemade TikTok alternative?

Russia resented the Chinese-made TikTok app over certain offensive posts targeting children. The posts reportedly incited children to participate in unauthorized street protests in support of imprisoned Kremlin critic Alexei Navalny.

TikTok came under fire along with US tech giants for refusing to remove posts and ignoring requests from the Russian government.

A Russian court even fined TikTok 2.5 million rubles ($34,000) for failing to delete illegal content encouraging minors to participate in unauthorized protests in Moscow.

TikTok – most popular video sharing app in Russia

Without a doubt, the Chinese video sharing app TikTok has been the most downloaded phone application in Russia and elsewhere in the world. But now Russia’s Gazprom-Media has entered the business to capitalize on TikTok’s popularity.

In late 2020, Russia announced its decision to create a domestic alternative to TikTok.

Russia has since named TikTok as one of 13 international social media platforms required to open an office on Russian soil by the end of 2021 – the latest law that, according to critics, is designed to curb the dominance of foreign tech companies and social media platforms in Russia. Whether in Russia or Australia, TikTok is under attack over adverse health effects, especially on children.

Controversy over TikTok’s algorithm

The powerful TikTok algorithm is like nothing the world has seen before. TikTok is alleged to have shared data with the Chinese government. A joint investigation by the Australian Broadcasting Corporation’s triple j Hack and Four Corners found that the TikTok algorithm exposes Australians to dangerous content while controlling which people and political movements grab users’ attention. The investigation produced startling findings: TikTok, for example, claims its mission is to “inspire creativity and bring joy,” but it risks distorting the way a large part of a generation sees the world, and not always for the better. Upon signing in, TikTok begins collecting data about the user’s location, gender, and age, and, more controversially, their facial data.

The more a user likes videos, follows accounts, or watches a TikTok video to the end, the more the algorithm learns about that user’s interests.

It is very hard to break this cycle, and the feed is designed so that the user never actually reaches the end of the content. The longer a user keeps scrolling, the more ads that user is likely to see. That has helped catapult TikTok’s Chinese parent company ByteDance to a valuation of more than 250 billion dollars.
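To make that feedback loop concrete, here is a minimal, purely illustrative sketch of an engagement-weighted feed ranker written in Python; the ToyFeedRanker class, the topics and the weights are invented for this example and do not describe TikTok’s actual system.

# Illustrative toy model of an engagement-driven feed ranker.
# NOT TikTok's real algorithm: the signals and weights below are
# assumptions chosen only to show the loop described above, where
# likes, follows and full watches raise the score of similar
# content, which is then shown more often.
from collections import defaultdict

# Assumed engagement weights (hypothetical values).
WEIGHTS = {"like": 1.0, "follow": 2.0, "watched_to_end": 1.5}

class ToyFeedRanker:
    def __init__(self):
        # Learned interest score per content topic.
        self.interest = defaultdict(float)

    def record_event(self, topic, event):
        """Update interest in a topic from one engagement event."""
        self.interest[topic] += WEIGHTS.get(event, 0.0)

    def rank(self, candidate_videos):
        """Order candidate videos by the viewer's learned topic interest."""
        return sorted(candidate_videos,
                      key=lambda v: self.interest[v["topic"]],
                      reverse=True)

# Example: a few engagement events quickly skew what gets shown first.
ranker = ToyFeedRanker()
ranker.record_event("dance", "like")
ranker.record_event("dance", "watched_to_end")
ranker.record_event("news", "like")

feed = ranker.rank([
    {"id": 1, "topic": "news"},
    {"id": 2, "topic": "dance"},
    {"id": 3, "topic": "cooking"},
])
print([v["id"] for v in feed])  # dance first, then news, then cooking

Even in this toy version, a handful of likes and full watches is enough to push one topic to the top of every subsequent feed, which is the dynamic the researchers below describe.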

Researchers have claimed that TikTok promotes eating disorders. They say many factors contribute to eating disorders, but TikTok’s algorithm finds vulnerable people and then plays on that vulnerability. Dr. Suku Sukunesan from Swinburne University, who has advised TikTok on how to make the app safer, embedded himself in the app’s eating disorder communities.

“I was immediately given all of this eating disorder content. After a few hours, TikTok suggested 30 different accounts to follow and they were all people living with eating disorders,” he said. According to Dr. Sukunesan, these TikToks effectively teach people how to have an eating disorder, and the algorithm can lead them to more serious videos, such as videos that encourage self-harm. “It’s almost like an endless pit and you will find that these kids would end up doing more harm to themselves,” he said.

The company’s policy stated that TikTok “prohibits content that depicts, encourages, normalizes, or glorifies activities that could result in suicide, self-harm, or eating disorders.”

A user attempted to report videos promoting eating disorders, only to be told that they did not violate any of TikTok’s policies. TikTok’s response to the issue has been to ban pro-eating-disorder hashtags so that users cannot search for these videos. When a user searches for one of those hashtags, the app instead displays a number for the eating disorder support service, The Butterfly Foundation. “Our teams are consulting with NGOs and other partners to keep the list of keywords on which we intervene updated,” said a TikTok spokeswoman.

Another TikTok user told Hack and Four Corners that when she reported a viral video of a man taking his own life, it too was found not to violate the app’s community guidelines. According to several researchers, it takes less than 30 seconds to find harmful content on TikTok and only a few hours for the algorithm to dominate a person’s feed with offensive videos. Tech advocacy organization Reset Australia ran experiments and found that it took about four hours for the algorithm to detect that a 13-year-old was interested in racist content and about seven hours for sexist videos to flood someone’s feed. The longer these users watch this type of content, the more frequently it appears.

While TikTok has come under pressure to delete malicious videos, it has also been accused of using the algorithm to censor and suppress posts for the wrong reasons. In July, several Black influencers went on an indefinite strike, refusing to choreograph the viral dances that TikTok relies on and accusing the app of exploiting their creativity without favoring them in the algorithm. The app has separately been accused of suppressing posts from creators considered “ugly, poor or disabled”.

Last year, TikTok apologized for suppressing posts with the hashtags “Black Lives Matter” and “George Floyd” after thousands of creators took to the platform to protest that their videos were being suppressed or their accounts blocked. The Australian Strategic Policy Institute (ASPI) conducted the first academic research into TikTok’s censorship, finding that the company actively used its algorithm to hide political statements it deemed controversial.

The US State Department-funded study found that hashtags relating to the mass incarceration of Uyghurs, the Hong Kong protests, LGBTQI issues and anti-Russian-government videos were suppressed. In a statement, TikTok denied that the company engaged in censorship.

–IANS


