TikTok to diversify its ‘For You’ feed, let users choose the topics they want to avoid – TechCrunch

TikTok announced this morning that it is taking a new approach to its “For You” feed – the short-form video app’s main feed, driven by its algorithmic recommendations. The company has previously detailed how its algorithm suggests videos based on users’ interaction patterns in the app, but it acknowledges that too much of a given content category can be “problematic”. The company now says it is working on new technology to break “repetitive patterns” in the app, and is also developing a tool that will let users weigh in directly by indicating which topics they want to avoid.

The company said in its announcement that “too much of anything, whether it’s animals, fitness tips, or personal wellness journeys, doesn’t fit the diverse discovery experience we’re trying to create.” But TikTok isn’t diversifying its algorithm because people are complaining about seeing too many cute puppy videos – it’s doing so because regulators are cracking down on the technology and questioning the harmful effects of untested recommendation algorithms, especially when it comes to teen mental health.

Facebook and Instagram executives, along with those of other social platforms, have been hauled before Congress and asked how their apps steer users toward dangerous content – including pro-anorexia and other eating disorder material.

TikTok’s announcement mentions the types of videos that could be harmful when viewed in excess, including videos about extreme dieting or fitness, sadness, and breakups. A user who engages with one such video may well find it relevant, but the algorithm isn’t yet smart enough to know that serving up more and more of the same could actually harm that user. The problem, of course, isn’t limited to TikTok. Across the industry, it has become clear that systems designed solely to maximize engagement through automated means do so at the expense of users’ mental health. And while Congress is currently most interested in how these systems affect young people, some studies – though contested – have suggested that untested recommendation algorithms can also play a role in radicalizing users drawn to extreme views.

TikTok says it will also test new ways to avoid recommending a series of similar videos when users watch and engage with content in these potentially harmful categories. But it offered only examples of the kinds of videos it would limit, not an exhaustive list.
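TikTok didn’t describe how that dispersion would work under the hood. As a purely illustrative sketch – with a made-up candidate format and hypothetical function name, not TikTok’s actual system – one common approach is to re-rank a scored candidate list so that no topic can occupy too many consecutive slots:

```python
def disperse_feed(candidates, max_run=2):
    """Re-rank a score-sorted candidate list so that no more than
    `max_run` consecutive videos share the same topic label.

    `candidates` is a list of (video_id, topic, score) tuples,
    sorted by score descending. Purely illustrative -- not
    TikTok's actual algorithm.
    """
    remaining = list(candidates)
    feed = []
    while remaining:
        placed = False
        for i, item in enumerate(remaining):
            topic = item[1]
            # Length of the current run of this topic at the feed's tail.
            run = 0
            for prev in reversed(feed):
                if prev[1] == topic:
                    run += 1
                else:
                    break
            if run < max_run:
                feed.append(remaining.pop(i))
                placed = True
                break
        if not placed:
            # Every remaining candidate would extend the run past the
            # cap; take the best-scoring one rather than stall.
            feed.append(remaining.pop(0))
    return feed

# Example: three "fitness" candidates in a row get broken up.
# disperse_feed([("v1", "fitness", 0.9), ("v2", "fitness", 0.8),
#                ("v3", "fitness", 0.7), ("v4", "cooking", 0.6)])
# -> fitness, fitness, cooking, fitness
```

The trade-off in a scheme like this is deliberate: the re-ranker gives up a little predicted engagement in exchange for capping how long any single topic can dominate the feed.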

Additionally, the company said it is developing technology that will help it recognize when a user’s “For You” page isn’t very diverse. While the user may not be watching videos that actually violate TikTok’s guidelines, the company said that watching “very limited types of content …”
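The company didn’t say how it would measure that lack of diversity, but a standard signal for this kind of check is the entropy of the topic mix in a user’s recent viewing history. Here is a minimal sketch, assuming videos already carry topic labels – the function and the normalization choice are illustrative, not TikTok’s disclosed method:

```python
import math
from collections import Counter

def feed_diversity(recent_topics):
    """Shannon entropy of the topic mix in a user's recent
    'For You' history, normalized to [0, 1]. Values near 0 mean
    the feed is dominated by one or two topics. Illustrative
    only; TikTok has not disclosed its actual diversity metric.
    """
    counts = Counter(recent_topics)
    if not counts:
        return 0.0
    total = sum(counts.values())
    entropy = -sum(
        (n / total) * math.log2(n / total) for n in counts.values()
    )
    # Normalize by the maximum possible entropy for this many topics.
    max_entropy = math.log2(len(counts)) if len(counts) > 1 else 1.0
    return entropy / max_entropy

# A feed that is 95% one topic scores low:
# feed_diversity(["fitness"] * 19 + ["pets"])  # ~0.29
```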

Another strategy TikTok wants to introduce is a feature that would let users steer the algorithm themselves. Users could select words or hashtags associated with content they don’t want to see in their “For You” feed. This would complement TikTok’s existing tools for flagging unwanted videos, such as tapping “Not Interested”.
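TikTok hasn’t shared how such a muting tool would be implemented, but conceptually it amounts to a post-filter over recommendation candidates before they reach the feed. A hedged sketch, using a hypothetical candidate schema:

```python
def filter_muted(candidates, muted_terms):
    """Drop candidate videos whose caption or hashtags contain any
    term the user has muted. Each candidate is a dict with
    'caption' and 'hashtags' keys -- a made-up schema for
    illustration, not TikTok's real data model.
    """
    muted = {t.lower().lstrip("#") for t in muted_terms}
    kept = []
    for video in candidates:
        hashtags = {h.lower().lstrip("#") for h in video["hashtags"]}
        caption_words = {w.lstrip("#") for w in video["caption"].lower().split()}
        if muted & (hashtags | caption_words):
            continue  # the user asked not to see this topic
        kept.append(video)
    return kept

# Example:
# filter_muted(
#     [{"caption": "my workout routine", "hashtags": ["#fitness"]},
#      {"caption": "pasta night", "hashtags": ["#cooking"]}],
#     muted_terms=["fitness"],
# )
# -> only the cooking video survives
```

A real system would likely match topics semantically rather than by exact string, since users can’t be expected to enumerate every variant hashtag.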

To be clear, TikTok’s announcement today lays out a roadmap of its plans rather than actually shipping these changes and features. It also reads as an attempt to deter regulators from digging further into the app and its potentially harmful effects. The strategy was likely shaped by the kinds of questions TikTok faced during its own Congressional hearing, and by those put to its rivals.

TikTok cautions that the actual implementation will take time and iteration to get right.

“We’ll continue to look at how we can ensure our system is making a diversity of recommendations,” the company said.
