
The TikTok algorithm quickly became the key to the video streaming app’s success, and it has also drawn the most criticism. Now the company is giving users the ability to filter out topics they don’t want to see.

The company is also introducing new auto-moderation tools, including one that (finally!) applies age restrictions to videos that are inappropriate for children, and another that aims to address the “rabbit hole” problem, where users are shown a string of depressive or otherwise potentially dangerous videos.

The TikTok algorithm

TikTok differs from regular video streaming apps like YouTube in that its algorithm has much more control over what you see. Instead of choosing the videos you want to watch, you pick a few initial interests and then the algorithm takes over.

TikTok infers your tastes from a range of cues, including the videos you watch, like, and share, as well as the accounts you follow.
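To make that concrete, here is a minimal sketch of how engagement signals like these could be aggregated into per-topic interest scores. The signal names and weights are assumptions for illustration, not TikTok’s actual model.

```python
from collections import defaultdict

# Assumed relative weights per engagement signal; a stronger action
# (share, follow) counts for more than a passive view. These numbers
# are illustrative, not TikTok's.
SIGNAL_WEIGHTS = {"watched": 1.0, "liked": 2.0, "shared": 3.0, "followed": 4.0}

def score_interests(events):
    """Aggregate (topic, signal) engagement events into per-topic scores."""
    scores = defaultdict(float)
    for topic, signal in events:
        scores[topic] += SIGNAL_WEIGHTS.get(signal, 0.0)
    return dict(scores)

history = [
    ("cooking", "watched"), ("cooking", "liked"),
    ("fitness", "watched"), ("cooking", "shared"),
]
print(score_interests(history))  # {'cooking': 6.0, 'fitness': 1.0}
```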

This proved to be a hugely successful approach for the company, measured by both app downloads and app usage, but was also heavily criticized. One of the main criticisms was that it quickly puts users in “silos” where they only ever see a tiny subset of content.

A study last year showed that this could be dangerous.

One bot was programmed with sadness and depression as “interests”. Less than three minutes after it starts using TikTok, on its 15th video, [bot] kentucky_96 stops on [a sad video about losing people from your life]. Kentucky_96 watches the 35-second video twice. This is the first time TikTok guesses that the new user may have been feeling down lately. […]

The user then stops on a mental health video and quickly scrolls past videos about missing an ex, advice on how to move on, and how to keep a lover’s interest. But kentucky_96 lingers on a video tagged #depression and on videos about suffering from anxiety.

After 224 videos in the bot’s journey, or about 36 minutes of total watch time, TikTok’s picture of kentucky_96 has taken shape: videos about depression and mental health issues outnumber videos about relationships and breakups. From then on, kentucky_96’s feed is a stream of depressing content; 93% of the videos shown to the account are about sadness or depression.
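The experiment shows how much weight dwell time alone can carry. Here is a toy sketch of that idea, assuming (hypothetically) that a topic’s score grows with the share of a video the user actually watches, so watching a 35-second video twice counts as a completion ratio of 2.0:

```python
from collections import defaultdict

def update_profile(profile, topic, video_len_s, watched_s):
    """Add watch-time evidence for a topic.

    The completion ratio (seconds watched / video length) is an assumed
    proxy for interest: re-watching scores 2.0, a quick scroll-past ~0.1.
    """
    profile[topic] += watched_s / video_len_s
    return profile

profile = defaultdict(float)
update_profile(profile, "depression", 35, 70)  # lingered, watched twice
update_profile(profile, "breakups", 40, 4)     # scrolled past quickly
print(max(profile, key=profile.get))           # -> 'depression'
```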

TikTok also appears to be extremely bad at filtering out particularly dangerous content, such as the “blackout challenge” that is said to have caused the deaths of seven children.

Keyword Filters

For the first time, TikTok is offering users the ability to filter out certain types of content by blacklisting specific words and hashtags.

Viewers can [already] use our “not interested” feature to automatically skip videos from a creator or videos that use the same sound. To further empower viewers to customize their viewing experience, we’re rolling out a tool people can use to automatically filter out videos with words or hashtags they don’t want to see in their For You or Following feeds, whether because you’ve just finished a home project and no longer need DIY tutorials, or because you want to see fewer dairy or meat recipes as you move toward more plant-based meals. This feature will be available to everyone in the coming weeks.
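A minimal sketch of what such blocklist filtering could look like, assuming a simple video record with a caption and hashtags; TikTok has not published its implementation:

```python
from dataclasses import dataclass

@dataclass
class Video:
    caption: str
    hashtags: set  # lowercase hashtag strings, without the '#'

def filter_feed(feed, blocked_words, blocked_hashtags):
    """Drop videos whose caption or hashtags match the user's blocklist."""
    def allowed(v):
        caption = v.caption.lower()
        return (not any(word in caption for word in blocked_words)
                and not (v.hashtags & blocked_hashtags))
    return [v for v in feed if allowed(v)]

feed = [
    Video("Easy DIY shelf build", {"diy", "home"}),
    Video("Five-minute lentil curry", {"recipe", "vegan"}),
]
# The DIY video is filtered out; only the curry video remains.
print(filter_feed(feed, blocked_words={"diy"}, blocked_hashtags={"dairy"}))
```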

Age restrictions on videos

TikTok is finally introducing age restrictions on videos that are not suitable for children. Previously, the app warned younger users that a video might be inappropriate but still allowed them to watch it. Now the company blocks children from watching such videos outright.

In the coming weeks, we will begin rolling out an early version of this system to prevent overtly mature content from reaching audiences aged 13 to 17. When we detect that a video contains mature or complex themes, such as fictional scenes that may be too frightening or intense for younger audiences, the video will be assigned a maturity score to help prevent anyone under 18 from viewing it on TikTok.
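In other words, each video carries a maturity score and viewers under 18 are gated on it. A minimal sketch of that gate, where the 0-to-1 scale and the 0.7 cutoff are assumptions (TikTok describes the mechanism but not its numbers):

```python
MATURITY_THRESHOLD = 0.7  # assumed cutoff on an assumed 0-1 scale

def can_view(viewer_age: int, maturity_score: float) -> bool:
    """Block under-18 viewers from videos scored as overtly mature."""
    if maturity_score >= MATURITY_THRESHOLD:
        return viewer_age >= 18
    return True

print(can_view(15, 0.9))  # False: mature video, minor viewer
print(can_view(15, 0.2))  # True: ordinary content
print(can_view(22, 0.9))  # True: adult viewer
```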

TikTok’s algorithm will limit potentially harmful content

The TikTok algorithm is also being trained to address the rabbit hole problem of a stream of potentially harmful content.

Last year, we began testing ways to avoid recommending a series of similar content on topics that may be fine as a single video but potentially problematic if viewed repeatedly, such as topics related to dieting, extreme fitness, sadness, and other well-being topics. We also tested ways to recognize whether our system might inadvertently be recommending a narrower range of content to a viewer.

As a result of our testing and iterations in the US, we’ve improved the viewing experience so that viewers now see fewer videos about these topics at a time.
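One plausible reading of “fewer videos about these topics at a time” is a cap on how many videos on a single sensitive topic can appear consecutively. A minimal sketch of that dispersal idea, where the topic list and the cap of two are assumptions:

```python
SENSITIVE_TOPICS = {"dieting", "extreme fitness", "sadness"}
MAX_RUN = 2  # assumed limit on consecutive videos per sensitive topic

def limit_runs(ranked_topics):
    """Keep the ranked order but suppress videos that would extend a run
    of one sensitive topic beyond MAX_RUN; a real system would backfill
    the gaps with other recommendations."""
    feed = []
    run_topic, run_len = None, 0
    for topic in ranked_topics:
        if topic == run_topic:
            if topic in SENSITIVE_TOPICS and run_len >= MAX_RUN:
                continue  # suppress the repeat, breaking up the cluster
            run_len += 1
        else:
            run_topic, run_len = topic, 1
        feed.append(topic)
    return feed

print(limit_runs(["sadness", "sadness", "sadness", "cooking", "sadness"]))
# -> ['sadness', 'sadness', 'cooking', 'sadness']
```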

Photo: Florian Schmetz/Unsplash
