    TikTok has revealed greater insight into how its recommendation algorithm works in a new blog post.

    The algorithm bases its suggestions of 15-second clips on a number of factors.

    These include user interactions, such as the videos a user likes or shares, the accounts they follow, and the comments they post, as well as video information such as sounds and hashtags.

    TikTok also uses your device and language settings to “make sure the system is optimised for performance”, but these receive lower weight than the other metrics “since users don't actively express these as preferences.”
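
    As a rough illustration of that weighting, a minimal Python sketch might combine the three groups of signals into a single score, with interaction signals counting for more than device settings. The weights, field names and scoring function here are assumptions for illustration, not TikTok's published code:

    ```python
    from dataclasses import dataclass

    # Illustrative weights only; TikTok has not published the real values.
    INTERACTION_WEIGHT = 1.0   # likes, shares, follows, comments
    VIDEO_INFO_WEIGHT = 0.8    # sounds, hashtags
    DEVICE_WEIGHT = 0.2        # device/language settings, weighted lower

    @dataclass
    class Video:
        creator: str
        hashtags: set
        sound: str
        language: str

    @dataclass
    class Viewer:
        liked_hashtags: set
        followed_creators: set
        liked_sounds: set
        language: str

    def score_video(viewer: Viewer, video: Video) -> float:
        """Combine the three signal groups into one interest score."""
        interaction = float(video.creator in viewer.followed_creators)
        video_info = (len(viewer.liked_hashtags & video.hashtags)
                      + float(video.sound in viewer.liked_sounds))
        device = float(video.language == viewer.language)
        return (INTERACTION_WEIGHT * interaction
                + VIDEO_INFO_WEIGHT * video_info
                + DEVICE_WEIGHT * device)
    ```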

    Many of TikTok’s algorithm influences will be familiar; similar features are offered by other social media sites such as Twitter and YouTube upon setting up a new account.

    When a user first signs up to TikTok, they will be asked to choose categories of interests such as ‘pets’ or ‘travel’.

    Users who do not make a selection are given a generalised feed of popular videos, with their likes, comments, and replays pushing the recommendation algorithm towards the content they enjoy.

    Long-pressing to add a video to your favourites, or hitting the ‘Not Interested’ button, will also affect how similar content is recommended to you.
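
    A minimal sketch of how such explicit feedback might nudge a preference profile follows. The actions, hashtag-based profile and update amounts are illustrative assumptions, not TikTok's actual mechanism:

    ```python
    from collections import defaultdict

    # Hypothetical update amounts; TikTok does not publish these.
    FEEDBACK_DELTAS = {
        "like": 1.0,
        "favourite": 1.5,        # long-press to add to favourites
        "not_interested": -2.0,  # explicit negative signal
    }

    def update_preferences(preferences, video_hashtags, action):
        """Boost or penalise hashtag preferences after a user action."""
        for tag in video_hashtags:
            preferences[tag] += FEEDBACK_DELTAS.get(action, 0.0)
        return preferences

    prefs = defaultdict(float)
    update_preferences(prefs, {"skateboarding", "pets"}, "favourite")
    update_preferences(prefs, {"pets"}, "not_interested")
    print(dict(prefs))  # e.g. {'skateboarding': 1.5, 'pets': -0.5}
    ```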

    One of the strongest indicators of interest is whether a user watches a longer video from start to finish, which carries greater weight in the recommendation algorithm than other factors such as geolocation, TikTok says.

    “Videos are then ranked to determine the likelihood of a user's interest in a piece of content, and delivered to each unique For You feed.”
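
    A simple sketch of that ranking step, assuming only two signals, watch completion and geolocation, with completion weighted far more heavily (the weights and function names are hypothetical):

    ```python
    # Hypothetical ranking step: completion rate outweighs geolocation.
    COMPLETION_WEIGHT = 2.0
    GEO_WEIGHT = 0.1

    def rank_for_you_feed(candidates, watch_fraction, same_region):
        """Order candidate video IDs by a predicted-interest score."""
        def predicted_interest(video_id):
            return (COMPLETION_WEIGHT * watch_fraction.get(video_id, 0.0)
                    + GEO_WEIGHT * float(same_region.get(video_id, False)))
        return sorted(candidates, key=predicted_interest, reverse=True)

    # A fully watched clip outranks a merely local one.
    print(rank_for_you_feed(
        ["skate_clip", "local_news"],
        watch_fraction={"skate_clip": 1.0, "local_news": 0.2},
        same_region={"local_news": True},
    ))  # ['skate_clip', 'local_news']
    ```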

    While TikTok users with larger followings are likely to get more views by the nature of social media, TikTok says this does not affect recommendations as much as the content itself: "neither follower count nor whether the account has had previous high-performing videos are direct factors in the recommendation system."

    As such, unless a popular TikTok user started posting about content you were specifically interested in, such as Tony Hawk exchanging skateboards with a six-year-old, or a prisoner who started his own cooking show, or tips on how to remove bugs from strawberries, it’s unlikely that you would ever see them on your For You feed.

    That said, TikTok will still occasionally splice in content you would not normally see in order to keep the feed diverse; this can include content recommended to users with similar interests.

    “Your For You feed generally won't show two videos in a row made with the same sound or by the same creator. We also don't recommend duplicated content, content you've already seen before, or any content that's considered spam,” TikTok says.
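
    Those rules read as a filtering pass over the ranked list. A minimal sketch, assuming each video is a dictionary with id, sound and creator fields (an assumed data model, for illustration only):

    ```python
    def diversify(ranked_videos, seen_ids, spam_ids):
        """Filter a ranked list so the feed obeys the stated rules."""
        feed, previous = [], None
        for video in ranked_videos:
            if video["id"] in seen_ids or video["id"] in spam_ids:
                continue  # drop already-seen, duplicated or spam content
            if previous and (video["sound"] == previous["sound"]
                             or video["creator"] == previous["creator"]):
                continue  # no two in a row with the same sound or creator
            feed.append(video)
            previous = video
        return feed
    ```

    A real system would more likely demote a second clip from the same creator or sound further down the feed rather than drop it entirely, but the constraints are the same.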

    TikTok also uses that data to get a “better sense of what's popular among a wider range of audiences,” so they can perfect the balance between content the user wants to see, and new content the user might enjoy.

    Many companies keep their recommendation algorithms a closely guarded secret, although that has resulted in much criticism when the system makes mistakes because of company decisions or bad data.

    Recently, TikTok itself had to apologise to Black users as it appeared to hide Black Lives Matter posts and George Floyd hashtags while protests continued in the US and UK.

    TikTok claimed that the posts were actually visible, and viewable, on the site, but that a “display issue” bug affected large hashtags.

    However, the company apologised to the Black community, saying that it “know[s] we have work to do to regain and repair that trust.”

    Similarly, Instagram has said it needs to change how its algorithm works to “address the inequalities Black people face” such as how it filters content and whether people get ‘shadowbanned’.

    The Facebook-owned photo-sharing site said that it would be releasing more information about the types of content it does not recommend on its Explore tab “and other places”.
