Earlier this year, TikTok said it was developing a new system that would restrict certain types of mature content from being viewed by teen users. Today, the company is introducing the first version of this system, called “Content Levels,” due to launch in a matter of weeks. It’s also preparing the rollout of a new tool that will allow users to filter videos with certain words or hashtags from showing up in their feeds. Together, the features are designed to give users more control over their TikTok experience while making the app safer, particularly for younger users.

With the new tools, TikTok aims to put more moderation control into the hands of users and content creators. This is an area where TikTok is facing increased scrutiny - not only from regulators and lawmakers looking to tighten their grip on social media platforms in general, but also from those seeking justice over social media’s harms. For instance, a group of parents recently sued TikTok after their children died attempting dangerous challenges they allegedly saw on the app. Meanwhile, former content moderators have sued the company for failing to support their mental health, despite the harrowing nature of their job.

The forthcoming Content Levels system is meant to provide a means of classifying content on the app, similar to how movies, TV shows and video games feature age ratings. Related to these new features, the company said it’s expanding its existing test of a system that works to diversify recommendations in order to prevent users from being repeatedly exposed to potentially problematic content - like videos about extreme dieting or fitness, sadness or breakups - following a 2021 Congressional inquiry into social apps like TikTok and others as to how their algorithmic recommendation systems could be promoting harmful eating disorder content to younger users.

TikTok admits the system still requires some work due to the nuances involved. For instance, it can be difficult to separate out content focused on recovering from eating disorders, which can carry both sad and encouraging themes. The company says it’s currently training the system to support more languages for future expansion to new markets.

As described, this trio of tools could make for a healthier way to engage with the app - but in reality, automated systems like these tend to have failures. So far, TikTok hasn’t been able to tamp down on problematic content in a number of cases - whether it’s kids destroying public school bathrooms, shooting each other with pellet guns or jumping off milk crates, among other dangerous challenges and viral stunts. It has also allowed hateful content involving misogyny, white supremacy or transphobic statements to fall through the cracks at times, along with misinformation. To what extent TikTok’s new tools will actually impact who sees what content remains to be seen.

“As we continue to build and improve these systems, we’re excited about the opportunity to contribute to long-running industry-wide challenges in terms of building for a variety of audiences and with recommender systems,” wrote TikTok Head of Trust and Safety Cormac Keenan in a blog post. “We also acknowledge that what we’re striving to achieve is complex and we may make some mistakes,” he added.