The updates follow a range of reports of significant injuries resulting from TikTok challenges, along with expert advice on how exposure to certain types of material can impact TikTok’s overwhelmingly young audience.
TikTok says that all users will be notified of the updates in the app over the coming weeks, but here’s a look at the specific elements of focus with these changes.
First off, TikTok’s updating its rules around challenges and dangerous acts, with a specific focus on suicide hoaxes, in order to reduce the reach and exposure of such trends.
As per TikTok’s updated guidelines:
“Content that encourages or promotes suicide or self-harm hoaxes is not allowed. This includes alarming warnings that could cause panic and widespread harm. We will remove such warnings, while allowing content that seeks to dispel panic and promote accurate information about such hoaxes.”
Last November, TikTok conducted a large-scale study to assess the potential dangers of participating in viral challenges, following various reports of significant harm caused by such trends. One of the key findings of that report was that even warnings about suicide hoaxes can cause angst by inadvertently lending them credibility, adding to fears.
“The research showed how warnings about self-harm hoaxes – even if shared with the best of intentions – can impact the well-being of teens by treating the self-harm hoax as real.”
TikTok pledged to take action on this element, leading to this update, which will now broaden its enforcement action against such content.
This is a key focus for the platform. Last year, a 10-year-old girl in Italy died after taking part in a ‘blackout challenge’ in the app, which led Italian authorities to force TikTok to block the accounts of any users whose age it could not verify. The popular ‘Milk Crate Challenge’, which trended earlier this year, also saw many people suffer serious injuries while trying to climb stacks of plastic crates, while other concerning trends include the ‘Benadryl challenge’, full face wax, the ‘back cracking challenge’ and more.
Self-harm hoaxes generally involve directing people to carry out a series of harmful activities that escalate gradually, and can eventually lead to self-harm, and even suicide. Trends like the ‘Blue Whale Challenge’ and ‘Momo’ are among the more concerning examples, with fictional characters playing out horror-like scenarios that can drag users into dangerous behavioral patterns.
The new updates will see TikTok working to remove even more of this content, and related elements, as it works to address concerns.
TikTok’s also launching a new push to highlight danger and risk in trending clips.
“As part of our ongoing work to help our community understand online challenges and stay safe while having fun, we’ve worked with experts to launch new videos from creators that call on our community to follow four helpful steps when assessing content online – stop, think, decide and act. Community members can also view these videos at our #SaferTogether hub on the Discover page over the next week.”
By incorporating clips from popular creators, TikTok will better highlight these steps, which could prompt more users to re-think participation in potentially harmful trends.
Because, as noted, people are getting injured in their efforts to achieve in-app popularity. You only need one good take to make a great TikTok clip, but no clip is worth risking significant injury, or worse, and as the host platform, TikTok does have a responsibility to police such trends where it can.
TikTok’s also expanding its approach to eating disorder-related content.
“While we already remove content that promotes eating disorders, we’ll start to also remove the promotion of disordered eating. We’re making this change, in consultation with eating disorders experts, researchers, and physicians, as we understand that people can struggle with unhealthy eating patterns and behavior without having an eating disorder diagnosis.”
The new push will aim to identify related concerns, like over-exercise or certain types of fasting, that can contribute to eating disorders, helping to address the issue more broadly and combat potential harm.
TikTok’s also updating its rules around misgendering and misogyny to combat hate speech, as well as content that supports or promotes conversion therapy programs.
“Though these ideologies have long been prohibited on TikTok, we’ve heard from creators and civil society organizations that it’s important to be explicit in our Community Guidelines. On top of this, we hope our recent feature enabling people to add their pronouns will encourage respectful and inclusive dialogue on our platform.”
Finally, TikTok’s also expanding its focus on unauthorized use of its platform, including new rules against spamming, crawling TikTok for user info, and other exploits.
“In addition to educating our community on ways to spot, avoid, and report suspicious activity, we’re opening state-of-the-art cyber incident monitoring and investigative response centers in Washington DC, Dublin, and Singapore this year. TikTok’s Fusion Center operations enable follow-the-sun threat monitoring and intelligence gathering, as we continue working with industry-leading experts to test and enhance our defenses.”
These are key elements for TikTok as it continues to expand, with potential misuse of the app rising in step. More access to more people means more ill-intentioned groups will seek to exploit the platform, and with so many young users, TikTok needs to do all it can to provide protection and alerts where possible.
It’s a never-ending battle, as every update and shift in policy sees bad actors adjust their tactics in response, but it’s important for TikTok to be clear in its approach, and to take action at every turn.
You can read TikTok’s updated Community Guidelines here.