News

TikTok enhances safety policies around young people and online challenges

The video-sharing platform has made changes following the publication of major research into teenagers and their interactions with some content.

TikTok has unveiled a series of updates to its safety measures designed to better protect young people on the platform following a major report into how users interact with potentially harmful content.

The firm’s research looked at how young people engaged with online challenges and hoaxes, including harmful ones that attempt to coax viewers into self-harm or suicide.

The video-sharing giant has said it will now start removing “alarmist warnings” about potentially harmful online challenges and hoaxes because its research found these warnings can exacerbate the problem by treating the hoax as real.

The study of 10,000 teenagers, parents and teachers from around the world – including the UK – found that while only 0.3% of young people said they had participated in an online challenge they would categorise as very dangerous, nearly half of those asked said they wanted more information and help on how to better understand risk.

It comes as social media platforms continue to face scrutiny over how they police harmful content on their sites and the steps they take to protect users – particularly younger ones.

As part of its policy updates in response to the research, TikTok said the technology it uses to alert its safety teams to increases in prohibited content linked to hashtags will now be extended to also capture “potentially dangerous behaviour” that attempts to hijack or piggyback on an otherwise common hashtag.

TikTok said it would also “improve the language” used in its content warning labels to encourage users to visit its Safety Centre for more information, and was adding new materials to that space aimed at parents and carers unsure how to discuss the subject with children.

Alexandra Evans, TikTok’s head of safety public policy for Europe, said the aim of the project was to “better understand young people’s engagement with potentially harmful challenges and hoaxes”.

“While not unique to any one platform, the effects and concerns are felt by all – and we wanted to learn how we might develop even more effective responses as we work to better support teens, parents, and educators,” she said, adding that the company wanted to help “contribute to a wider understanding of this area”.

“For our part, we know the actions we’re taking now are just some of the important work that needs to be done across our industry, and we will continue to explore and implement additional measures on behalf of our community.”

TikTok has made a number of updates to its platform over the last year, particularly in areas around safety for younger users.

The app has increased default privacy settings and reduced access to direct messaging features for younger users, and has expanded its Safety Centre and online resources aimed at teenagers and their parents as part of efforts to offer more information and guidance.

It comes as MPs and peers scrutinise the draft Online Safety Bill, which intends to introduce substantial regulation of social media platforms; the possible penalties for failing to protect users from harmful content include large fines, the blocking of sites and the potential to hold senior managers criminally liable for rule breaches.