TikTok Enhances Age-Verification Technology Across EU

TikTok is set to implement new age-verification technology throughout the European Union in the coming weeks. This initiative responds to increasing calls for stricter regulations on social media use among young people, particularly in the UK, where discussions about an Australia-style ban for users under the age of 16 are intensifying. The technology aims to better identify and remove accounts belonging to children.

The age-verification system has been piloted in the EU for the past year, analyzing profile information, posted videos, and behavioral signals to determine whether an account may belong to a user under the age of 13. TikTok says accounts flagged by the system will be reviewed by specialist moderators rather than banned automatically, and may be removed following that assessment. Notably, an earlier UK pilot resulted in the removal of thousands of accounts.

The push for more effective age verification is not limited to TikTok. Other major platforms popular among younger audiences, such as YouTube, are also facing heightened scrutiny. Australia's ban on social media accounts for individuals under 16 took effect on December 10, 2025, and more than 4.7 million accounts have since been removed across 10 platforms, including TikTok and Instagram.

Regulatory Scrutiny and Public Concerns

European authorities are increasingly focused on how social media platforms verify user ages, especially in light of data protection regulations. Recent comments from Keir Starmer, the UK prime minister, highlight growing concerns about the impact of excessive smartphone use on children and teenagers. Starmer expressed alarm at reports that five-year-olds are spending significant amounts of time on screens each day.

Starmer's stance on a potential social media ban for young people has shifted: he previously opposed such measures, citing difficulties in enforcement and the risk of pushing teenagers towards the dark web. The conversation around social media safety has been further fueled by tragic incidents, such as the case of Jools Sweeney, a 14-year-old whose mother believes he died after taking part in an online challenge. She has advocated for parental rights to access children's social media accounts in the event of their death.

In parallel, the European Parliament is pushing for age restrictions on social media platforms, while Denmark is considering a ban for individuals under the age of 15. TikTok has confirmed that its new technology has been developed in compliance with EU regulatory requirements in collaboration with Ireland’s Data Protection Commission, the lead privacy regulator in the EU.

A previous investigation by The Guardian in 2023 revealed that moderators were instructed to permit under-13 users to remain on the platform if they claimed parental supervision over their accounts. This has raised further questions about the efficacy of age verification measures across social media platforms.

As TikTok rolls out its age-verification system, the company aims to meet the growing scrutiny from governments and regulators, signaling a commitment to user safety and to addressing public concerns about children's online activity.