Governments tighten social media age checks for children
Safety tech firm Privately said more than 40 countries have introduced, proposed, or formally reviewed laws that restrict children's access to social media, as governments move towards enforceable age controls.
The company's analysis links the shift to a tighter policy approach in several markets, including Australia's ban on under-16s using social platforms, planned restrictions in France for under-15s, and debate in the UK parliament.
Privately described age assurance as a growing legal obligation for social media services. It said policymakers and platforms now face questions about how to verify age without collecting identity documents, capturing personal data, or retaining biometric information.
Policy momentum
The company's analysis counts countries with enacted laws, draft legislation, formal government proposals, parliamentary inquiries, regulator-led consultations, or enforceable age verification or parental consent requirements that relate specifically to minors' use of social media.
Privately positioned the change as a move away from voluntary measures by platforms. It said the focus has shifted towards methods that regulators can enforce and that limit the amount of user data a platform collects and retains.
"The debate has moved from 'should platforms verify age?' to 'how do they do it?' and we're seeing a rapid shift toward enforceable age controls that provide data privacy guarantees," said Deepak Tewari, CEO, Privately SA. "With so many countries now actively regulating or reviewing children's access to social media, it makes reliable age assurance unavoidable. Facial Age Estimation technology (FAE) allows platforms to meet these requirements without asking users to share IDs, which is critical for both privacy and scale adoption."
Consumer trust
Privately also pointed to consumer sentiment research on privacy and social media that it commissioned in December 2025.
The research found low trust in platforms' handling of sensitive personal data: Privately said only 13% of adults trust online platforms to protect biometric information such as facial images.
The polling also found higher acceptance for facial age estimation when privacy constraints apply. Privately said 39% of respondents support facial age estimation when it runs entirely on-device and images never leave the device.
The findings come as regulators in several jurisdictions weigh age assurance approaches that range from ID checks and third-party verification to estimation methods that assess age without storing identifiable information. Platforms also face operational choices about how to apply checks at scale across new accounts, existing accounts, and access to age-restricted features.
On-device approach
Privately markets facial age estimation that runs on a user's device. The company said the method does not store, manage, or share personal data as part of the age check process.
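The privacy property described here, that the image is processed locally and only a derived result leaves the device, can be illustrated with a short sketch. This is not Privately's actual API or model; the function names, the age bands, and the stubbed estimator are all hypothetical, included only to show the shape of an on-device check in which raw images are never stored or transmitted.

```python
# Hypothetical sketch of an on-device age-check flow (not Privately's real API).
# Key privacy property: the raw camera frame never leaves the device; only a
# boolean pass/fail and a coarse age band are reported to the platform.

from dataclasses import dataclass


@dataclass
class AgeCheckResult:
    passed: bool   # whether the user meets the platform's minimum age
    age_band: str  # coarse, non-identifying band, e.g. "16-17"


def estimate_age_on_device(image_bytes: bytes) -> float:
    """Placeholder for a local facial-age-estimation model.

    A real implementation would run a neural network on the device.
    This stub returns a fixed value purely for illustration."""
    return 17.4


def run_age_check(image_bytes: bytes, minimum_age: int) -> AgeCheckResult:
    age = estimate_age_on_device(image_bytes)
    # Discard the image immediately; only derived signals survive this call.
    del image_bytes
    if age >= 18:
        band = "18+"
    elif age >= 16:
        band = "16-17"
    else:
        band = "under-16"
    return AgeCheckResult(passed=age >= minimum_age, age_band=band)


result = run_age_check(b"<camera frame>", minimum_age=16)
print(result.passed, result.age_band)
```

In a deployment like the one the article describes, only the `AgeCheckResult` equivalent would reach the platform's servers, which is what allows an age gate to be enforced without collecting identity documents or retaining biometric data.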
Privately said three of the ten largest social media platforms in Australia have deployed its on-device facial age estimation. It also said multiple large platforms in the UK use the technology.
Privately said its technology carried out more than five million age checks in 2025.
The company positioned its product as an alternative to ID checks, saying it avoids storing personal data and avoids excluding legitimate users. It did not, however, provide detail on how platforms handle edge cases such as borderline age results or users who fail an age estimation check.
"One of the primary reasons why these large platforms prefer us to other solutions is that we provide on-device age checks, which provide unmatched user data privacy guarantees as no user images ever leave the user's devices," said Tewari. "This means platforms using Privately can comply with increasing child-safety regulations while minimising data collection, addressing one of the primary user and regulatory concerns."
As more countries consider laws that restrict minors' access to social media, the market for age assurance tools is likely to draw further scrutiny from regulators and privacy advocates. Privately said the policy direction points towards wider adoption of methods that limit data collection while meeting enforceable age control requirements.