Discord assumes all users are underage, requires proof of age
Photo by appshunter.io (unsplash.com/@appshunter) on Unsplash
Discord will now assume every user is a minor, requiring them to prove they are an adult via an ID or facial scan, The Register reports. The new policy, set to roll out soon, will use an AI model to infer a user's age where possible, though the burden of proof ultimately falls on you.
Quick Summary
- Discord will now assume every user is a minor, requiring them to prove they are an adult via an ID or facial scan, The Register reports. The new policy, set to roll out soon, will use an AI model to infer a user's age where possible, though the burden of proof ultimately falls on you.
- Key company: Discord
The new "teen-by-default" safety settings and advanced age assurance systems are scheduled for a global rollout beginning in March 2026, according to a post on the Fosstodon AI Timeline. This positions Discord’s policy shift as part of a broader, industry-wide trend toward more stringent age-gating on social platforms.
The verification process itself will reportedly employ an artificial intelligence model designed to infer a user's age. According to The Register's reporting, this system could allow some users to "wiggle out" of the stricter verification requirements if the AI confidently determines they are an adult. However, the ultimate burden of proof falls on the user: if the AI cannot make a determination or flags an account, the user will be required to submit official identification or undergo a facial scan to verify they are over the age of majority.
This move by Discord is not happening in a vacuum. As noted in a Hacker News report, social media platforms have been "inching their way toward age verification for a while now." The report characterizes Discord’s new mandate as a significant "leap toward a gated internet," reflecting a growing pressure on tech companies to better protect younger users online, often in response to new regulations.
The policy raises immediate questions about privacy and data security, particularly in light of Discord’s recent history with third-party data incidents. TechCrunch reported a data breach affecting at least 70,000 users, which Discord clarified was not a direct breach of its own systems but rather a compromise at 5CA, a third-party customer service vendor. Requiring users to submit highly sensitive biometric data or government-issued ID would place Discord in possession of a new tier of critical personal information, inevitably intensifying scrutiny of its security practices.
The implementation details, such as how the AI inference model will operate or what specific forms of ID will be accepted, remain unclear based on the available reporting. Furthermore, the sources do not specify any potential exceptions or alternative verification methods for users who may not possess formal identification.
By adopting a stance where every user is presumed a minor, Discord is fundamentally altering the default trust model of its platform. This approach prioritizes safety and regulatory compliance but does so at the cost of user anonymity and ease of access, marking a pivotal moment in the platform's evolution from a niche chat app to a mainstream communication service grappling with the complexities of a global user base.