Discord’s Global Rollout of Facial Scanning and ID Checks Signals a Shift Toward Mandatory Age Verification—With Risks

Discord is expanding its age verification system globally, requiring facial scans or government-issued ID uploads for full access beginning in early March. Users who decline verification will be relegated to a restricted mode with content filters, blocked stage channels, and limited direct messaging—effectively segmenting the platform by age without exception.

The update arrives as part of Discord’s broader safety initiatives, framed as a response to growing regulatory pressure and a commitment to protecting younger users. The implementation, however, follows an October 2025 security breach in which a vendor partner exposed an estimated 70,000 age verification photos alongside names, usernames, emails, partial credit card details, and IP addresses. While Discord has since emphasized that facial scans are processed locally on devices and deleted immediately after age confirmation, the incident underscores the ongoing risks of handling biometric data.

Key changes for users
  • Facial scanning via video selfie or government ID upload for age verification.
  • No individual identification—only age estimation via AI.
  • Teen-by-default settings for unverified users, including content filters and disabled stage channels.
  • Restricted direct messaging and friend requests for unverified accounts.

Discord’s head of product policy has described the rollout as an extension of its existing safety framework, balancing teen protections with flexibility for verified adults. Yet the move aligns with a broader trend of mandatory age verification, including the UK’s Online Safety Act and Australia’s recent ban on under-16s from major social platforms—laws critics argue could over-censor content while driving users to unregulated alternatives.

Past attempts at age verification have proven vulnerable to bypasses, such as users fooling Discord’s facial scan in 2025 with character faces from games like Death Stranding. Traffic data from the UK suggests that compliance with age checks may also reduce engagement: sites that adopted verification saw traffic decline while unregulated platforms thrived. Discord acknowledges the potential for user loss but expects to offset it through alternative engagement strategies.

The shift raises questions about long-term privacy and the effectiveness of mandatory verification in an era where digital identities are increasingly scrutinized. With no clear path to repeal for similar laws globally, platforms may face a choice between compliance and losing users—or risking further breaches in handling sensitive data.