Discord — Lawsuit
Executive Summary
New Jersey Attorney General Matthew Platkin filed the first state lawsuit against Discord, alleging the platform violated the New Jersey Consumer Fraud Act by misleading parents about child safety features. The complaint cited Discord's failure to enforce its age-13 minimum, default settings that allowed anyone to send friend requests to minors, and a "Safe Direct Messaging" filter that, by default, did not scan messages between friends.
What Happened
On April 17, 2025, New Jersey Attorney General Matthew Platkin filed a lawsuit against Discord, Inc. in the Superior Court of New Jersey, alleging violations of the New Jersey Consumer Fraud Act. The complaint, resulting from a multiyear investigation, claims Discord misled parents and children about the effectiveness of its safety controls and obscured risks children faced on the platform. Specific allegations include failure to enforce the age-13 minimum requirement, default settings that allowed anyone to send friend requests to minors, and a Safe Direct Messaging filter that did not scan messages between friends by default despite claims that it would automatically scan and delete explicit media content.
Who Is Affected
New Jersey children using Discord and their parents are directly affected by the alleged deceptive practices. According to the lawsuit, children on the platform were exposed to sexual and violent content, including child sexual abuse material, and were left vulnerable to online predators because of misleading safety settings and lax oversight. The complaint notes that children make up a significant portion of the user base of Discord, which has become one of the most popular social platforms worldwide.
Why It Matters
This represents the first state-level lawsuit against Discord regarding child safety practices and could establish legal precedent for how social platforms are held accountable for representations made about safety features. The case highlights the gap between marketed safety controls and actual protection provided to minors on messaging platforms. The lawsuit seeks unspecified civil penalties and aims to stop what the Attorney General characterizes as unlawful conduct that has made Discord a hunting ground for predators seeking access to children.
What You Should Do
Parents with children using Discord should immediately review and manually configure all privacy and safety settings rather than relying on defaults. In particular, parents should check direct messaging settings to confirm the Safe Direct Messaging filter is enabled, keeping in mind that by default it does not scan messages between friends. Parents should also verify their child meets the minimum age requirement of 13 and consider whether continued use of the platform is appropriate given the alleged safety failures. Users can monitor the lawsuit proceedings to stay informed about any court-ordered changes to Discord's practices or additional revelations about platform safety.
AI-Assisted
Event summaries are generated by Claude AI from verified sources and reviewed by humans before publication.
Sources