The mother of a 15-year-old California boy who died by suicide is taking Roblox and Discord to court, arguing that their platforms enabled his grooming and exploitation. The lawsuit, filed by Rebecca Dallas in San Francisco, accuses the companies of operating in ways that exposed her son Ethan to online predators, setting in motion a series of events that she says ultimately led to his death. The suit describes Ethan as a "bright, imaginative boy who loved gaming, streaming and interacting with friends online," NBC News reports.
According to the complaint, Ethan began using Roblox at age 9, initially with parental controls in place. But at 12, it says, he was befriended by an adult posing as a child whose interactions with Ethan escalated from friendly chats to sexual content. The suit claims the predator persuaded Ethan to disable parental controls and shift their conversations to Discord, where he was coerced into sharing explicit material under threat. Ethan died in April 2024. Dallas' suit blames the platforms for not having adequate safeguards, age verification, or screening measures in place.
Saying the company is "deeply saddened" by the teenager's death, a spokesman said Roblox is working on new safety features, per the New York Times, adding that child safety is an industrywide challenge. Discord said it requires users to be at least 13, and that the messaging platform uses "advanced technology and trained safety teams to proactively find and remove content that violates our policies," a spokeswoman said. Ethan's teacher said he had hoped to become a Disney Imagineer someday.
- The Times examines the case here.