Roblox and Discord are being sued by the mother of a 15-year-old boy who died by suicide after allegedly being targeted by “an adult sex predator” posing as a child on the platforms.
A wrongful death lawsuit was filed in San Francisco Superior Court by Becca Dallas, according to The New York Times. Her son, Ethan Dallas, joined the online gaming platform Roblox with his parents’ approval and with parental controls in place.
At age 12, he was allegedly targeted by an online predator posing as a child named Nate, now believed to be 37-year-old Timothy O’Connor, per The Times. Their conversations moved to Discord and turned sexual in nature, with “Nate” allegedly coercing Ethan into sharing sexually explicit images.
“Tragically, Ethan was permanently harmed and haunted by these experiences, and he died by suicide at the age of 15,” the complaint said. His mother, Becca, is seeking a jury trial and compensatory damages.
The lawsuit highlights a long-standing danger of online spaces, particularly those aimed at children but open to adults as well. The complaint argues that if either platform had enforced more stringent safety protocols, “Ethan would have never interacted with this predator, never suffered the harm that he did, and never died by suicide.”
A Roblox spokesperson told Fast Company in an emailed statement that the company was “deeply saddened by this tragic loss” but could not comment on active litigation. “Safety is a top priority,” the statement continued, “and we are continually innovating new safety features—over 100 this year alone—that protect our users and empower parents and caregivers with greater control and visibility.”
Roblox said it has introduced several tools designed to better protect its young users, including an age-estimation feature set to roll out platform-wide by year’s end. The spokesperson added: “Our policies are purposely stricter than those found on other platforms, including limiting chat for younger users, not allowing sharing images through chat and filters designed to block the sharing of personal information.”
The company added that child safety online is an “industry-wide issue” and highlighted its partnerships with law enforcement and global child safety and mental health organizations to combat exploitation.
A spokesperson for Discord told Fast Company: “Discord is deeply committed to safety and we require all users to be at least 13 to use our platform. We use a combination of advanced technology and trained safety teams to proactively find and remove content that violates our policies. We maintain strong systems to prevent the spread of sexual exploitation and grooming on our platform and also work with other technology companies and safety organizations to improve online safety across the internet.”
Anapol Weiss, the firm that filed Dallas’s suit, said this is the ninth lawsuit it has filed in connection with allegations that children were groomed, exploited, or assaulted after being contacted on Roblox or related platforms.
“This case lays bare the devastating consequences when billion-dollar platforms knowingly design environments that enable predators to prey on vulnerable children,” said Alexandra Walsh, partner at Anapol Weiss, in a statement. “These companies are raking in billions. Children are paying the price.”