During the pandemic, Roblox, the user-created game platform popular with kids, expanded its user base and went public. Within two years, its valuation shot from less than $4 billion to $45 billion. Now it is being sued, along with Discord, Snap, and Meta, by a parent who alleges that during the pandemic, Roblox became the gateway that enabled multiple adult users to prey on a 10-year-old girl.
The lawsuit, filed Wednesday in San Francisco Superior Court, shows how sexual predators can exploit multiple social platforms at once to cover their tracks while financially and sexually exploiting children. It alleges that in 2020, Roblox connected a young girl, referred to as S.U., with adult men who abused her for months, manipulating her into sending payments in Roblox's virtual currency, Robux, and inducing her to share explicit photos on Discord and Snapchat through 2021. As the girl grew increasingly anxious and depressed, the lawsuit alleges, Instagram began recommending self-harm content, and S.U. ultimately had to withdraw from school after multiple suicide attempts.
Like many similar product liability lawsuits social platforms have recently faced for allegedly addicting children and causing harm, this new suit seeks to hold the platforms accountable for continuing to promote features that the companies allegedly know pose severe risks to minor users. S.U.'s guardian, identified as C.U. in the lawsuit, also wants the platforms to pay for profiting from systems that allegedly engage child users recklessly.
The lawsuit says the platforms neglect to prevent predators from reaching minors and points to cheap, simple fixes it claims the platforms overlook because they would potentially limit profits. These include warning minors about potential predatory engagement, verifying the age of account holders, restricting adult users from messaging minors, banning adult users who do message minors, and preventing minors from circumventing parental oversight by limiting minors' access to certain features and their ability to create duplicate accounts.
A Roblox spokesperson told Ars, “While we do not comment on pending litigation, Roblox has a safety-first culture and works tirelessly to maintain a platform that is safe and civil for all.” Roblox also says it has a zero-tolerance policy for users “endangering or sexualizing children in any way” and takes “swift action against anyone found to be acting in breach of our Community Standards.”
Discord told Ars it scans every image on its platform to detect child sexual abuse materials and, echoing Roblox, said it has a zero-tolerance policy and takes immediate action when it becomes aware of child endangerment or sexual exploitation. “This includes proactively investigating and banning users, shutting down servers, and making targeted efforts to detect and disable accounts that violate our Terms of Service and Community Guidelines.”
Snap did not immediately provide comment to Ars.
Meta told Ars that it cannot comment on active litigation but said its “deepest sympathies are with anyone affected by these difficult and complex issues.”
Meta has arguably faced the most criticism on this issue since whistleblower Frances Haugen told the US Senate how Facebook knowingly harmed young users. A Meta spokesperson told Ars that the company has implemented changes on Instagram that are similar to, though seemingly weaker than, the changes the lawsuit seeks.
“Teens automatically have their accounts set to private when they join Instagram, adults can’t message teens that don’t follow them, we don’t show accounts belonging to teens to some adults in places where we suggest content, and we have controls designed to limit the types of content teens see,” a Meta spokesperson told Ars.
Because of S.U.’s experiences on Roblox, Discord, Snapchat, and Instagram, her guardian C.U. has had to quit her “dream job” with the government, sacrificing benefits and a pension, to attend to S.U.’s escalating care needs. C.U. says in the lawsuit that she has so far taken on $10,000 in debt from healthcare co-pays and that S.U. continues to need care for ongoing health issues.
The lawsuit seeks damages from the social platforms to help recover the costs of S.U.’s medical care, as well as monetary damages for S.U.’s future care, C.U.’s lost income and diminished future earning capacity, punitive damages, and more, with the amounts to be determined by a jury.