Louisiana Attorney General Liz Murrill is suing popular online gaming platform Roblox, alleging the company prioritizes profits over protecting children from sexual predators and harmful content. The lawsuit claims Roblox “knowingly and intentionally” neglected to implement basic safety measures, leaving young users vulnerable to exploitation and abuse.
Murrill further accuses Roblox of failing to adequately warn parents about the potential dangers their children face while playing on the platform. On social media, Murrill condemned Roblox as a site where “violence against children and sexual exploitation for profit” thrives, citing examples like user-created games with titles like “Escape to Epstein Island” and “Public Showers.”
This legal action echoes similar lawsuits filed against other major social media platforms like Meta, TikTok, and Snapchat. These cases reflect a growing public concern about the online safety and well-being of young users in increasingly digital environments.
Roblox strongly denies these allegations, stating that protecting its vast user base is paramount. In a statement released Friday, a company spokesperson emphasized its commitment to providing a safe platform through advanced technology, 24/7 human moderation, and rigorous safeguards designed to prevent inappropriate content and behavior. These measures include restrictions on sharing personal information, links, and user-to-user images.
The company acknowledges the persistent challenge of “bad actors” attempting to exploit its systems but says it continuously works to block these attempts and refine its moderation approaches. Roblox highlighted its recent rollout of selfie-based age verification for teen users as another step toward enhancing safety. Murrill, by contrast, argues that the platform’s lack of robust age verification makes it easier for predators to interact with minors.
Despite these efforts, Roblox remains under scrutiny. Earlier this year, parents brought a class action lawsuit alleging Roblox falsely advertised its platform as safe for children. That case prompted significant changes in Roblox’s approach to safety, including new blocking tools, parental oversight features, and messaging controls. The company also joined other social media platforms in supporting the recently passed Take It Down Act, which aims to combat the sharing of non-consensual intimate imagery, including deepfakes.
This latest lawsuit underscores the immense pressure on online platforms like Roblox to effectively address child safety concerns in a rapidly evolving digital landscape.































