
Roblox Age Estimation System Restricts Child–Adult Communication Chats

Posted by nutcrackr on

Roblox is implementing a facial age-estimation system intended to stop children and adults from communicating across the platform, with trusted connections as the only exception, in one of its most extensive safety changes to date. The update follows a series of controversies throughout 2025, as the company faces lawsuits, parent complaints, and regulatory pressure concerning the safety of minors in its online spaces. For Roblox, the scale of these concerns has reached a point where it can't afford to roll the dice on older safety strategies.

The platform, known for popular experiences such as Dress to Impress, Steal a Brainrot, and Grow a Garden, has been repeatedly criticized for exposing young users to potential grooming and inappropriate content. Lawsuits filed by U.S. states, including Louisiana and Texas, accuse the company of failing to protect minors. One family's lawsuit stated that a father "set up every parental control I could find," yet alleged that his 13-year-old son was targeted by an adult posing as a 16-year-old, who reportedly offered gift cards in exchange for sexually explicit photos. Additional state filings claim Roblox allowed user-created experiences that included "explicitly sexual content (often involving pedophilia)," "Nazi propaganda," and "violent extremism." Roblox responded by stating it maintains a "commitment to keeping kids and teens safe" and expressed "disappointment" that Texas pursued a lawsuit rather than collaboration.

To address safety concerns, Roblox announced a facial age-estimation program that requires users to capture an image or video through the app. The company says the media is processed by an external provider and deleted immediately after verification. Once processed, the system assigns users to specific age groups: under 9, 9 to 12, 13 to 15, 16 to 17, 18 to 20, or 21 and above. Users can chat only within their assigned group or within similar ranges deemed appropriate. Under these rules, a player estimated to be 10 years old cannot communicate with anyone above 12. Chat is disabled by default in user-created experiences for users under nine unless a parent permits them to complete an age check. Players under 13 continue to face restrictions on private messages and selected chat features unless approved by a parent.
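The banding logic described above can be sketched in a few lines. The band boundaries come from Roblox's announcement, but the exact "similar ranges deemed appropriate" rules are not public, so this sketch takes a conservative reading in which only same-band users can chat; the function names are illustrative, not Roblox's.

```python
# Sketch of the age-band assignment described in the article.
# Band boundaries are as announced; the same-band-only chat rule is an
# assumption, since Roblox has not published its "similar ranges" logic.

BANDS = ["under 9", "9-12", "13-15", "16-17", "18-20", "21+"]
UPPER = [8, 12, 15, 17, 20]  # inclusive upper bound of each band except the last

def band_index(age: int) -> int:
    """Return the index into BANDS of the band containing `age`."""
    for i, hi in enumerate(UPPER):
        if age <= hi:
            return i
    return len(BANDS) - 1  # 21 and above

def can_chat(age_a: int, age_b: int) -> bool:
    """Conservative reading: chat is allowed only within the same band."""
    return band_index(age_a) == band_index(age_b)

# The article's example: a 10-year-old ("9-12") cannot reach anyone above 12.
assert BANDS[band_index(10)] == "9-12"
assert not can_chat(10, 14)
```

Under this reading, a 16- and a 17-year-old can chat (both in "16-17"), while a 12- and a 13-year-old cannot; whether Roblox's real system permits some cross-band pairs is exactly the "similar ranges" detail left unspecified.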

Matt Kaufman, Roblox’s chief safety officer, said the age-estimation technology is “pretty accurate” and can estimate ages “within one to two years” for users between five and 25. The company states it will be the first large gaming platform to require facial age verification for chat access. The rollout begins in December in Australia, New Zealand, and the Netherlands, with global implementation set for January.

This update follows changes made in September requiring all Roblox experiences to carry age ratings. Unrated games were removed, and younger users were blocked from all mature-rated content. Roblox also enforces chat filtering for users under 13 to block inappropriate or personal information, prohibits sharing images and videos in chat, restricts external links, and plans to require age checks to access social-media links on user profiles.

Regulators have taken an interest in the platform’s safety measures. In the U.K., the Online Safety Act establishes strict rules for protecting children online. Ofcom, the communications regulator, is responsible for enforcement. Anna Lucas, Ofcom’s online safety supervision director, said she was pleased with the addition of age-checking requirements. In Australia, the update arrives shortly before the country introduces a social-media ban for under-16s, and the government faces pressure to consider gaming platforms such as Roblox in that policy.

Roblox CEO Dave Baszucki discussed safety during a March BBC interview, stating the company “went to great lengths” to protect children. He also encouraged parents to rely on their instincts, saying, “My first message would be, if you’re not comfortable, don’t let your kids be on Roblox… I would always trust parents to make their own decisions.”

Child-safety advocates have called for stronger action. According to Rani Govender, policy manager for child safety online at the NSPCC, children faced “unacceptable risks” that left many vulnerable to harm. She supported Roblox’s announcement but said the platform should ensure meaningful improvements that prevent adults from targeting young users.

Public pressure also surfaced within Roblox itself. ParentsTogether Action and UltraViolet held a virtual protest inside the platform, delivering a digital petition with more than 12,000 signatures stating, “Roblox must stop being a playground for predators.”

Meanwhile, rising tension within the community produced additional controversies. Roblox temporarily removed the CEO's profile amid widespread criticism and ongoing legal disputes. The company drew further attention after banning a user known as Schlep, who attempted to identify predators on the platform, and sending him a cease-and-desist order. Schlep later said he is working with former "Dateline" correspondent Chris Hansen on a documentary addressing Roblox's child-safety issues. Hansen confirmed he interviewed both Schlep and Louisiana Attorney General Liz Murrill.

Concerns about communication flaws were reinforced when a BBC test earlier this year showed that a 27-year-old user and a 15-year-old user on unlinked devices could exchange messages. Roblox said many violations occur when users attempt to move conversations to external platforms.

The timing of these issues coincides with a period of major growth. Roblox reported more than 80 million daily players in 2024, with about 40% under 13. By the third quarter of 2025, daily active users increased to 151.5 million, a 70% rise from the previous year. In a July report, Roblox stated that 36% of its user base was under the age of 13.

Roblox also says it introduced 145 new safety protocols in 2025, including an AI-powered tool designed to identify early signs of child endangerment. The company plans to further expand parental controls, allowing parents to manage accounts and update a child’s age once verification is completed.