
Roblox has introduced a new range of safety features aimed specifically at teens ages 13-17, including a new age estimation technology that uses AI to guess a user's age based on a video selfie they submit.
Today's announcement reveals a number of new features being implemented in Roblox that the company claims will improve teen and child safety on its platform. At the core of the announcement are new features specifically for teens ages 13-17, giving them more freedom on the platform than younger children but still less than adults. Teens will be able to designate "trusted connections" on Roblox, with whom they will be able to chat on the platform without filters. Per Roblox, the goal is to better monitor the conversations teens are having on Roblox rather than having them lured to third-party platforms where unmonitored conversations could become inappropriate.
Trusted connections are intended to be set between users who know one another well, and if a teen wants to set someone 18+ as a trusted connection, they can only do so using a QR code scanner or a contact importer.
Until now, Roblox has relied on the submission of a government ID verifying that users are 13+ or 18+ to unlock certain platform chat features. However, it's now implementing an alternative verification method. Individuals can submit a "video selfie" to Roblox, and an AI will determine whether it believes the person in question is 13+ by analyzing it against "a large, diverse dataset." Google began testing a similar feature earlier this year, as did Meta the year prior.
In addition to these changes, Roblox is also adding new tools such as online status controls, a do not disturb mode, and parental controls for parents who have linked their accounts to a teen's account.
Roblox has long been in an uncomfortable spotlight regarding its handling of children's safety. In 2018, it made headlines when a mother reported that her seven-year-old daughter's Roblox character was violently sexually assaulted by other players in-game, and separately a six-year-old girl playing Roblox was reportedly invited into a "sex room". In 2021, People Make Games published a report on the ways in which Roblox's business model allegedly exploits child labor. In 2022, Roblox faced a San Francisco lawsuit accusing it of enabling the financial and sexual exploitation of a 10-year-old girl. In 2023, it was sued both for allegedly facilitating "an illegal gambling ecosystem" and, more generally, for having lax child safety protocols that allegedly led to financial loss and children's exposure to adult content. Just last year, Bloomberg published a damning report highlighting the prevalence of child predators on the platform. That same year, the platform said it had reported over 13,000 incidents of child exploitation to the National Center for Missing and Exploited Children in 2023, resulting in the arrest of 24 people who allegedly preyed on children through the game.
"Safety has always been foundational to everything we do at Roblox," said Roblox chief safety officer Matt Kaufman in a statement accompanying today's feature news. "Our goal is to lead the world in safety and civility for online gaming. We're dedicated to supporting experiences that are both deeply engaging and empowering for players of all ages, while continuously innovating how users connect and interact."
Rebekah Valentine is a senior reporter for IGN. You can find her posting on BlueSky @duckvalentine.bsky.social. Got a story tip? Send it to rvalentine@ign.com.
