Roblox’s New Age Verification Feature Uses AI to Scan Teens’ Video Selfies

In the briefing, Kaufman called Roblox “one of the safest places online for people to come together and spend time with their friends and their family.”

Kirra Pendergast, founder and CEO of Safe on Social—an online safety organization operating worldwide—says Roblox’s latest safety measures are largely opt-in, therefore putting “responsibility on minors to identify and manage risks, something that contradicts everything we know about grooming dynamics.” Machine learning age estimation tools, for example, can incorrectly categorize users as older or younger, and the in-person code-scanning feature, she says, “assumes that in-person QR code scanning is inherently safe.”

“Predators frequently use real-world grooming tactics,” says Pendergast. “A QR scan doesn’t verify a trusted relationship. A predator could build trust online, then manipulate the child into scanning a QR code offline, thus validating a ‘Trusted Connection’ in Roblox’s system. Real protection would require guardian co-verification of any connections, not child-initiated permissions.”

Furthermore, says Pendergast, Trusted Connections applies only to chat, which leaves “large surface areas exposed, making it a brittle barrier at best.”

When asked how an in-person QR code keeps minors safe from real-world grooming tactics, Kaufman echoes his press briefing comment that there is no “silver bullet.” Instead, he says, it’s many systems working together. “Those systems begin with our policies, our community standards,” Kaufman says. “It’s our product which does automated monitoring of things, it’s our partnerships, it’s people behind the scenes. So we have a whole suite of things that are in place to keep people safe. It is not just a QR code, or it is not just age estimation, it’s all of these things acting in concert.”

Kaufman says that Roblox is “going farther” than other platforms by not allowing kids ages 13 to 17 to have unfiltered communication without going through Trusted Connections. “We feel that we’re really setting the standard for the world in what it means to have safe, open communication for a teen audience.”

According to Roblox’s briefing, the updates are part of Roblox’s typical development process and haven’t been “influenced by any particular event” or feedback. “It’s not a reaction to something,” Kaufman said. “This is part of our long term plan to make Roblox as safe as it can possibly be.”

Kaufman also tells WIRED that the heightened scrutiny and discussion of the game hasn’t had a dramatic impact on the company’s plans. “What we’re doing with this announcement is also trying to set the bar for what we think is appropriate for kids,” he says.

Looking at technology like generative AI, he says, “the technology may have changed, but the principles are still the same. We also look at AI as a real opportunity to be able to do some of the things that we do in safety at scale when we think about moderation. AI is central to that.”

In the briefing, he said Roblox believes it’s important for parents and guardians to “build a dialogue” with their kids about online safety. “It’s about having discussions about where they’re spending time online, who their friends are, and what they’re doing,” Kaufman said. “That discussion is probably the most important thing to do to keep people safe, and we’re investing in the tools to make that happen.” He added that Roblox is aware that families have different expectations about what’s appropriate online behavior. “If parents make a decision about what’s appropriate for their kid, it may not match the decisions that we might make, or I might make for my own family,” he said. “But that’s okay, and we respect that difference.”
