Roblox bets on facial scanning to keep its youngest users safe

SCOTT DETROW, HOST:

The wildly popular online gaming platform Roblox has started rolling out facial age verification to enhance its security measures. The move follows a string of lawsuits alleging the platform has allowed predators to groom and sexually abuse children. Of the game's 80 million daily players, 40% are under the age of 13.

So how effective is this measure, and what can parents do to protect children from online abuse? We are joined now by Justin Patchin. He's a professor of criminal justice at the University of Wisconsin-Eau Claire and has studied child internet safety for decades. Welcome to ALL THINGS CONSIDERED.

JUSTIN PATCHIN: I'm happy to be with you, Scott. Thank you.

DETROW: Let's start out with this for people who do not have preteens in their lives. What's a quick overview of what Roblox is, why it's such a big deal?

PATCHIN: Well, Roblox is an online gaming platform, basically. It's an app that you can connect to that's got literally millions of games or experiences. And I think that's part of the problem, is there's so many different experiences on the platform, and it's hard to regulate all of them. So people - users can create their own games. They can play games created by others. They can interact with people all over the world. And some research from last year found that upwards of 80% of 9- to 12-year-olds have used Roblox.

DETROW: And the other context I wanted to get through before we get into the steps Roblox is taking is, is there really something specific about Roblox that makes this a place that so many predators are focused, or is it just the fact that this is a big waystation for so many younger kids to be using the internet?

PATCHIN: I think it's a couple of things. It is super popular right now, especially among that younger demographic. And so whenever you have, you know, kids just flocking to an environment like this, it's certainly going to attract the bad actors. So I also think it's become very popular very quick. And so I know Roblox is trying to do everything they can to ensure safety and to protect kids, but it's just blown up in popularity, so it's hard for Roblox to keep up.

DETROW: Yeah. Specifically, these new measures include requiring facial verification for players. Roblox says this will enable the platform to limit player interactions to the same age group. What do you make of that?

PATCHIN: It's an interesting idea. You know, as a researcher, I'm very curious to see how it'll work. And so I hope there's some efforts to evaluate its effectiveness of - you know, of sort of age-grouping these children.

DETROW: I mean, so much of this feels like lose, lose, lose for parents, right? I've got kids who are not quite Roblox age, but the idea of facial scanning makes me uncomfortable. Obviously, the idea of being in a place where there are bad actors makes me uncomfortable. And the idea of saying, no, you can't use this platform that so many other kids are using also feels like not the right move. Like, how do you suggest parents try to navigate this?

PATCHIN: It's true, it does seem like a big challenge for parents. It is a popular place. And to be honest, the vast majority of the games and experiences are perfectly fine. The vast majority of kids who use Roblox do just fine and aren't targeted by predators. And I think as a parent, we just need to be aware of these issues, certainly when our kids are first exploring Roblox or any social media platform, gaming platform. We need to be a part of that experience. We need to go on the app with them. We need to better understand what your kids are doing, who they're interacting with.

And when they're first starting out, it's actually pretty easy to block certain experiences, to restrict access to communication tools, meaning people aren't able to chat with them or message them. And so I think early on especially, limiting, you know, the game experience to the people you trust - family members, for example - and then sort of easing into more experience as they gain more kind of awareness of the potential for problems on these apps. So the big thing is parents need to be a part of the solution and a part of the experience early on.

DETROW: That's Justin Patchin, a professor of criminal justice at the University of Wisconsin-Eau Claire. Thank you so much.

PATCHIN: Thank you.

Transcript provided by NPR, Copyright NPR.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.

Ahmad Damen
Ahmad Damen is an editor for All Things Considered based in Washington, D.C. He first joined NPR and WBUR's Here & Now as an editor in 2024. Damen brings more than 15 years of experience in journalism, with roles spanning six countries.
Scott Detrow is a White House correspondent for NPR and co-hosts the NPR Politics Podcast.