Roblox’s face scanning system, which estimates users’ ages before they can access the platform’s chat functions, was rolled out in the US and other countries around the world last week, after launching in some locations in December. Roblox says it is implementing the system so that users can safely chat with others of a similar age.
But players are already in revolt because they can no longer chat with their friends, developers are demanding that Roblox withdraw the update, and, crucially, experts say that not only is the AI mistaking young players for adults and vice versa, the system also does little to solve the problem it was designed to tackle: a flood of predators using the platform to groom young children.
In fact, WIRED found several eBay listings advertising age-verified accounts for minors under the age of 9 for just $4.
After WIRED flagged the listings, eBay spokesperson Maddie Martinez said the company was removing them for violating the site’s policies.
In an email, Roblox Chief Security Officer Matt Kaufman told WIRED that a change of this magnitude takes time on a platform with more than 150 million daily users.
“You can’t flip a switch when creating something that didn’t exist before,” he said. “To expect the system to become flawless overnight is to ignore the scale of this undertaking.”
Kaufman said the company is pleased with the rollout, noting that “millions of users” have already verified their age and claiming that “the vast majority of our community values a safer, more age-appropriate environment.”
The company addressed some of the criticism in an update on Friday, writing: “We are aware of instances where parents verify age on behalf of their children, causing the children to be 21+. We are working on a solution to address this and will share more here soon.”
Roblox announced the age verification requirement last July as part of new features designed to make the platform safer. The company has come under considerable pressure in recent months as multiple lawsuits allege that it failed to protect its youngest users and enabled predators to groom children.
The attorneys general of Louisiana, Texas, and Kentucky also filed lawsuits against the company last year making similar claims, while the attorney general of Florida issued a criminal subpoena to assess whether Roblox is “assisting predators in accessing and harming children.”
Roblox claims that requiring people to verify their age before allowing them to chat with others will prevent adults from freely interacting with children they don’t know.
Although the process is optional, declining to complete it means a person loses access to the platform’s chat function, a major reason most people use Roblox.
To verify their age, people are asked to take a short video using their device’s camera, which is processed by a company called Persona to estimate their age. Alternatively, users can upload a government-issued photo ID if they are 13 years of age or older.
Roblox says that all personal information is “deleted immediately after processing.” However, many users online say they are unwilling to undergo age verification because of privacy concerns.
People who have verified their age are only allowed to chat with other players in a narrow range around their own age. For example, verified players under the age of 9 can only chat with players up to the age of 13, while players estimated to be 16 years old can chat with players between the ages of 13 and 20.