The new approach is one of a series of measures Meta has adopted as part of an AI-based security strategy designed to address the limitations of traditional methods, which rely heavily on self-reported age. With this change, the company aims to make it harder for minors to access platforms that are, in theory, restricted to them.
In a press release, Meta explained that it is applying a number of tools to identify indicators that allow it to estimate a person’s age. The process involves analyzing posts, comments, bios, and descriptions, with particular attention to references to school years or birthday celebrations – elements that may offer clues about the real age of the person managing the account.
These tools are in addition to automated analysis techniques that aim to detect physical traits from imagery shared on Meta’s social platforms. These include characteristics such as height and bone structure. Meta is careful to stipulate that this system is not facial recognition, as it does not try to identify specific individuals in images or videos. Instead, the company notes that, “by combining these visual insights with our analysis of text and interactions, we can significantly increase the number of underage accounts we identify and remove.”
If, based on these signals, Meta suspects that an account is managed by a child under 13, it will suspend the account. The user must then re-verify their age through the procedures Meta has established in order to regain access; otherwise, the profile will be permanently deleted.
Meta also announced that it will expand the scope of its technology to detect users aged 13 to 15 and automatically assign them teen accounts. These profiles come with content restrictions and parental controls enabled by default, with the aim of providing a safer environment for this age group.
Meta began implementing age-verification technology in 2024 for Instagram users in the United States, Australia, Canada, and the United Kingdom. Now, this mechanism will be extended to Instagram accounts in Brazil and 27 EU countries. Additionally, these practices will first be applied to Facebook users in the US, with plans to expand to the EU and UK next month.
The new measures have been interpreted as a response to a preliminary ruling recently issued by the European Commission, which concluded that the Mark Zuckerberg-led company is in breach of the Digital Services Act for allegedly failing to effectively prevent children under 13 from using its platforms. The EU body found that the company lacked sufficiently effective mechanisms to prevent such access and that its existing system for identifying and suspending accounts under the age limit was inadequate.
These criticisms are supported by the results of a survey conducted by the non-profit Internet Matters. After surveying almost 1,300 children and their parents in the UK, the study found that almost one-third of children have successfully bypassed government-mandated age restrictions on access to social networking sites, in some cases using methods that proved particularly effective.
The report, titled “Online Safety Act: Are Children Safe Online?”, found that 46 percent of children aged 9 to 16 believe it is very easy to circumvent age checks, although overall only 32 percent admitted to actually doing so.