The European Union has issued a preliminary finding that Meta is failing to prevent children under the age of 13 from accessing Facebook and Instagram, a breach that could lead to massive financial penalties. Under the Digital Services Act (DSA), the tech giant faces potential fines of up to six percent of its total global annual turnover if it does not strengthen its age-verification and removal measures.
An EU investigation has identified several critical failures in Meta's systems for protecting minors. Regulators found that the platforms lack effective age controls, allowing children to bypass restrictions simply by entering false birth dates. Furthermore, the process for reporting underage users was found to be overly complex, requiring up to seven clicks to reach the reporting form.
The EU also highlighted a discrepancy in Meta's internal risk assessments, which underestimated the presence of children under 13 on its platforms despite evidence showing they make up 10 to 12 percent of the user base. These failures have raised significant safety concerns, with officials warning that young users are being exposed to "age-inappropriate experiences," ranging from violent content to bullying and harassment.
EU tech chief Henna Virkkunen emphasized that terms and conditions must be backed by "concrete action" rather than being "mere written statements". While Meta has publicly stated that its platforms are intended for those 13 and older and claims to have measures in place to remove underage accounts, the EU is demanding more robust enforcement.
This move is part of a broader European crackdown on Big Tech to protect minors, which includes exploring a bloc-wide age limit for social media and the development of an EU-wide age-verification app expected by the end of 2026. Meta now has the opportunity to respond to these findings or offer remedies to avoid the looming fines.
