Meta ‘violating EU law’ by not keeping minors off Facebook, Instagram
The European Commission has found that Meta breached EU law by failing to prevent under-13s from accessing its platforms, as scrutiny of the tech giant’s handling of child safety intensifies.
The commission said Wednesday that its preliminary investigations concluded that Meta violated the EU’s Digital Services Act because the minimum age requirement of 13 for Instagram and Facebook is not adequately enforced.
When creating an account, minors can input a false birth date, with no controls in place to verify it, the commission said.
Additionally, the tool for reporting a minor’s account is “difficult to use” and requires up to seven clicks to access the form, the commission said. Even when a minor’s account is reported, the commission found that there is often no adequate follow-up or measures taken to remove them from the platform.
“The Commission considers that Instagram and Facebook must change their risk assessment methodology, in order to evaluate which risks arise on Instagram and Facebook in the European Union, and how they manifest,” the commission said in its announcement.
A Meta spokesperson told CNBC: “We disagree with these preliminary findings. We’re clear that Instagram and Facebook are intended for people aged 13 and older and we have measures in place to detect and remove accounts from anyone under that age.
“We continue to invest in technologies to find and remove underage users and will have more to share next week about additional measures rolling out soon. Understanding age is an industry-wide challenge, which requires an industry-wide solution, and we will continue to engage constructively with the European Commission on this important issue.”
Meta can now review the commission’s preliminary investigation findings and respond in writing.
If the commission’s preliminary findings are confirmed by its final investigation, it can fine Meta up to 6% of its total worldwide annual turnover.
The commission’s preliminary findings come after two high-profile U.S. court rulings in March: one found that aspects of Meta’s platform design contributed to addiction and mental health harms among teenagers, while the other concluded that the company misled users about children’s safety on its platforms.