Danish Kapoor

The European Union found Meta’s child protection measures inadequate

The European Union has reached the preliminary view that Meta has not taken sufficient measures to protect child users on its Facebook and Instagram platforms. According to the initial findings of an investigation by the European Commission, the company may have violated the Digital Services Act (DSA). As reported by the Financial Times, the Commission considers Meta's current systems insufficient, particularly at preventing users under the age of 13 from accessing the platforms. This exposes Meta to a serious risk of sanctions across Europe. The review is not yet complete, but the findings increase the pressure on the company.

The Commission's findings point to significant flaws in Meta's age verification processes. One notable problem is that age information can easily be misrepresented when registering on the platforms. The cumbersome process for reporting users under 13 has also drawn criticism. Furthermore, the Commission describes Meta's risk assessment of children's use of its platforms as "incomplete and inconsistent", arguing that the company's current approach does not match the data on the ground.

According to data shared by the Commission, approximately 10 to 12 percent of children under 13 across the European Union use Instagram or Facebook. Despite this, the Commission says this prevalence is not sufficiently accounted for in Meta's assessments. It also notes that although scientific studies show younger users are more vulnerable to the negative effects of social media, these findings have been disregarded. This strengthens the case for regulators to demand stricter measures from the platforms.

European Union demands stronger age verification measures from Meta

The European Commission wants Meta both to strengthen its age detection mechanisms and to revise its risk assessment methods. Otherwise, the company could face a fine of up to 6 percent of its global annual revenue. The process has not yet reached a final decision, however: Meta has the right to respond to the Commission's findings and to take corrective steps for the problems identified.

In its statement, Meta emphasizes that Facebook and Instagram are designed for users aged 13 and over. The company says it uses various systems to detect and remove users under the age limit, that it continues to invest in technologies in this area, and that new measures will take effect soon. Still, the current criticism raises questions about the adequacy of these measures.

The European Commission's investigation into Meta was launched in 2024, with a particular focus on the risk of social media addiction among children. Looking ahead to 2026, regulators and technology companies are seeking a common approach to age verification technologies, although these technologies are known to raise privacy concerns of their own. The age verification application developed by the European Union stands out as a model that could serve as a reference point for both member states and technology companies. These developments signal that digital platforms' responsibilities for child safety will be monitored more strictly in the coming period.
