European Union regulators have opened formal proceedings against Meta Platforms, the parent company of Facebook and Instagram, over concerns that it is failing to adequately protect children on its platforms. The European Commission, the EU’s executive arm, announced on Monday that the proceedings will examine whether the company has violated the bloc’s Digital Services Act (DSA). This landmark legislation, which became fully applicable in February 2024, imposes stringent obligations on online platforms to combat illegal content and protect users, particularly minors.
The core of the European Commission’s concern is Meta’s alleged failure to implement effective age verification mechanisms. Regulators have pointed out that the company lacks robust controls to verify a user’s self-declared date of birth. This shortcoming, according to the Commission, contravenes the DSA’s requirement that platforms accessible to minors put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security for them. Information reaching TahirRihat.com suggests that the investigation will scrutinize how Meta handles the personal data of minors and whether its systems are designed to prevent children from accessing inappropriate content or engaging in harmful online interactions.
The Digital Services Act is designed to create a safer online environment for all users within the European Union, with a particular emphasis on safeguarding vulnerable groups. Under the DSA, large online platforms like Meta are required to conduct risk assessments and implement measures to mitigate systemic risks, including those related to the dissemination of illegal content and the protection of minors. The European Commission’s decision to open formal proceedings indicates that it has gathered sufficient evidence to believe that Meta may be in breach of these obligations. This development marks a significant escalation in the EU’s efforts to hold Big Tech companies accountable for the societal impact of their services.
The investigation will delve into whether Meta’s design choices and operational practices on Instagram and Facebook inadvertently facilitate the exposure of children to harmful content. This could include issues such as cyberbullying, exploitative content, or content promoting self-harm. The Commission’s statement highlighted that the DSA prohibits platforms from presenting advertisements to minors based on profiling of their personal data, and requires platforms to put effective enforcement measures in place to detect and address violations of its child-protection obligations. The lack of effective controls for verifying users’ ages is seen as a critical failing in this regard.
Meta has been given an opportunity to respond to the allegations, and the Commission will consider the company’s submissions as part of its investigation. If Meta is found to be in violation of the DSA, the proceedings could lead to fines of up to 6% of the company’s annual global turnover, a figure that could run into billions of dollars for a company of Meta’s size. This regulatory pressure underscores the growing scrutiny faced by social media giants regarding their responsibilities in shaping the online experiences of their users, especially the young and impressionable.
The European Commission’s action is part of a broader trend of increased regulatory oversight of technology companies across the globe. Many jurisdictions are grappling with the challenges posed by the internet’s pervasive influence and the potential for harm, leading to the development of new legal frameworks aimed at ensuring greater accountability. The DSA, in particular, represents a comprehensive effort by the EU to set global standards for online safety and platform responsibility. The outcome of this investigation into Meta’s child safety measures will likely have far-reaching implications for how social media platforms operate and are regulated in the future, not just within the EU but potentially influencing global best practices.
Sources indicate to TahirRihat.com that the investigation will also examine the effectiveness of Meta’s content moderation policies and enforcement mechanisms as they pertain to child protection. While Meta has previously stated its commitment to child safety and has implemented various tools and policies, the EU regulators appear unconvinced that these measures are sufficiently robust or effectively implemented to meet the requirements of the DSA. The focus on age verification is particularly crucial, as it forms a foundational layer for many other child protection measures. If platforms cannot reliably determine a user’s age, their ability to tailor content, enforce age-specific restrictions, and prevent harmful interactions is significantly compromised.
The European Commission’s statement specifically noted that Meta has not provided sufficient evidence that it has taken appropriate and proportionate measures to mitigate the risks of harm to minors. This suggests that the company’s current efforts, such as relying on users’ self-declared ages, are deemed insufficient by the EU’s digital watchdog. The investigation will likely involve a detailed review of Meta’s internal systems, algorithms, and policies related to age assurance and the protection of children’s data. The implications of the probe extend beyond fines: it could lead to mandated changes in how Meta designs and operates its services, potentially affecting its business model and user experience across its platforms.
The scrutiny of Meta’s child safety practices is not new, but the formal opening of proceedings under the DSA signifies a more serious and legally binding phase of the regulatory process. The Commission’s move is a clear signal that it expects platforms to proactively address potential harms rather than reactively respond to incidents. The digital landscape is constantly evolving, and regulators are keen to ensure that the legal frameworks keep pace with technological advancements and the emerging challenges they present, particularly concerning the well-being of younger generations who are often the most avid users of these digital services.
The European Commission’s decision to initiate formal proceedings against Meta underscores the significant responsibilities that large online platforms bear in safeguarding their users. The DSA represents a robust attempt to bring order and accountability to the digital realm, and this investigation into Meta’s child protection measures will be a crucial test case for the effectiveness of this new regulatory regime. The outcome will be closely watched by other technology companies, regulators worldwide, and civil society organizations advocating for a safer and more responsible internet for children.

Tahir Rihat (also known as Tahir Bilal) is an independent journalist, activist, and digital media professional from the Chenab Valley of Jammu and Kashmir, India. He is best known for his work as the Online Editor at The Chenab Times.