EU Investigates Meta’s Compliance with Digital Services Act

The European Union (EU) has once again set its sights on a major social media company, this time launching a formal investigation into Meta, the parent company of Facebook and Instagram. The investigation, opened under the bloc's new Digital Services Act (DSA), centres on whether Meta does enough to protect younger users on its platforms.


Overview of the Investigation

The EU Commission announced on May 16, 2024, that it is scrutinising Meta’s compliance with the DSA, specifically regarding the protection of minors. The Commission’s concerns revolve around whether Meta’s systems and algorithms contribute to behavioural addictions among children, as well as the effectiveness of age-assurance and verification methods employed by the company.

The DSA mandates that large social platforms operating within Europe must implement “appropriate and proportionate measures to protect minors”. This includes preventing elements that might stimulate behavioural addictions. The investigation aims to determine if Meta’s efforts align with these stringent requirements.


The EU Commission’s Statement

The EU Commission’s official statement highlighted the dual focus of the investigation. Firstly, there is concern that Facebook and Instagram’s algorithms may lead to addictive behaviours in children. Secondly, there is scepticism about the robustness of Meta’s age verification processes.

“Today, the Commission has opened formal proceedings to assess whether Meta, the provider of Facebook and Instagram, may have breached the Digital Services Act (DSA) in areas linked to the protection of minors. The Commission is concerned that the systems of both Facebook and Instagram, including their algorithms, may stimulate behavioural addictions in children, as well as create so-called ‘rabbit-hole effects’. In addition, the Commission is also concerned about age-assurance and verification methods put in place by Meta,” the statement read.


Meta’s Response

Meta has responded by asserting that it has implemented a wide array of tools and policies aimed at protecting children. The company expressed its willingness to cooperate fully with the EU Commission to address these concerns.

A Meta spokesperson stated, “We have developed and implemented numerous tools to ensure the safety of young users on our platforms. We look forward to engaging with the EU Commission to demonstrate our commitment to these protective measures.”


Potential Implications

The investigation into Meta is part of a broader regulatory push by the EU to enforce the DSA. The act, which became fully applicable in February 2024, is designed to regulate large online platforms and ensure they adhere to standards that protect users, particularly minors.

The DSA imposes significant penalties for non-compliance, including fines of up to 6% of a company's annual global turnover. For a company of Meta's size, that could amount to billions of dollars, so the stakes are exceedingly high.
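The 6% cap can be made concrete with a back-of-envelope calculation. The sketch below assumes a hypothetical annual turnover of $134.9 billion, roughly Meta's publicly reported 2023 revenue; the figure is an illustration only and is not part of the Commission's announcement:

```python
# Back-of-envelope estimate of the DSA's theoretical maximum fine:
# up to 6% of a company's annual global turnover.
ASSUMED_ANNUAL_TURNOVER_USD = 134.9e9  # hypothetical figure for illustration
DSA_MAX_FINE_RATE = 0.06               # DSA cap: 6% of global turnover

max_fine_usd = ASSUMED_ANNUAL_TURNOVER_USD * DSA_MAX_FINE_RATE
print(f"Theoretical maximum fine: ${max_fine_usd / 1e9:.1f} billion")
# → Theoretical maximum fine: $8.1 billion
```

Under that assumption, the ceiling alone exceeds eight billion dollars, which illustrates why compliance findings under the DSA carry such weight.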


Previous Investigations and Broader Context

This investigation is not an isolated incident. The EU has already initiated probes into other social media giants like X (formerly Twitter) and TikTok. Additionally, Meta itself is under scrutiny for the distribution of Russian-originated disinformation on its platforms.

The EU’s rigorous enforcement of the DSA signals a shift towards stricter regulation of digital services. The intent is to bring social platforms in line with new legal frameworks designed to protect users and ensure transparency.


Challenges Ahead

Enforcing the DSA presents several challenges. There is a degree of interpretation involved in the application of these rules, which could lead to prolonged debates and legal challenges. Proving that specific algorithms cause behavioural addictions in children, for instance, is complex and requires substantial evidence.

Moreover, the dynamic nature of social media platforms means that regulatory measures must continuously evolve to keep pace with technological advancements and emerging threats. This ongoing evolution presents a persistent challenge for both regulators and the companies they oversee.


The EU Commission’s investigation into Meta underscores the heightened scrutiny that social media companies face under the DSA. As regulators tighten their grip, companies must adapt and demonstrate compliance with new standards designed to protect users, particularly vulnerable minors. Meta’s cooperation with the EU Commission will be crucial in determining the outcome of this investigation and setting a precedent for future regulatory actions.

In this evolving landscape, both the EU and major social platforms like Meta must navigate the complexities of regulation, user protection, and technological innovation. The findings of the investigation will likely have far-reaching implications, influencing how digital services operate within Europe and beyond.
