The European Commission has formally accused Meta (parent company of Facebook and Instagram) and TikTok of violating key provisions of the Digital Services Act (DSA), asserting that both companies have failed to effectively address illegal content, obstructed researchers’ access to platform data, and, in Meta’s case, implemented an appeals process that does not comply with regulatory requirements. The Commission has granted both companies an opportunity to remedy these shortcomings; failure to do so could result in fines of up to 6% of their global annual revenue.
With respect to Meta, the European Commission outlined two principal allegations:
- Inadequate mechanisms for reporting illegal content: The Commission noted that Facebook and Instagram’s current systems require users to complete “multiple steps” before they can flag illegal material, such as child sexual abuse material (CSAM). Furthermore, the interface employs “dark design patterns” that make the reporting process confusing and discourage users from submitting reports. These practices, the Commission stated, violate the DSA’s requirement that online platforms provide “easy-to-use” mechanisms for reporting illegal content.
- Ineffective appeal procedures for content removals or account suspensions: Under the DSA, users must be able to contest a platform’s decision to remove content or suspend an account. However, the Commission found that both Facebook and Instagram prevent users from “explaining their position” or “submitting evidence” during an appeal, severely limiting the process’s effectiveness and transparency.
In addition to these accusations against Meta, the Commission also charged that both Meta and TikTok have established overly complex systems and procedures that make it unreasonably difficult for researchers to access publicly available data, as required by the DSA.
The Commission emphasized that such restrictions hinder researchers’ ability to conduct critical studies—such as examining how minors are exposed to illegal or harmful content online—leaving them with incomplete or unreliable information. It reaffirmed that “granting researchers access to platform data is a mandatory transparency obligation under the DSA.”
According to the Act, Meta and TikTok will be permitted to review the Commission’s findings and respond either through written submissions or by implementing measures to bring their operations into compliance with the DSA.
Should the Commission ultimately determine that the companies remain non-compliant, they could face fines of up to 6% of their total global annual revenue.
Meta has insisted that it is already in compliance with the law, while TikTok has argued that the DSA’s requirements conflict with the General Data Protection Regulation (GDPR).
In a statement to the Financial Times, Meta asserted: “Since the DSA took effect in the EU, we have updated our content reporting options, appeals processes, and data access tools, and we believe these measures meet the legal requirements set forth under European law.”
TikTok, meanwhile, stated that it is reviewing the Commission’s findings but warned that the demand to relax data protection safeguards creates a direct conflict between the DSA and the GDPR. The company has called on regulators to provide clear guidance on how to reconcile these obligations.