Instagram and Facebook may be breaking the EU’s illegal content rules, according to the European Commission’s preliminary findings under the Digital Services Act (DSA). The Commission says Meta’s social platforms appear to have failed to comply with several transparency and moderation obligations designed to protect users across Europe.
The European Commission’s initial investigation suggests that Meta, and to a lesser extent TikTok, are not meeting DSA requirements. The platforms reportedly create “confusing” barriers for users trying to flag illegal content or appeal moderation decisions. Officials say Meta’s design choices — described as “dark patterns” — make it harder for users to report harmful material such as terrorism-related posts or child sexual abuse content.
TikTok and Meta were also cited for having restrictive procedures that limit researchers’ ability to access public data, which undermines transparency commitments under the DSA.
The Commission’s statement warns that Meta and TikTok could face fines of up to 6% of their global annual turnover if they fail to resolve these issues. For Meta, that could translate to billions of dollars, depending on the final decision.
Both companies now have the chance to respond to the allegations, challenge the findings, or take corrective measures before a formal decision is made. The EU’s next move will depend on whether Meta can demonstrate meaningful changes to its content moderation and transparency systems.
The DSA, which came into full effect in 2024, holds large online platforms accountable for managing illegal and harmful content. The law requires companies like Meta and TikTok to provide clear reporting tools, safeguard user rights, and share access to data with independent researchers.
The EU argues that by making it difficult for users to report illegal activity or understand moderation decisions, Meta is violating the DSA’s core principles of user safety and transparency.
Meta has yet to release a detailed statement, but it is expected to defend its moderation systems and maintain that it complies with EU standards. If the Commission issues a formal non-compliance decision, however, the case could force major operational changes for both Facebook and Instagram in Europe.
The Commission’s action underscores a growing effort by European regulators to hold tech giants accountable for their influence on public discourse and online safety.
For everyday users, the case highlights how social media platforms handle harmful or illegal content behind the scenes. If Meta and TikTok are found in violation, they could be required to redesign user interfaces to make reporting simpler and more transparent.
This development could also push other tech companies — including YouTube and X (formerly Twitter) — to reassess their moderation and data transparency practices before the EU turns its attention to them.
The EU’s ongoing probe sends a clear message: no platform is above European digital law. As the Digital Services Act gains traction, companies that fail to comply risk steep financial penalties and stricter oversight.
For now, the findings against Instagram and Facebook remain preliminary, but the outcome of this case will determine how far Europe is willing to go to enforce accountability in the digital age.
