One year on, the new European Commission is abandoning human rights

Access Now Europe
Access Now
Published on 12/2/2025

Just over a year ago, as the new EU mandate was beginning, we laid out our desire to see EU leaders prioritize respect for digital rights in order to deliver sustainable prosperity, democracy, and social fairness for all. Sadly, we’ve been disappointed. As we break down in this blog post, the new European Commission has gone in the opposite direction, sidelining human rights in favour of bowing to industry lobbying demands and framing all policies through a security lens (also known as “securitization”).

Harmful deregulation under the guise of helpful simplification

Under the banner of “simplification”, the European Commission is pushing ahead with harmful deregulation. If this were only about cutting burdensome red tape or simplifying procedures while still upholding human rights protections, that would not necessarily be a bad thing. But the Commission’s simplification agenda is a blatant effort to strip away human rights safeguards, purely to benefit private companies’ bottom lines.

The most egregious example of this can be seen in the recently released Digital Omnibus Package, which proposes a dangerous watering-down of legislation including the AI Act and General Data Protection Regulation (GDPR), among other digital policies. The AI Act only began to be implemented last year, and most of its provisions have yet to be tested in action — but already the Commission is (once again) bending to industry pressure to roll back its human rights protections, in a way that risks completely undermining the legislation’s core tenets. One proposal, for instance, would remove a key oversight mechanism in the high-risk classification process. During negotiations around the AI Act, the private sector successfully lobbied for a derogation allowing providers to unilaterally exempt themselves from all obligations for high-risk systems — but this came with the obligation for such exemptions to be registered and explained in a public database. The Omnibus proposes to remove this requirement entirely, allowing irresponsible actors to exempt themselves from the AI Act’s obligations without trace or accountability.

Meanwhile, efforts are underway to undermine key parts of the GDPR, the cornerstone of digital rights protection in the EU. The AI Act, for instance, only makes sense as an addendum to the fundamental protections and safeguards the GDPR provides: its risk-based approach requires the GDPR’s rights-based foundation to be even remotely useful.

Despite this, the Commission is proposing amendments to core parts of the GDPR, introducing new legal bases for data processing and weakening the definition of personal data. This, in turn, will open the door to attacks on other key provisions and could weaken the entire edifice of digital rights protections in the EU.

Expanding surveillance on “security” grounds

Even as it pushes ahead with deregulation on the one hand, the Commission has, on the other, been rolling out wave upon wave of new legislation in the areas of security and migration, showing its intent to deregulate everything except policing and migration. Notably, in the past year we’ve continued to see migration policies used to cement Europe’s vision of a surveillance-based society, with the Commission expanding Europol’s powers, digitalising its deportation regime, and creating an EU war budget which earmarks billions of euros for surveillance technology and “fully digitalised border control management”.

The European Parliament recently approved the Commission’s proposal to expand the powers of the EU’s police agency, Europol, in the name of fighting “migrant smuggling”, despite civil society warning that this plan will further criminalise and persecute migrant people. This expansion is likely to continue into 2026, alongside the granting of new powers to the EU’s border agency, Frontex. It also comes hot on the heels of the Commission’s proposal for a “Return Regulation” that would expand the detention and deportation of migrant people, including children, in the EU, via the increased use of digital surveillance infrastructure.

Expanding such infrastructure is also foreseen under the EU’s Multi-Annual Financial Framework 2028-2034, laid out in July, and the ProtectEU internal security strategy, published in April, with the latter aiming to instate “technological solutions that would enable law enforcement authorities to access encrypted data in a lawful manner.” Creating a backdoor into encryption that only the ‘good guys’ can use is the stuff of science fiction, as we’ve said before; yet the Commission is pushing ahead as part of a wider “roadmap of lawful and effective access to data for law enforcement”, which also confirms increased funding for surveillance technology. Looking at all these initiatives together, one might wonder if the Commission read George Orwell’s Nineteen Eighty-Four as an instruction manual, rather than a cautionary tale.

Ignoring the increasing abuse of spyware

In parallel with its efforts to expand surveillance on security grounds, the Commission has been woefully slow to confront the EU’s spyware crisis. Despite this year’s revelations about the Italian government’s extensive use of Paragon spyware technology to surveil journalists, activists, and humanitarians, the Commission has yet to implement, or even respond to, recommendations on tackling spyware issued by the European Parliament three years ago. In the meantime, recent investigations have shown how EU innovation funds have flowed directly to commercial spyware vendors, with the Commission unable to offer any reassurances about safeguards, or details on current funding, when pressed by MEPs. In short, it looks like EU taxpayers are still bankrolling an industry that undermines democracy.

Enforcement of the Dual-Use Regulation, which governs the export of items with both civilian and military applications, including cybersurveillance exports, remains weak, while surveillance tools made in the EU continue to surface in repressive contexts. During the next evaluation of the regulation, planned for between 2026 and 2028, it is essential that the Commission finally close the loopholes that allow this to happen, by implementing transparency on licensing, increased information sharing when licenses are denied, and post-shipment checks that actually happen.

The EU digital rights agenda is also proving ill-equipped to protect civil society in exile from digital transnational repression. We have seen multiple instances of human rights defenders, journalists, and activists from non-EU countries targeted with spyware, phishing attacks, and online harassment while within EU borders, yet EU institutions have offered little in the way of redress for victims or accountability for perpetrators. The recent European Democracy Shield announcement, for instance, failed to highlight spyware abuses as a threat to democracy, but promised to scale up efforts to provide anti-surveillance solutions to “civil society actors and journalists under authoritarian regimes”, suggesting that the Commission has not acknowledged the full extent of abuse by and within EU member states.

Failing to hold platforms accountable

Another flagship piece of EU digital legislation failing to live up to its initial promise is the Digital Services Act (DSA), which was supposed to create “a safer digital space where the fundamental rights of users are protected.” Despite coming into full effect nearly two years ago, enforcement of the DSA’s key transparency and procedural protections remains a major stumbling block.

While the Commission has launched investigations into and issued preliminary findings against several large platforms for failure to protect people’s rights, including through a lack of algorithmic transparency or access to effective remedy, no substantial enforcement outcomes have yet materialized. Similarly, significant gaps in identifying systemic risks remain, as reported by civil society and researchers from Germany and Romania. While tech companies have been quick to paint the DSA as a tool for censorship, we have long made clear that censorship is neither the purpose nor the function of the regulation. On the contrary, a key aspect of the DSA is supporting people’s ability to appeal takedowns or censorship of their content.

This has profound implications for the EU’s wider global reputation as a defender of fundamental rights and freedoms. The European Commission must enforce the DSA in a way that upholds these rights and freedoms, and prevents censorship, as a counterweight to other regions’ opaque authoritarian models. Otherwise, the EU risks sending a message of much talk but no action, opening the door to future abuses.

At the same time, EU regulators must also remain vigilant about how their legislative efforts, including the DSA, are co-opted by third-country authoritarian regimes to justify censorship measures and target dissenting voices and independent journalists. We have already raised concerns about problematic bills in Brazil and Costa Rica, where vague definitions of illicit content and content moderation could fuel arbitrary content removal, undermining freedom of expression and expanding censorship.

Heading towards a dangerous digital dystopia

The new Commission’s decision to prioritize deregulation and securitization above all else is taking the EU in a dangerous direction, one where human rights, once seen as fundamental to the European project, are being sidelined. This will not make people’s lives easier, nor keep them safer. Rather, it will transform the EU into a digital dystopia and ultimately undermine the foundations of European democracy. With four years left in its current mandate, there is still time for this Commission to move beyond false binaries such as ‘regulation versus innovation’ and ‘privacy versus security’. There can be no prosperity without safeguards, and no true security in a surveillance state. We hope it makes the right choice and recommits to protecting human rights.