In our latest LinkedIn Live session, CEO Joost Gerritsen sat down with Rosalia Anna D’Agostino and Peter Hense from Spirit Legal to discuss the far-reaching effects of the recent CJEU Russmedia case. The discussion revolved around “nudify” tools and AI-generated imagery as a stress test for what the CJEU has now clarified: platform accountability under the GDPR cannot be neutralised by invoking hosting immunity narratives.
The Russmedia case perspective
The Court of Justice of the European Union’s Grand Chamber judgment in X v Russmedia Digital and Inform Media Press (C-492/23) reframes how we should understand platform liability, both from an eCommerce and a GDPR perspective. Online services that provide the “stage” for dissemination can face full GDPR duties and remedies even where intermediary liability exemptions would traditionally be invoked. In essence, an online marketplace was found to be a controller for the personal data contained in an advertisement published by one of its users. The advertisement falsely presented a woman as offering sexual services and continued to circulate on the internet even after the platform had taken the content down.
This position has immediate consequences for privacy professionals, supervisory authorities, law firms, and businesses navigating generative AI misuse and non-consensual image exploitation. Digibeetle monitors these developments in its EU digital law database and connects them to the broader EU regulatory landscape via its legislation tracker.
The problem scenario: nudify tools, virality, and regulatory asymmetry
Rosalia situates nudify applications within a broader pattern of non-consensual image exploitation. The technological leap is not merely about aesthetic manipulation; it provides turnkey exploitation machinery. Where earlier harms required persuading someone to send an intimate image, AI-driven nudification enables unilateral creation, often anonymously, and at scale. The ecosystem looks like this:
- Low-friction generation. AI tools can create sexualised depictions of real individuals without consent.
- High-friction accountability. Responsible entities may be hard to identify, locate, or subject to enforcement.
- Amplification through platforms. Even where content is generated elsewhere, dissemination via social platforms and recommender systems multiplies the harm.
- Divergent enforcement models. Some jurisdictions rely on digital services regulation, others on criminal or obscenity law frameworks.
The session emphasised that the governance challenge is systemic. The issue is not one application or one company, but a recurring structural dynamic: services that enable dissemination while attempting to describe themselves as “neutral intermediaries”.
The Russmedia effect: GDPR obligations are not capped by hosting immunity
Peter Hense’s central point is that the Grand Chamber in C-492/23 makes explicit what EU law has long implied. Liability exemptions in the intermediary framework do not limit GDPR obligations or remedies. The case arose from a Romanian online marketplace, yet the reasoning extends beyond that context.
The conventional compliance narrative suggested a binary: active platforms assume responsibility; passive hosts benefit from immunity. The Russmedia judgment destabilises that dichotomy. Under GDPR logic, the decisive factor is whether the platform determines the purposes or essential means of processing (including dissemination mechanisms), not whether it authored the content.
- Full applicability of the GDPR means platforms engaging with EU markets remain subject to GDPR duties and judicial remedies.
- Intermediary exemptions do not override data protection. Hosting status does not erase controller responsibilities where control over dissemination exists.
- Fundamental rights primacy means that liability regimes must not undermine the protection of personal data as a fundamental right.
Controllership by design: dissemination architecture as a compliance trigger
A key insight from the session is that controllership may arise through dissemination architecture. The CJEU analysed how the platform structured publication, categorisation, and availability of user-generated advertisements. In practical terms, this analysis extends naturally to social media feeds, algorithmic ranking, and recommender systems.
As Peter Hense summarised: whoever builds the stage, raises the curtain, and profits from the ticket sales cannot claim to have nothing to do with the play being performed. In GDPR terms, the focus is on infrastructure decisions:
- Presentation and duration settings: defining how long and in what format content is available can indicate determination of processing means.
- Categorisation and ranking: structuring and ordering visibility may demonstrate operational control.
- Upload-stage safeguards: inadequate preventive measures can be decisive where unlawful dissemination results.
The implication is that platforms cannot rely on a purely passive self-characterisation if their technical and organisational design actively structures dissemination.
Accountability as preventive governance: Articles 24 and 25 GDPR
The Russmedia effect is not limited to controllership analysis. The judgment reinforces that the accountability principle under Article 5(2) GDPR and the controller’s duties under Articles 24 and 25 require operational compliance. Accountability is both evidentiary and preventive.
The talk hosted by Digibeetle highlighted that this aligns with long-standing risk management principles in data protection practice. However, the judgment gives sharper judicial weight to those principles, particularly where processing involves sensitive data or heightened-risk contexts such as sexual imagery.
For high-risk features, including generative image tools and public dissemination systems, default configurations and preventive safeguards are likely to receive increased scrutiny.
The DSA–GDPR interplay and the limits of “no general monitoring”
The discussion also addressed the relationship between GDPR obligations and the Digital Services Act (DSA). While the DSA prohibits general monitoring obligations, it does not prevent proportionate preventive measures. The European Data Protection Board’s draft Guidelines 3/2025 on the interplay between the DSA and the GDPR clarify that DSA compliance often entails personal data processing, which must itself respect GDPR requirements.
The practical message is not that platforms must monitor everything. Rather, they cannot rely on deliberate blindness where risks are foreseeable and preventable. Robust notice-and-action mechanisms, meaningful user verification processes where proportionate, and governance of recommender systems become critical components of compliance.
Litigation, collective redress, and director exposure
Beyond regulatory enforcement, the Digibeetle Live session anticipated growth in private litigation. GDPR provides for judicial remedies, damages, and injunctions. Where harms are systemic, collective redress mechanisms may transform diffuse injuries into structured claims.
An additional development discussed in the episode concerns director liability under national tort frameworks. While not a GDPR case, the CJEU’s reasoning in Case C-77/24 (Wunner) on the cross-border tort liability of company directors underscores that corporate structures do not necessarily insulate individuals from exposure in certain contexts. The strategic implication is that enforcement dynamics may extend beyond corporate entities to the individuals who direct them.
Practical implications for key stakeholders
- Supervisory authorities may increasingly assess dissemination design, including ranking systems and default settings, as part of compliance audits. Coordination between GDPR enforcement and DSA supervisory structures will be central.
- Law firms can draw on Russmedia to develop more nuanced pleadings around controllership, preventive duty failures, and accountability documentation. Digibeetle supports law firms with cross-referenced case law and structured analysis tools.
- Businesses and platforms should review governance frameworks beyond notice-and-takedown. Product teams must consider whether dissemination architecture, ranking logic, or default settings strengthen a controllership argument. Digibeetle’s resources for businesses translate jurisprudential signals into operational guidance.
- For supervisory authorities, universities, and policy researchers, open doctrinal questions remain, including the interaction between Article 85 GDPR and platform duties. Digibeetle collaborates with universities to support research-to-practice feedback loops.
Drawing actionable information from the Russmedia case
The Russmedia effect will continue to shape enforcement, litigation, and product governance, especially where AI features intersect with public dissemination. Digibeetle is an expert-curated platform designed to translate complex EU digital law developments into structured, cross-referenced insights.
Explore the EU digital law database and the legislation tracker to trace how case law, regulation, and supervisory practice interact. You can start a 30-day free trial here: https://app.digibeetle.eu/register/.
If you would like to discuss how Russmedia affects your compliance posture or supervisory approach, you can book a consultation. In an environment defined by rapid digital regulation and evolving jurisprudence, access to daily updates and actionable intelligence is a strategic advantage.