Russmedia Case Implications for Platform Accountability

Russmedia (C-492/23) clarifies when platforms become GDPR controllers for user-posted ads, with key implications for platform accountability, the DSA interplay, and AI “nudify” harms.

Key takeaways

  • A platform was held to be a GDPR controller for a user-posted ad it did not create. Hosting immunity offered no defence.
  • Controllership turns on dissemination architecture. Structuring how content is published, categorised, or ranked can trigger full GDPR controller duties.
  • Director liability is on the table. The ruling, read alongside Wunner (C-77/24), signals that enforcement may reach beyond corporate entities to individuals.

Can a platform escape GDPR accountability by claiming it merely hosts user content? The Court of Justice of the European Union has now answered that question. In its Grand Chamber judgment in X v Russmedia Digital and Inform Media Press (C-492/23), the CJEU ruled that an online marketplace was a controller for personal data in a user-posted advertisement, even though the platform did not create the content itself. The case involved a fabricated ad offering sexual services using a real person’s identity, and the platform’s dissemination mechanisms played a decisive role in the harm.

The implications reach far beyond one Romanian marketplace. The ruling reshapes how we understand platform accountability under GDPR, particularly where AI-generated content, algorithmic amplification, and non-consensual imagery are involved.

To unpack the ruling’s consequences, Digibeetle CEO Joost Gerritsen sat down with Rosalia Anna D’Agostino and Peter Hense from Spirit Legal in a LinkedIn Live session. Here are the key points from that conversation.

Why the Russmedia ruling changes the controllership debate

Before Russmedia, the conventional compliance narrative offered a convenient binary: active platforms assume controller responsibilities, while passive hosts benefit from intermediary immunity. The Grand Chamber has dismantled that framing.

The decisive question is no longer whether a platform authored the content. It is whether the platform determines the purposes or essential means of processing, including the mechanisms through which content is disseminated. An online marketplace that structures publication, categorisation, and availability of user-generated advertisements exercises control over processing, even if the content originates elsewhere.

Three principles emerge from the ruling. First, full applicability of the GDPR means platforms engaging with EU markets remain subject to all GDPR duties and judicial remedies. Second, intermediary exemptions under eCommerce or hosting frameworks do not override data protection obligations. Third, fundamental rights primacy requires that liability regimes never undermine the protection of personal data as a fundamental right.

AI nudify tools expose the structural gap in platform governance

Rosalia Anna D’Agostino placed the Russmedia ruling in the context of non-consensual image exploitation. AI-driven nudification represents a new category of harm: where earlier abuses required someone to share an intimate image, today’s tools enable unilateral creation of sexualised depictions, often anonymously and at scale.

The governance challenge is systemic, not limited to a single application or company. Low-friction generation tools allow anyone to create sexualised depictions of real individuals. Accountability remains high-friction because responsible entities are often difficult to identify or locate. Platform amplification multiplies the harm, as recommender systems and social feeds spread content far beyond its origin. And enforcement remains fragmented, with jurisdictions split between digital services regulation, criminal law, and obscenity frameworks.

Russmedia offers a partial answer to this structural gap. A platform that provides the stage for dissemination cannot simultaneously claim it bears no responsibility for what appears on it.

Controllership by design: when dissemination architecture triggers GDPR duties

The CJEU analysed how the Russmedia platform structured publication, categorisation, and availability of user-generated advertisements. That analysis translates directly to social media feeds, algorithmic ranking, and recommender systems. As Peter Hense put it: whoever builds the stage, raises the curtain, and profits from the ticket sales cannot claim to have nothing to do with the play being performed.

In GDPR terms, the focus falls on infrastructure decisions. Presentation and duration settings, which define how long and in what format content remains available, can indicate determination of processing means. Categorisation and ranking logic, which structure and order visibility, may demonstrate operational control. And the absence of upload-stage safeguards can be decisive where unlawful dissemination results.

The implication is clear: platforms cannot rely on a purely passive self-characterisation if their technical and organisational design actively structures how personal data reaches an audience.

Accountability as preventive governance under Articles 24 and 25 GDPR

The Russmedia ruling does not stop at controllership analysis. It reinforces that the accountability principle under Article 5(2) and the controller duties under Article 24 require operational compliance, not just documentation. Accountability is both evidentiary and preventive.

The judgment gives sharper judicial weight to risk management principles that data protection practitioners have long advocated, particularly where processing involves sensitive data or heightened risk contexts such as sexual imagery. For high-risk features, including generative image tools and public dissemination systems, default configurations and preventive safeguards (the core of data protection by design and by default under Article 25) will receive increased scrutiny from supervisory authorities and courts alike.

The DSA-GDPR interplay: deliberate blindness is not a compliance strategy

The Digital Services Act prohibits general monitoring obligations, but that prohibition does not shield platforms from proportionate preventive measures. The European Data Protection Board’s draft Guidelines 3/2025 on the interplay between the DSA and the GDPR clarify that DSA compliance itself often entails personal data processing, which must respect GDPR requirements independently.

The practical message is not that platforms must monitor everything. It is that they cannot rely on deliberate blindness where risks are foreseeable and preventable. Robust notice-and-action mechanisms, meaningful user verification processes where proportionate, and governance of recommender systems become critical compliance components.

Litigation, collective redress, and director exposure

Beyond regulatory enforcement, the session anticipated growth in private litigation. GDPR provides for judicial remedies, damages, and injunctions. Where harms are systemic, collective redress mechanisms may transform diffuse injuries into structured claims against platforms.

An additional dimension concerns director liability under national tort frameworks. While not a GDPR case itself, the CJEU’s reasoning in Case C-77/24 (Wunner) on cross-border tort liability of company directors underscores that corporate structures do not necessarily insulate individuals from exposure. Enforcement dynamics may therefore extend beyond corporate entities to the people who make governance decisions.

What different stakeholders should do now

  1. Supervisory authorities should assess dissemination design, including ranking systems and default settings, as part of compliance audits. Coordination between GDPR enforcement and DSA supervisory structures will be essential.
  2. Law firms can draw on Russmedia to develop more nuanced pleadings around controllership, preventive duty failures, and accountability documentation.
  3. Platforms and businesses should review governance frameworks beyond notice-and-takedown. Product teams must consider whether dissemination architecture, ranking logic, or default settings strengthen a controllership argument against them.
  4. Universities and policy researchers should examine the open doctrinal questions, including the interaction between Article 85 GDPR and platform duties, to feed research back into practice.

The bottom line

The Russmedia ruling confirms what many privacy professionals suspected but lacked judicial backing for: platforms that structure how personal data reaches an audience are controllers, full stop. Hosting immunity does not erase GDPR responsibilities. Intermediary status does not reduce them.

For platforms deploying AI features, algorithmic ranking, or user-generated content systems, the question is no longer whether GDPR applies. It is whether the governance, safeguards, and accountability structures are robust enough to withstand scrutiny, both from regulators and from courts.

Digibeetle monitors these developments in its EU digital law database and connects them to the broader regulatory landscape via its legislation tracker. To discuss how Russmedia affects your compliance posture, book a consultation.

Frequently asked questions

What did the CJEU decide in the Russmedia case (C-492/23)?

The Grand Chamber ruled that an online marketplace was a GDPR controller for personal data in a user-posted advertisement, even though the platform did not create the content. The judgment establishes that platforms which structure how content is published, categorised, and disseminated exercise sufficient control over processing to qualify as controllers under Article 4(7) GDPR.

Does hosting immunity protect platforms from GDPR obligations?

No. The Russmedia ruling makes clear that intermediary liability exemptions under eCommerce or hosting frameworks do not limit GDPR obligations or remedies. A platform that benefits from hosting immunity can still be a GDPR controller and face full data protection duties, including responding to data subject requests and paying damages.

How does the Russmedia case affect platforms that use AI or algorithmic ranking?

The ruling’s logic extends naturally to any platform that uses algorithmic ranking, recommender systems, or AI features to structure content visibility. If a platform’s technical design determines how personal data is disseminated, that platform may be a controller for the resulting processing, regardless of whether the content was user-generated or AI-generated.

What is the relationship between the DSA and GDPR after Russmedia?

The Digital Services Act and GDPR operate in parallel, not as alternatives. The DSA’s prohibition on general monitoring does not prevent proportionate preventive measures. The EDPB’s draft Guidelines 3/2025 confirm that DSA compliance itself often involves personal data processing that must independently comply with GDPR. Platforms cannot use DSA compliance as a shield against GDPR accountability.
