The Right to Information Evolved: GDPR and the DSA’s Transparency Landscape

Blog

Written by Florina Pop and Leila Debiasi

In the European regulatory landscape, the Digital Services Act (DSA) and the General Data Protection Regulation (GDPR) stand as two pillars of digital governance, each with a strong emphasis on transparency and the protection of fundamental rights. While the GDPR focuses on safeguarding personal data and ensuring that individuals have clear and accessible information about how their data is processed, the DSA extends transparency obligations to the wider digital ecosystem. At their core, both regulations aim to empower individuals by providing insight into how digital systems function. This alignment underscores their shared objective of fostering trust and accountability in the digital environment. As discussed in our previous blogs, concepts such as profiling, recommender systems, and transparency play a crucial role in shaping online experiences.

Many of the DSA’s provisions, particularly those related to transparency and accountability, closely mirror GDPR principles but apply them more broadly to digital platforms and online intermediaries. Consider that the very definition of profiling in Article 26(3) of the DSA refers directly to the text of the GDPR, as does the definition of special categories of personal data in the same article, symbolising the synergy between the two regulations. In this blog, we tackle the transparency requirements for online platforms and how the provisions on transparency in the GDPR and the DSA relate to each other.

A Shared Commitment to Transparency 

Transparency is a foundational principle in both regulations: while the GDPR enshrines it as one of the core principles of data processing under Article 5(1)(a), the DSA extends the obligation beyond personal data, requiring online platforms – and especially Very Large Online Platforms (VLOPs) – to disclose how they operate, moderate content, and craft the user experience.

In more detail, the GDPR establishes a clear right to transparent information, for example through:

  • Article 12, mandating that information provided to individuals be ‘concise, transparent, intelligible, and easily accessible’;
  • Articles 13 and 14, outlining obligations to inform individuals about data collection and processing, including information about the controller’s reliance on automated decision-making;
  • Articles 15–22, granting rights derived from transparency, including access to data, rectification, and objection.

The DSA, on the other hand, introduces new transparency obligations, some of which are:

  • Article 26 on advertising transparency, including disclosure of why a user is being presented with a particular advertisement;
  • Article 27, compelling online platforms to disclose how their recommender systems influence content ranking and visibility;
  • Article 42, imposing transparency reporting obligations on VLOPs regarding their content moderation practices and the risks related to such moderation.

The transparency principle under the GDPR nevertheless remains a significant challenge. Research highlights how barriers to transparency can essentially be reduced to three areas: intentional concealment by data controllers, gaps in technical literacy on the data subjects’ side, and the challenge of translating complex algorithmic processes, often expressed in mathematical models or technical jargon, into concepts that can be understood at a human level. The third issue arises because many automated decision-making systems rely on intricate statistical models, such as machine learning algorithms, that are not easily interpretable by the average user. As a result, individuals may struggle to comprehend how their data is being used to make decisions affecting them.

In a way, Article 13 GDPR counters the first obstacle by obliging data controllers to inform data subjects about when, why, and by whom their data is being processed, while Article 12 GDPR tackles gaps in digital literacy by requiring information to be laid out in clear and plain language. Additionally, Article 13(2)(f) seeks to address the challenge of opacity by requiring controllers to provide meaningful information about the logic involved in automated decision-making. However, this provision lacks specificity, as it does not clearly define what constitutes a meaningful explanation or how much detail must be provided. It has been argued that the GDPR does not fully resolve this third barrier – but should we throw in the towel?

The DSA might come to the rescue in this regard. Take, for example, the rules on the content of platforms’ terms and conditions: the legislator undoubtedly aimed to clarify what meaningful information means, compelling platforms to equip users with information about the specific tools used for content moderation, the procedures surrounding algorithmic decision-making, and complaint-handling systems. Most importantly, Article 14 DSA mandates that this be done ‘in clear, plain, intelligible, user-friendly and unambiguous language, and shall be publicly available in an easily accessible and machine-readable format’. The wording of the provision itself shows a newly acquired awareness of the complexity of the online environment and, by adding the user-friendliness and unambiguity requirements, aims to reduce the mismatch between that complexity and the knowledge level of users.
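To make the ‘machine-readable format’ requirement concrete, below is a minimal sketch of what a structured disclosure published alongside a platform’s terms and conditions might look like. All field names, values, and the URL are hypothetical: the DSA does not prescribe any particular schema.

```python
import json

# A hypothetical, minimal Article 14 DSA-style disclosure: the tools used
# for content moderation, the role of algorithmic decision-making, and the
# complaint-handling procedure, expressed as structured data.
terms_disclosure = {
    "content_moderation": {
        "tools": ["automated classifiers", "human review"],
        "algorithmic_decision_making": {
            "used": True,
            "plain_language_summary": (
                "Posts flagged by automated classifiers are reviewed "
                "by human moderators before any removal decision."
            ),
        },
    },
    "complaint_handling": {
        "internal_system": True,
        "appeals_info": "https://example.com/appeals",  # placeholder URL
    },
}

# Serialising to JSON yields an easily accessible, machine-readable format;
# the plain-language summaries speak to the 'clear, plain, intelligible' test.
print(json.dumps(terms_disclosure, indent=2))
```

Publishing such a file alongside the human-readable terms would let researchers and regulators parse the information programmatically, while the embedded summaries remain intelligible to the average user.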

In light of the above, we can see how transparency, in both regulations, is not only about disclosure but about effective communication with the users of digital platforms. The aim is to help individuals understand how their data is processed, how information is presented to them, and how this ultimately affects their choices. For this to happen, information needs not only to be disclosed, but to be made meaningful and actionable. This evolving notion of transparency is captured by the growing emphasis on the right to explanation, explored in the next section.

The Evolution of Information Rights

When the GDPR was adopted, a surge of academic debate arose over whether these new provisions represented the emergence of a ‘right to explanation’, as opposed to a more passive right to information. While not unanimously accepted, this right is widely considered evident from an analysis of the plain text of the GDPR. The data controller’s obligation to provide meaningful information in Articles 13 to 15 GDPR has to be interpreted in relation to the data subject: information should be meaningful to them, a human with reasonable levels of technical expertise (so, realistically, pretty low). How do we determine whether information is meaningful for the data subject? One way to answer this question is by assessing whether such information is functional, in the sense that it enables the data subject to take a certain action. A minimum threshold of functionality is whether the information is enough to facilitate the exercise of the data subject’s rights guaranteed by the GDPR.

In practical terms, Articles 13 and 14 GDPR impose notification duties on data controllers covering how, why, by whom, and for how long an individual’s personal data is being processed. At the same time, the data controller has to signal the existence of automated decision-making in the data processing operation and provide at least meaningful information about the logic involved (Article 13(2)(f) and Article 14(2)(g)). This serves as a basis for the exercise of rights, such as the right of access in Article 15 GDPR and the right not to be subject to automated decision-making in Article 22 GDPR. Article 15 GDPR enables the data subject to request access to their data, which would not be possible should the controller fail to disclose that processing is being carried out. In the same way, if the subject were not aware of the extent of processing, they would not be able to exercise their right to contest a decision based solely on automated means under Article 22 GDPR.

How do we then relate this back to the DSA? As discussed above, the DSA draws heavily on the GDPR, especially concerning transparency concepts, and analysing its text leads to similar conclusions. Recital 68 DSA, which tackles online advertisements, states that platforms must provide a ‘meaningful explanation of the logic used’, precisely reflecting Article 13(2)(f) and Article 14(2)(g) GDPR. When drafting the DSA, the legislator was very much aware of the need to provide clear guidance on what a ‘meaningful explanation of the logic used’ implies. Over the years since the GDPR entered into force, some Data Protection Authorities have come up with different solutions. One such recommendation, very much along the lines of Recitals 68 and 45 of the DSA, was to add ‘standardised visual or audio marks, clearly identifiable and unambiguous for the average recipient of the service’ or ‘graphical elements in their terms of service, such as icons or images, to illustrate the main elements of the information requirements’, all to ensure that the information is salient. Similarly, under Recital 70 DSA, information on recommender systems must be ‘easily comprehensible’ and explain how content is ranked and displayed. Moreover, platforms are expected to ‘ensure that […] the recipients of the service are able to identify’ how and why a certain advertisement is being presented to them, as well as to set out ‘in plain and intelligible language the main parameters of their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters’. This, de facto, clarifies the concept, derived from the GDPR, that information serves as a tool for users to exercise their agency.

It is clear that the legislator’s intention was to refine and strengthen the transparency obligations imposed on platforms, ensuring that information provided to users is not only available but also accessible and actionable. This approach builds on findings from case law, as well as the guidelines and recommendations of DPAs, the EDPS, and the EDPB. While one could argue that the right to information under Articles 12 to 14 GDPR was already designed as a right to explanation, the DSA specifically targets algorithms that may not meet the threshold of automated decision-making under Article 22 GDPR. Under the GDPR, data controllers must inform individuals about the existence of automated decision-making and provide ‘meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing’ (Article 13(2)(f) and Article 14(2)(g) GDPR). However, this requirement applies only when such automated decision-making produces legal effects or similarly significantly affects the individual concerned. Recommender systems and other platform-driven algorithms, such as those determining content ranking, advertising targeting, and content moderation, do not necessarily qualify under Article 22: under the interpretation given by the Article 29 Working Party, they may not produce legal or similarly significant effects, although a case-by-case assessment must be carried out by the controller. Yet, given their growing impact on users’ autonomy and personal data processing, the DSA expands on these transparency requirements. In practice, the DSA complements Articles 13 and 14 GDPR by ensuring that platforms disclose key details about how their algorithms function, regardless of whether they meet the GDPR’s threshold. Under the DSA, platforms are now required to provide users with comprehensive explanations of their recommender systems and their main parameters, ensuring that transparency obligations remain effective even in cases where the GDPR’s provisions would not otherwise apply.
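As an illustration of what such a disclosure could contain, here is a minimal sketch of a recommender-system ‘main parameters’ notice in the spirit of Article 27 DSA. All names, example parameters, and user options are hypothetical, not taken from any actual platform.

```python
from dataclasses import dataclass, field

# Hypothetical structures for an Article 27 DSA-style disclosure: the main
# parameters of a recommender system, why each matters most for ranking,
# and the options users have to modify or influence them.
@dataclass
class MainParameter:
    name: str                   # e.g. "recency"
    why_most_significant: str   # plain-language reason it drives ranking

@dataclass
class RecommenderDisclosure:
    main_parameters: list = field(default_factory=list)
    user_options: list = field(default_factory=list)  # ways to adjust ranking

disclosure = RecommenderDisclosure(
    main_parameters=[
        MainParameter("recency", "Newer posts are generally ranked higher."),
        MainParameter(
            "past interactions",
            "Content similar to what you previously engaged with is prioritised.",
        ),
    ],
    user_options=[
        "Switch to a chronological feed not based on profiling",
        "Reset the interaction history used for ranking",
    ],
)

# Render the disclosure in plain and intelligible language.
for p in disclosure.main_parameters:
    print(f"- {p.name}: {p.why_most_significant}")
print("You can:", "; ".join(disclosure.user_options))
```

The point is not the data structure itself but the pairing of each parameter with a plain-language reason and a user-facing control, which is what turns mere disclosure into actionable information.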

Overlapping Obligations – Coordinated Enforcement

Given the strong transparency obligations embedded in both the DSA and the GDPR, questions arise as to who is responsible for enforcing them, and to what extent enforcement mechanisms may overlap. While Chapter IV, Section 1 of the DSA designates Digital Services Coordinators (DSCs) as enforcement bodies at national level, could Data Protection Authorities (DPAs) also play a role when it comes to violations of transparency obligations?

We have seen how a lack of transparency in online platforms’ recommender systems or advertising practices may also result in violations of the GDPR. Consider a platform that fails to provide clear and understandable information on how personal data is used in targeted advertising: it breaches Articles 12–14 GDPR, which require transparent communication regarding data processing, and at the same time it breaches Article 26 DSA, which requires online platforms to disclose detailed information about the actual logic involved in their online advertising systems.

This intersection suggests that national DPAs could, in some cases, handle complaints related to transparency obligations when they involve the processing of personal data by platforms. This could make for an efficient enforcement mechanism, given that DPAs are already well equipped to tackle transparency violations under the GDPR, while DSCs are still in their early stages. Still, this approach has its limitations. First, not all DSA transparency obligations involve personal data; many focus on systemic risk, content governance, and other issues. In such cases, enforcement will certainly remain in the hands of DSCs, and of the European Commission where VLOPs are concerned.

Another counterargument to the idea of DPAs tackling DSA violations is one of legitimacy. If a certain authority has been mandated by the legislator to deal with specific matters, would it not be questionable for enforcement authorities to diverge from this mandate? In principle, yes, but things are not always clear-cut. The recent case against Meta (C-252/21 Meta Platforms Inc. and Others v Bundeskartellamt) is a prime example of how regulatory issues are inherently cross-sectoral, in that case involving both competition law and data protection law. The Court of Justice ruled that a competition authority could assess GDPR breaches as part of its enforcement of competition law, even though data protection enforcement falls under the jurisdiction of national DPAs. This logic could be extended to the DSA–GDPR overlap: when a transparency breach can be assessed under both the DSA and the GDPR, i.e. when it involves personal data, both DPAs and DSCs could have jurisdiction. Ultimately, this reinforces the idea that, in order to hold digital platforms accountable, there is a pressing need for coordinated, cross-regulatory enforcement.

Moving Forward

We have seen how the intersection between the DSA and the GDPR highlights a broader evolution in the EU digital regulatory framework: transparency obligations are no longer just about disclosure but, most crucially, about empowering users with meaningful and actionable information.

The growing emphasis on transparency as a functional right to exercise control over one’s personal data has led to the emergence of a ‘right to explanation’. The regulatory landscape is thus shifting towards a model that prioritises user agency, requiring platforms to provide explanations that facilitate meaningful understanding.

However, this complementarity in transparency obligations also raises questions of enforcement. The DSA establishes a dedicated enforcement mechanism through national DSCs, alongside direct oversight by the Commission of VLOPs and VLOSEs (Very Large Online Search Engines). Yet the intertwining of the obligations suggests that DPAs also play a role, particularly where transparency breaches relate to the processing of personal data. Recent case law further highlights how regulatory oversight must adapt to ensure effective enforcement across different legal frameworks.

Ultimately, the interaction between the two regulations underscores the increasing complexity of digital regulation, calling for a coordinated approach to enforcement. The DSA and GDPR, rather than being seen as separate regimes, should be understood as complementary tools in a broader effort to foster transparency, accountability, and user rights in the digital age.

More on This Topic

If you’re interested in learning more about data protection, digital policy and cyber security, please have a look at our upcoming courses on these topics by clicking the button below:

 

To the course page

Tags: Digital policy, cyber security and data protection