and Jannigje Bezemer and Laura Grant.
I. Introduction: what is the Digital Services Act?
The Digital Services Act (‘DSA’) is a landmark EU law on Internet regulation, adopted on 5 July 2022 and centred on the principle that “what is illegal offline should also be illegal online”. As online platforms become an increasingly integral part of our daily lives across all aspects of society, updated rules concerning the transparency and accountability of the online space are required. Prior to the DSA, Member States were introducing national laws on the responsibilities of intermediary online services in tackling illegal content, online disinformation or other societal risks. As the European Parliament and Council of the European Union noted upon the Act’s adoption, these diverging national laws negatively affected the internal market. Therefore, the DSA aims to harmonise fragmented national laws on intermediary services and create Union-wide rules for a safe, predictable and trusted online environment.
The DSA aims to protect consumers’ fundamental rights and create a transparent accountability framework for online platforms, protecting users from intrusive data collection and advertisements based on profiling. This is unlike the Digital Markets Act (‘DMA’), which puts its primary focus on gatekeepers and their obligations. Commenting on the adoption of the Act, Executive Vice-President for a Europe Fit for the Digital Age Margrethe Vestager reinforced this key feature, noting that “[it] enables the protection of users’ rights online.”
This Act is the focus of the second instalment in our blog post series on the EU’s Digital Package, with our last blog post focusing on the Digital Markets Act. We will provide a detailed insight into the DSA’s provisions and how they will positively impact personal data protection and the fundamental rights of users, whilst considering how they interplay with the GDPR. This blog post will focus on the provisions concerning profiling, algorithmic recommendations, and targeted advertisements.
II. Key definitions: “profiling”, “recommender systems” and “targeted advertising”
As artificial intelligence becomes more widely used, concerns are growing about the use of personal data and behavioural tracking on online platforms. If you are an online user, you are probably familiar with content and adverts appearing on online platforms that seem “perfect” for you. How did Instagram know you would like to see images of tropical islands on your Discover page? How did Facebook know to show you a new watch to buy? This is the daily effect that profiling, recommender systems and targeted advertisements currently have on users, as they use personal data to the platform’s benefit. As we will focus on these three concepts throughout the blog, brief definitions will be provided.
What is “profiling”?
The DSA does not define profiling under Article 2 but takes its meaning from Article 4(4) GDPR. Profiling is defined under the GDPR as, “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.” In the context of the DSA and this blog post, profiling can be understood as the processing of personal data using algorithms to predict users’ online preferences and their browsing habits.
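To make the concept concrete, the mechanics of profiling can be sketched in a few lines of code. This is a purely illustrative toy under our own assumptions — the event data, field names and dwell-time weighting are invented for this sketch, not any platform’s actual method:

```python
from collections import Counter

def build_interest_profile(browsing_events):
    """Toy profiling step: infer predicted interests from browsing habits.

    `browsing_events` is a hypothetical list of (page_topic, seconds_spent)
    pairs; real platforms combine far richer signals than dwell time.
    """
    weights = Counter()
    for topic, seconds in browsing_events:
        weights[topic] += seconds  # weight interest by time spent on the topic
    total = sum(weights.values())
    # Normalise into predicted preference scores between 0 and 1
    return {topic: round(w / total, 2) for topic, w in weights.items()}

profile = build_interest_profile(
    [("travel", 120), ("watches", 60), ("travel", 60), ("news", 60)]
)
print(profile)  # travel dominates: the user is predicted to prefer travel content
```

The output score for “travel” would then drive what content and adverts this hypothetical user is shown — which is precisely the evaluation of “personal preferences, interests [and] behaviour” that the GDPR definition captures.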
What are “recommender systems”?
A recommender system can be understood as an algorithmic system that uses a user’s profile to shape what they see online. Such systems have a major impact insofar as they can promote radical or extremist content, giving online platforms the power to shape societal discourse and opinion. In terms of how they work, there are three types of recommendation engine: (1) collaborative filtering, (2) content-based filtering, and (3) a hybrid of both. Succinctly, these engines gather and analyse user data, creating recommendations based on the user’s history and the behaviour of similar users. They combine data taken from explicit interactions, such as past activity, reviews and profile information (your age, gender, etc.), with information taken from implicit interactions, such as your location and the devices/browser you use. As noted in Paragraph 52(c) DSA, recommender systems “algorithmically suggest, rank and prioritise information.” This indicates that a main aim of recommender systems is to tailor items to one’s personal taste on a given platform to enhance the user experience.
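A minimal engine of the first type, collaborative filtering, can be sketched as follows. The users and “liked items” below are invented for illustration; production systems work over vastly larger interaction matrices and fold in the implicit signals mentioned above:

```python
# Toy collaborative filtering: recommend items liked by users whose
# interaction history overlaps with yours (all data here is hypothetical).
likes = {
    "alice": {"island_photos", "watches", "recipes"},
    "bob": {"island_photos", "watches", "hiking"},
    "carol": {"recipes", "knitting"},
}

def recommend(target, likes):
    scores = {}
    for other, items in likes.items():
        if other == target:
            continue
        overlap = len(likes[target] & items)   # similarity = number of shared likes
        for item in items - likes[target]:     # candidates the target hasn't seen
            scores[item] = scores.get(item, 0) + overlap
    # Rank candidate items by how similar their fans are to the target user
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("alice", likes))  # ['hiking', 'knitting']
```

Because “alice” shares two likes with “bob” but only one with “carol”, bob’s unseen item ranks first — exactly the “suggest, rank and prioritise” behaviour the DSA describes, driven entirely by behavioural overlap rather than any editorial choice.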
What is “targeted advertising”?
Targeted advertising in this context can be understood as a subset of a recommender system. Data about users is gathered to recommend, in online advertisements, products and services the user is likely to purchase. There are three types of targeted advertisement: (1) contextual – targeting users based on the content of the website they are on; (2) segmented – targeting users based on characteristics known about or provided by the user, such as age, sex and location, and on what similar users usually prefer to purchase; (3) behavioural – targeting users after observing their online behaviour (websites visited, clicks, etc.) and creating a tailored profile.
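The three modes can be contrasted in a short sketch. The function and field names below are illustrative assumptions for this blog, not any real advertising API:

```python
def select_ad(page_content, user_segment=None, behaviour_profile=None):
    """Pick a targeting mode, mirroring the three types described above."""
    if behaviour_profile:
        # Behavioural: based on observed online behaviour (visits, clicks)
        return f"behavioural ad based on: {behaviour_profile[-1]}"
    if user_segment:
        # Segmented: based on known or declared characteristics (age, location)
        return f"segmented ad for: {user_segment}"
    # Contextual: based only on the content of the page currently viewed
    return f"contextual ad matching: {page_content}"

print(select_ad("tropical travel blog"))                        # contextual
print(select_ad("tropical travel blog", user_segment="25-34"))  # segmented
```

Note the gradient of intrusiveness the sketch makes visible: contextual targeting needs no personal data at all, whereas the behavioural branch presupposes a tracked history of the individual — the distinction that underpins the regulatory debate discussed in Section IV.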
III. The General Scope and Content of the DSA
This section will briefly explore the general scope and content of the DSA, especially in the light of the Digital Package and its sister Act, the DMA. The DSA concerns a wider scope of online intermediaries and platforms that connect consumers with goods, services, and content, covering everything from simple websites to large infrastructure services and online platforms. Rather than focusing on the significance of companies in the market, the DSA regulates companies based on their activities. It builds on the principles of the e-Commerce Directive (2000/31/EC), which will still be applicable: the goal is to complement and harmonise existing legislation. The DSA will provide ex-ante rules specifically for these online services.
The Act’s scope extends to all online intermediary services (such as Internet access providers and domain name registrars). Under this umbrella of “intermediary services” sit online hosting services (such as cloud providers); within those, online platforms (such as online marketplaces, app stores and social media platforms); and, within those, very large online platforms and very large online search engines (‘VLOPs’ and ‘VLOSEs’), defined in Article 25 as, “online platforms which reach a number of average monthly active recipients of the service in the Union equal to or higher than 45 million.” Likely VLOPs/VLOSEs include the likes of Meta’s Facebook and Instagram, Amazon, and Google Search, to name a few.
The most significant Chapter for users is Chapter III. It contains a range of transparency measures that, from the users’ perspective, will better protect them online through new safeguards. These measures will enable users to challenge platforms’ content moderation decisions and access dispute resolution mechanisms: when content is removed, online platforms must explain to the uploading user why it was removed. Moreover, the terms and conditions setting out restrictions on data usage must be publicly available in an easily accessible format.
The DSA will create an easy mechanism for users to report illegal online content, goods or services. Additionally, in line with the “Know Your Customer” procedures, online platforms must store information about traders to facilitate the monitoring of illegal content/services in addition to employing procedures for removal. Succinctly, online services must take a pro-active approach in acting against illegal content, providing information to the users, and documenting their activities. This will create a more trustworthy environment and prevent the abuse of online marketplaces by illegal traders.
The principle of greater transparency seen throughout the Act also extends to obligations concerning online advertisements. As noted in Section II, targeted advertising is commonplace in today’s online world. Article 24 DSA stipulates that online platforms must make clear, in easily understandable language:
- That what users are seeing is an advertisement,
- Who displays the advertisement, and
- Why the user is seeing the advertisement.
In addition to this, Article 24(1)(c) states that online platforms should clearly explain to users how it was determined that they would see that particular (targeted) advertisement. Furthermore, under Articles 24(b) and 24(3) respectively, targeted advertising to minors or targeted advertising using certain special categories of sensitive personal data (ethnicity, political views, sexual orientation, etc. as per Article 9(1) GDPR) will be banned outright. Further to this, online platforms using recommender systems for advertising must clearly indicate the main parameters of these systems and options (if any) for users to modify or influence them under Article 24(a).
VLOPs/VLOSEs will face more rules due to their influence and size. These large platforms using recommender systems will need to provide users with transparent information on how they use algorithms and offer alternative choices not based on profiling as per Articles 24(a) and 29. This means that Instagram, Facebook and Netflix feeds (amongst many other VLOPs/VLOSEs) must offer users the choice to view the platform not based on a recommender system. For example, with Instagram, this will mean viewing the feed chronologically and not in order of assumed preference. Furthermore, VLOPs/VLOSEs must analyse system risks, establish a public repository with detailed information on online advertisements, designate a dedicated compliance officer, and provide access to the data upon request of the Commission. They must also follow the same restrictions on targeted advertising under Article 24, which includes limiting or adjusting the presentation of advertisements in association with the service they provide under Article 27(1)(d). Users can also scrutinise the actions of VLOPs/VLOSEs through independent audit and research reports.
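The user choice that Articles 24(a) and 29 require can be pictured as a simple toggle: the same feed is either ranked by a profiling-based relevance model or sorted chronologically with no personal data involved. All names in this sketch are our own illustrative assumptions, not any platform’s implementation:

```python
def render_feed(posts, use_profiling=False, relevance_model=None):
    """Return the feed either personalised or via a non-profiling alternative.

    `posts` are dicts with a `posted_at` timestamp; `relevance_model` stands
    in for a hypothetical profiling-based ranking function.
    """
    if use_profiling and relevance_model is not None:
        # Personalised ordering: ranked by the platform's profiling model
        return sorted(posts, key=relevance_model, reverse=True)
    # Alternative required of VLOPs/VLOSEs: chronological, newest first,
    # needing no personal data about the viewing user
    return sorted(posts, key=lambda p: p["posted_at"], reverse=True)

posts = [
    {"id": "a", "posted_at": 1},
    {"id": "b", "posted_at": 3},
    {"id": "c", "posted_at": 2},
]
print([p["id"] for p in render_feed(posts)])  # ['b', 'c', 'a']
```

The point of the sketch is that the chronological branch consumes nothing about the viewer: offering it as a default-accessible option is exactly the “alternative not based on profiling” the Act demands of the largest platforms.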
IV. Critique surrounding the Act and Significance to the GDPR: how will the respective instruments interplay?
This section will provide an overview of the critique presented by relevant bodies concerning the Act’s adherence to ensuring data protection when it was at the Proposal stage, particularly regarding profiling and targeted advertising. Simultaneously, it will explore how these provisions of the DSA interplay with the GDPR. As noted, many provisions throughout the DSA reflect one of the core data protection principles set down in Article 5(1)(a) GDPR, namely the transparency principle. The reliance on this principle is indicated in Articles 24 and 30 DSA, which require the transparency of platforms regarding online advertisement activities.
The European Data Protection Supervisor (‘EDPS’) published an Opinion on the DSA in February 2021. Paragraph 67 of the Opinion noted that to ensure meaningful transparency, additional clarification on parameters must be disclosed, going as far as requiring transparency on each criterion used when targeting advertisements. The EDPS urged the co-legislator to consider additional rules going beyond transparency to deal with the risks associated with online targeted advertising. Paragraph 69 indicated a preference for stricter regulation favouring less intrusive forms of advertising, recommending a prohibition on targeted advertisement based on pervasive tracking. Pervasive tracking can range from tracking users via cookies to more extreme cases which use persistent online tracking techniques.
Subsequently, the European Data Protection Board (‘EDPB’) released a Statement in November 2021 raising the same concerns about the lack of fundamental rights protection and the risks of inconsistencies within the Digital Package. Notably, it also highlighted that the DSA should regulate online targeted advertising more strictly, re-iterating the EDPS’s call for a ban on pervasive tracking. The Statement also urged the co-legislator to consider prohibitions on profiling children. The recommendations of the EDPS and EDPB were mostly met in the final text of the Act and complement the GDPR.
As mentioned in Section III of this blog, the main provisions on targeted advertising and recommender systems are found under Article 24, which at first sight may seem to go further than the GDPR. In terms of profiling, Article 22 GDPR prohibits decision-making based solely on automated processing, including profiling, where it produces legal effects concerning the individual or similarly significantly affects them. Exceptions apply where the explicit consent of the individual is obtained, where Union/Member State law allows it, or where it is necessary for the performance of a contract. In other words, absent meaningful human involvement in the decision, necessity under one of the Article 22(2) exceptions, or the explicit consent of the user, decision-making based solely on profiling is outlawed. To clarify, profiling per se is not prohibited under the GDPR. As profiling is defined in Article 4(4) GDPR as a personal data processing activity, the data controller may carry out profiling provided it complies with each requirement for such an activity. Whilst academic debate has noted the ambiguity of Article 22 GDPR’s wording, which appears to blur the line between profiling and decision-making, the general consensus is that they should be interpreted as two separate processes. The Article 29 Data Protection Working Party in its 2017 Guidelines notes that “automated decision-making has a different scope and may partially overlap with or result from profiling”.
However, the DSA does not maintain the same “solely based on automated processing, including profiling” threshold for decision-making found in Article 22(1) GDPR: under Articles 24(3) and 24(b), profiling is prohibited even with meaningful human involvement where it concerns minors or special categories of personal data. Arguably, the DSA thus goes a step further than the GDPR with an outright prohibition on profiling in these instances. This means that explicit consent, which is anyhow problematic for the reasons we explained here, will no longer be the green light that online platforms previously relied on to process special categories of data or the data of minors. There are seemingly no exceptions to these provisions: they will arguably create a watertight framework that further protects users and their fundamental rights to privacy and data protection.
It must be noted that the EDPB and EDPS recommendations to ban the profiling of children are not new. The recommendation was already included in the aforementioned Guidelines issued by the Article 29 Working Party, as the GDPR does not incorporate an absolute prohibition on profiling children. Recital 71 states that, “such measures should not concern a child.” The special protection given to children under the GDPR is also highlighted in Recital 38, which states that such protection should apply, “to the use of personal data of children for the purpose of marketing or creating personality or user profiles.” However, because this appears only in the Recitals and is not clearly reflected in the operative text of the GDPR, it is not considered an absolute prohibition. As a result, profiling children and showing them personalised advertising can and does form part of many platforms’ business models. Once the DSA enters into force, the provisions in Article 24 will rectify what the legislator failed to do in the GDPR: incorporate in the operative text an absolute prohibition on profiling children and on using sensitive data.
V. Conclusion: a package to collectively improve the online space?
Unlike the DMA, the DSA was agreed after only 16 hours of negotiations. The agreed text will introduce stricter rules for VLOPs, establish strict requirements for the removal of illegal content via transparency obligations, and push for elevated levels of privacy and safety for minors without requiring additional personal data processing to prove minor status. Algorithmic recommendations will no longer be forced upon the user, with alternative recommendations not based on profiling available within VLOPs/VLOSEs. When it comes to targeted advertising, the use of the personal data of minors and the profiling of individuals based on special categories of personal data will, as aforementioned, be forbidden. The significance of this Act for users’ fundamental rights cannot be overstated, with Amnesty International stating that “[it] moves us towards an online world that better respects our human rights by effectively putting the brakes on Big Tech’s unchecked power.”
As aforementioned, as of 5 July 2022, the European Parliament and Council of the European Union have voted to adopt the Digital Services Act: it will come into force 20 days after publication, which should be in autumn of this year. To conclude this two-part blog series on the EU’s digital package, it is hoped that the introduction of the DMA and DSA will collectively further protect users’ fundamental rights and create a safer online space, enabling users to choose how much personal data they want to share, holding online platforms to account, and creating more options for online services whilst encouraging an innovative and fairer marketplace. Despite critiques from independent authorities on some aspects of both Acts’ substance, particularly with respect to the lack of clarity surrounding certain provisions, we will have to wait and see the practical implications of both Acts when they become directly applicable across the EU.
The views expressed in this blog are those of the authors and not necessarily those of EIPA.