Newsletter

Privacy Corner

by Eva Jarbekk, Trygve Karlstad and Luca Tosoni



New year, new opportunities? Absolutely, plenty of them! The EU is constantly introducing new regulations, but part of our daily routine also involves the interpretation of existing (GDPR) rules. Later this week, a new EU ruling on third-country transfers will be issued, potentially bringing this topic back into focus very soon. Parts of the holiday season have been relatively quiet on the privacy and tech fronts. Therefore, in this year's first newsletter, we will instead take a deep dive into two specific topics.

First is the EDPB's new Opinion on AI, released the week before Christmas. Below, you will find an overview of its most crucial parts. I am confident that this will be highly relevant for many. Second, we will explore what the new cookie regulations entail. A client recently asked if there is a need for separate consents for different marketing channels – and as far as I can see, the answer is actually no. It is possible to "bundle" marketing across several social media platforms in one consent, as long as the purpose of the activity is the same and the actors involved are specifically mentioned.

But what about trackers/cookies that serve multiple purposes at the same time? Read more about that below.

An overview of EDPB's Opinion on AI-models

by Eva Jarbekk and Trygve Karlstad 

Just before Christmas, the European Data Protection Board (EDPB) published its much-discussed and somewhat controversial "Opinion 28/2024 on certain data protection aspects related to the processing of personal data in the context of AI models." This Opinion is significant as it highlights the privacy considerations that limit both the use and development of AI models.

While an EDPB Opinion is not legally binding for GDPR compliance, practice from European regulatory authorities indicates that such Opinions are given considerable weight to ensure harmonized enforcement of the GDPR across Europe. Given the limited legal guidance to date on how privacy law limits the use of AI systems, the Opinion will likely carry substantial weight in interpreting the GDPR.

Whilst there are specific procedural opportunities for an Opinion to become binding, there is a discussion to be had on whether an Opinion is the most appropriate solution for how the EDPB should approach the issue of how data privacy should limit the development of AI. On one hand, the Opinion will have significant consequences for how Europe will keep up in the global competition for developing AI models, and overly strict regulation could hinder innovation. On the other, one might argue that the EDPB should have developed a "Guideline," which is typically developed in a more comprehensive process where more parties and stakeholders have the opportunity to be heard, unlike an Opinion. Enough about that for now. 

Lack of predictability?

It is important to make clear that the Opinion does not provide many definitive conclusions. The EDPB states that many questions, especially those related to whether legitimate interest can serve as a lawful basis for processing, must be determined based on a concrete overall assessment of the facts in each individual case. The Opinion therefore grants national data protection authorities discretion to decide on the appropriate solution in individual cases. It is therefore likely that we will see varying degrees of enforcement across Europe, which is broadly undesirable as it lessens legal predictability, but also leaves greater room for innovation where authorities permit it.

Do AI-models store personal data?

Another central and widely discussed topic prior to the Opinion was whether a Large Language Model stores personal data. This is significant for determining which parts of the GDPR are applicable to the processing activities. As we discussed in another newsletter this fall, there are many differing opinions on this matter.

The EDPB highlights that AI models trained with personal data cannot always be considered anonymous. Whether they are anonymous or not must be assessed in each individual case. This means a significant amount of work for many of us to document such assessments.

For an AI model to be considered anonymous, two conditions must be met. Firstly, there must be an insignificant likelihood of direct (including probabilistic) extraction of personal data regarding individuals whose personal data were used to develop the model.[1]  This refers to personal data that could be accessed through hacking or other cyberattacks against the AI model. Additionally, there must also be an insignificant risk of obtaining, intentionally or not, such personal data from queries. In making this assessment, one must consider all measures that can reasonably be expected to be taken by the data controller or another person.[2]

Legitimate Interest as a lawful basis for processing personal data in AI models

A central question is whether a data controller can use "legitimate interest" as the legal basis for processing personal data in AI. This requires an assessment as set out in the relatively recent Guideline on legitimate interest, which mandates a three-step test. Firstly, there must be a legitimate interest; secondly, the processing must be necessary to pursue that interest; and finally, a balancing test must be conducted between the data controller's legitimate interests and the fundamental rights and freedoms of the data subjects.

Nevertheless, the EDPB provides clear guidance on the specific considerations applicable to the use of legitimate interest as the legal basis for AI. Firstly, the EDPB sets a high threshold for what constitutes a "necessary" amount of personal data for use in an AI model. Therefore, it is essential to consider less intrusive alternatives that could achieve the same purpose.

Additionally, there are several measures that can minimize the privacy impacts of the processing on the data subjects, such as anonymization, pseudonymization, the option to opt out of data storage, or not training the model on users' personal data. Increased transparency about the model's structure can also weigh in favour of the data controller. Of course, there will also be strict requirements for technical measures to ensure the security of the model both during development and use, for example by avoiding sensitive personal data during web scraping.

What legal consequences may arise if a data controller develops an AI model in violation of the GDPR?

The EDPB highlights that it can be challenging to determine how regulatory authorities should sanction a company that develops an AI model unlawfully, and how this affects the model's legality when it is later used. Therefore, the EDPB outlines three different scenarios to guide the regulatory authorities.

The first scenario is based on a case where the data controller unlawfully processes personal data to develop the model, and this personal data is retained in the model by the same data controller. In such cases, the EDPB believes that the regulatory authorities can impose fines for the entire processing operation, but it must be specifically assessed whether development and implementation are two separate processing activities.

The second scenario builds on the first, except that a new data controller implements the model. One example of this may be a supplier relationship, where the developer has developed the model unlawfully and enters into a supplier agreement with another party that will use the model. In such a scenario, the EDPB keeps the conclusion more open, and it must be specifically assessed whether the user of the AI system, for example, has complied with their duty of due diligence in assessing their supplier. This highlights that the acquirer of an AI system should contractually stipulate a clause that ensures protection if a sanction against the provider affects subsequent use.

The last scenario is based on where there has been illegal processing in the development of the AI model, but where the model is anonymized, and the personal data are subsequently processed. Here, the EDPB concludes that the GDPR does not apply to the use of the model, and that the illegality in the training stage does not affect the subsequent processing of personal data. In such cases, it does not matter whether the subsequent processing is performed by the developer of the AI model or by a third party. However, we see again that it is very important to be able to make a sensible judgment about whether a model has anonymous information or if it contains personal data. The EDPB itself emphasizes that regulatory authorities must investigate thoroughly where a data controller claims that the personal data in the model is anonymous.

Does the EDPB show a positive attitude towards AI?

After reviewing the Opinion, it generally appears that the EDPB is not as critical of AI as many might have feared. Some were wondering if there would be an absolute requirement for consent as the basis for processing, but that did not happen.

Right from the outset, the Opinion contains positive comments about AI and the opportunities it offers across various sectors and social activities. It also suggests that anonymizing personal data could be "easier." Furthermore, the document states that the probability of inadvertently or deliberately extracting personal data from "prompts" should be "insignificant." This stance appears to be somewhat less stringent than the existing guideline on anonymization, which stipulates that re-identification should be virtually impossible.

However, it's important to note that the Opinion leaves several key and complex issues unaddressed and unresolved. For instance, it does not discuss the principle of purpose limitation or the extent to which personal data can be further processed for different purposes. Moreover, the Opinion provides only a cursory examination of critical issues, such as the use of sensitive personal data in training AI models. It's crucial to recognize that the Opinion is a response to a specific request from the Irish Data Protection Commission. The EDPB can only address the questions that are posed to it. There is a possibility that the questions were formulated in such a way that not all concerns were covered.

The most important changes that need to be made for cookies now

by Eva Jarbekk and Luca Tosoni 

As most were busy decorating the Christmas tree and wrapping gifts, the government was finalizing the last steps before implementing the new Electronic Communications Act, which was sanctioned in the Council of State on 13 December 2024 and came into effect on 1 January 2025. This left limited time over the holidays to digest the changes introduced in the new act. Some of these changes are particularly significant for Norwegian businesses and their online presence and must be carefully and swiftly evaluated in the new year.

The new act imposes stricter consent requirements for cookies, aligning Norwegian law with GDPR standards. This marks a significant shift from previous regulations, where consent through browser settings was sufficient. This practice must change following the enactment of the new Electronic Communications Act.

Businesses now need to reconsider their practices for obtaining consent for cookie usage. Setting up a new consent collection practice may be more challenging than many anticipate; however, it is crucial to adapt quickly to the new rules, as there will likely be significant focus on this from both users and authorities in the new year.

There are several aspects businesses should examine in how they use cookies, including the wording in the cookie banner and its setup. Most Norwegian websites will at least need to change the current practice where consent is embedded in browser settings and ensure that consent is actively given by users, for example, by clicking "accept cookies" in a banner they see when visiting a website. The banner must also allow users to easily reject the use of cookies.

The information provided in the cookie banner should be more detailed and nuanced than is common today. It is particularly important that this information makes it easy to understand the consequences of any consent given, and cookie banners must list all the purposes for which the cookies are used and enable users to give their consent to each of these purposes separately. The banner must also specify the suppliers with whom the collected data will be shared, for each specific purpose. However, it is not necessarily required to obtain consents for each supplier within the same purpose.
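As an illustration only, the per-purpose consent model described above can be sketched in code. Everything here is hypothetical: the purpose names, supplier names, and the `record_consent` helper are not prescribed by the act or by any guidance; they merely show consent being recorded separately for each purpose, with the suppliers listed per purpose.

```python
# Hypothetical sketch of per-purpose cookie consent. Purpose and supplier
# names are illustrative, not taken from the act or any regulator guidance.

PURPOSES = {
    "analytics": {
        "description": "Measure site traffic and usage",
        "suppliers": ["Example Analytics AS"],  # suppliers listed per purpose
    },
    "marketing": {
        "description": "Personalised advertising across social media",
        "suppliers": ["Example Ads Ltd", "Example Social Inc"],
    },
}

def record_consent(choices: dict) -> dict:
    """Record the user's yes/no choice for each purpose separately.

    Strictly necessary cookies are deliberately absent: they require
    no consent, so the user is never asked about them. Any purpose the
    user did not answer defaults to refusal."""
    unknown = set(choices) - set(PURPOSES)
    if unknown:
        raise ValueError(f"Unknown purposes: {unknown}")
    return {purpose: bool(choices.get(purpose, False)) for purpose in PURPOSES}

# A user who accepts analytics but rejects marketing:
consent = record_consent({"analytics": True, "marketing": False})
```

The key design point, as the rules require, is that one click cannot cover all purposes at once: each purpose carries its own yes/no answer, and silence counts as refusal.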

The term "necessary cookies" should be avoided in cookie banners; instead, they should be referred to as "strictly necessary cookies." Furthermore, users should not be asked to accept such strictly necessary cookies, as they do not require consent. The colours, font, and size of the text in the cookie banner on many websites must also be changed to ensure that it does not confuse users or manipulate them into accepting cookies.

If you have trackers/cookies that serve multiple purposes simultaneously, you must ensure that there is a proper legal basis for all purposes. Getting this right can be challenging, especially if some elements are considered "strictly necessary" and others are not.

Going forward, both the Norwegian Data Protection Authority and the Norwegian Communications Authority will have regulatory authority over the new cookie rules. Until now, the Data Protection Authority did not have regulatory authority over the placement of cookies. As the Data Protection Authority now handles the cookie regulatory framework, there will likely be more enforcement in this area, in line with trends in the rest of Europe. It will be interesting to see if the Norwegian Communications Authority provides clearer guidelines on what should be considered "strictly necessary" cookies – if they choose to follow the current EU guidance, the scope would be very limited. It remains to be seen how the Norwegian Communications Authority will handle this.

It will also be interesting to see what happens with current cases about cookies that started under the old rules but have not yet concluded. There are several such cases, and the changes in the law could impact the outcomes. This may not be the Christmas gift many businesses hoped for, but implementing the new rules should be one of their New Year's resolutions if they want to avoid unpleasant surprises in the future.


[1] Opinion 28/2024, p. 2.
[2] Opinion 28/2024, p. 2.

Do you have any questions?